Oct 09 13:50:51 crc systemd[1]: Starting Kubernetes Kubelet... Oct 09 13:50:51 crc restorecon[4662]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963 Oct 09 13:50:51 
crc restorecon[4662]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726 Oct 09 13:50:51 crc restorecon[4662]: 
/var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c574,c582 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Oct 09 
13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c440,c975 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 09 13:50:51 crc restorecon[4662]: 
/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c22 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 09 13:50:51 crc 
restorecon[4662]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Oct 09 13:50:51 crc restorecon[4662]: 
/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 
Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c968,c969 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Oct 09 13:50:51 crc restorecon[4662]: 
/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 
09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 09 13:50:51 
crc restorecon[4662]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 09 13:50:51 crc restorecon[4662]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 09 13:50:51 crc restorecon[4662]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 09 13:50:51 
crc restorecon[4662]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 09 13:50:51 crc restorecon[4662]: 
/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 09 13:50:51 crc restorecon[4662]: 
/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 09 13:50:51 crc restorecon[4662]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 09 13:50:51 crc restorecon[4662]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 09 13:50:51 crc restorecon[4662]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 09 13:50:52 crc restorecon[4662]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c97,c980 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c377,c642 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 09 13:50:52 crc restorecon[4662]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 09 13:50:52 crc restorecon[4662]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c0,c25 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 09 13:50:52 crc restorecon[4662]: 
/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 09 13:50:52 crc restorecon[4662]: 
/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 09 13:50:52 crc restorecon[4662]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 09 13:50:52 crc restorecon[4662]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 09 13:50:52 crc restorecon[4662]: 
/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 
13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 13:50:52 crc 
restorecon[4662]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 09 13:50:52 crc restorecon[4662]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Oct 09 13:50:52 crc restorecon[4662]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 09 13:50:52 crc restorecon[4662]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c37,c572 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 09 13:50:52 crc restorecon[4662]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 
13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 
13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc 
restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c133,c223 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 09 13:50:52 crc restorecon[4662]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c682,c947 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 09 13:50:52 crc restorecon[4662]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 09 13:50:52 crc restorecon[4662]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Oct 09 13:50:53 crc kubenswrapper[4902]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 09 13:50:53 crc kubenswrapper[4902]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Oct 09 13:50:53 crc kubenswrapper[4902]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 09 13:50:53 crc kubenswrapper[4902]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
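The restorecon pass above is skipping, by design, every path whose SELinux context is treated as an admin customization: container_file_t (with per-pod MCS pairs such as s0:c7,c13) is a customizable type, so without a force option restorecon logs "not reset as customized by admin" and leaves the label alone, and only genuinely mislabeled paths such as /var/usrlocal/bin/kubenswrapper (bin_t -> kubelet_exec_t) get relabeled. The kubenswrapper lines around this point flag command-line options that are deprecated in favour of the configuration file passed via --config. A minimal sketch of how an excerpt like this could be summarized, assuming one journal entry per line and the exact message wording shown here (the script and its regexes are illustrative, not part of any tool appearing in this log):

#!/usr/bin/env python3
"""Summarize a kubelet-startup journal excerpt like the one above.

Illustrative sketch only. Assumes journalctl's usual one-entry-per-line
output and the exact message wording seen in this log:
  "... not reset as customized by admin to <context>"
  "Relabeled <path> from <old context> to <new context>"
  "Flag --<name> has been deprecated, ..."
"""
import re
import sys
from collections import Counter

NOT_RESET = re.compile(r"restorecon\[\d+\]: (?P<path>\S+) not reset as customized by admin to (?P<ctx>\S+)")
RELABELED = re.compile(r"restorecon\[\d+\]: Relabeled (?P<path>\S+) from (?P<old>\S+) to (?P<new>\S+)")
DEPRECATED = re.compile(r"kubenswrapper\[\d+\]: Flag (?P<flag>--\S+) has been deprecated")

def summarize(lines):
    skipped = Counter()   # SELinux context -> number of paths restorecon left alone
    relabeled = []        # (path, old context, new context) actually changed
    deprecated = set()    # kubelet flags that should move into the --config file
    for line in lines:
        if (m := NOT_RESET.search(line)):
            skipped[m["ctx"]] += 1
        elif (m := RELABELED.search(line)):
            relabeled.append((m["path"], m["old"], m["new"]))
        elif (m := DEPRECATED.search(line)):
            deprecated.add(m["flag"])
    return skipped, relabeled, deprecated

if __name__ == "__main__":
    skipped, relabeled, deprecated = summarize(sys.stdin)
    for ctx, n in skipped.most_common():
        print(f"{n:6d} paths kept with customized context {ctx}")
    for path, old, new in relabeled:
        print(f"relabeled {path}: {old} -> {new}")
    if deprecated:
        print("deprecated kubelet flags to migrate into the --config file:",
              ", ".join(sorted(deprecated)))

Fed with something like journalctl -b -u kubelet, a script along these lines would report how many paths were kept per customized context, which files were actually relabeled, and which deprecated flags still need to be moved into the kubelet configuration file.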
Oct 09 13:50:53 crc kubenswrapper[4902]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Oct 09 13:50:53 crc kubenswrapper[4902]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.243105 4902 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.246385 4902 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.246401 4902 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.246406 4902 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.246423 4902 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.246428 4902 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.246432 4902 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.246436 4902 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.246440 4902 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.246444 4902 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.246448 4902 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.246452 4902 feature_gate.go:330] unrecognized feature gate: InsightsConfig Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.246456 4902 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.246460 4902 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.246464 4902 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.246468 4902 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.246472 4902 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.246476 4902 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.246480 4902 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.246485 4902 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
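The run of feature_gate.go warnings here is emitted once per gate at startup: names such as GatewayAPI, AdminNetworkPolicy, or MultiArchInstallAWS are cluster-level OpenShift feature gates that the kubelet itself does not implement, so it records "unrecognized feature gate" and ignores them, while recognized gates that have already gone GA (ValidatingAdmissionPolicy, DisableKubeletCloudCredentialProviders, CloudDualStackNodeIPs) only draw a note that the explicit setting will be dropped in a future release. A simplified, self-contained sketch of that merge behaviour, with a made-up KNOWN_GATES table standing in for the component's real gate registry:

#!/usr/bin/env python3
"""Simplified stand-in for the gate handling hinted at by feature_gate.go above.

KNOWN_GATES is a tiny hypothetical subset; the real kubelet keeps its own
table, and this sketch only mirrors the two warning patterns in the log.
"""
from enum import Enum

class Stage(Enum):
    ALPHA = "ALPHA"
    BETA = "BETA"
    GA = "GA"

# Hypothetical subset of gates the component itself implements.
KNOWN_GATES = {
    "CloudDualStackNodeIPs": Stage.GA,
    "DisableKubeletCloudCredentialProviders": Stage.GA,
    "ValidatingAdmissionPolicy": Stage.GA,
}

def apply_gates(requested: dict[str, bool]) -> dict[str, bool]:
    """Apply requested gate overrides, warning in the style of the log above."""
    enabled = {}
    for name, value in requested.items():
        stage = KNOWN_GATES.get(name)
        if stage is None:
            print(f"W unrecognized feature gate: {name}")
            continue
        if stage is Stage.GA:
            print(f"W Setting GA feature gate {name}={str(value).lower()}. "
                  "It will be removed in a future release.")
        enabled[name] = value
    return enabled

if __name__ == "__main__":
    apply_gates({
        "CloudDualStackNodeIPs": True,
        "GatewayAPI": True,          # cluster-level gate, unknown to this component
        "AdminNetworkPolicy": True,  # likewise
    })

Running the sketch prints one "unrecognized feature gate" warning per unknown name and a GA notice for the known gate, mirroring the pattern seen in these entries without reproducing the actual kubelet implementation.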
Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.246490 4902 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.246495 4902 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.246499 4902 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.246503 4902 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.246507 4902 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.246511 4902 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.246515 4902 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.246520 4902 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.246525 4902 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.246530 4902 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.246534 4902 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.246539 4902 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.246543 4902 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.246547 4902 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.246551 4902 feature_gate.go:330] unrecognized feature gate: Example Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.246555 4902 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.246559 4902 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.246563 4902 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.246567 4902 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.246573 4902 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.246577 4902 feature_gate.go:330] unrecognized feature gate: OVNObservability Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.246582 4902 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.246586 4902 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.246591 4902 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.246595 4902 feature_gate.go:330] unrecognized feature gate: GatewayAPI Oct 09 13:50:53 
crc kubenswrapper[4902]: W1009 13:50:53.246599 4902 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.246603 4902 feature_gate.go:330] unrecognized feature gate: SignatureStores Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.246607 4902 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.246611 4902 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.246615 4902 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.246619 4902 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.246623 4902 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.246627 4902 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.246631 4902 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.246635 4902 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.246638 4902 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.246642 4902 feature_gate.go:330] unrecognized feature gate: PlatformOperators Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.246646 4902 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.246650 4902 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.246654 4902 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.246658 4902 feature_gate.go:330] unrecognized feature gate: PinnedImages Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.246663 4902 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.246667 4902 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.246672 4902 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.246676 4902 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.246679 4902 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.246683 4902 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.246687 4902 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.246691 4902 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.246696 4902 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.246700 4902 feature_gate.go:330] unrecognized feature gate: NewOLM Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.246706 4902 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.249253 4902 flags.go:64] FLAG: --address="0.0.0.0" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.249299 4902 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.249315 4902 flags.go:64] FLAG: --anonymous-auth="true" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.249327 4902 flags.go:64] FLAG: --application-metrics-count-limit="100" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.249339 4902 flags.go:64] FLAG: --authentication-token-webhook="false" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.249348 4902 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.249360 4902 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.249439 4902 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.249463 4902 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.249476 4902 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.249489 4902 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.249501 4902 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.249514 4902 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.249524 4902 flags.go:64] FLAG: --cgroup-root="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.249532 4902 flags.go:64] FLAG: --cgroups-per-qos="true" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.249541 4902 flags.go:64] FLAG: --client-ca-file="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.249550 4902 flags.go:64] FLAG: --cloud-config="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.249559 4902 flags.go:64] FLAG: 
--cloud-provider="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.249567 4902 flags.go:64] FLAG: --cluster-dns="[]" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.249582 4902 flags.go:64] FLAG: --cluster-domain="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.249591 4902 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.249600 4902 flags.go:64] FLAG: --config-dir="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.249609 4902 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.249618 4902 flags.go:64] FLAG: --container-log-max-files="5" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.249631 4902 flags.go:64] FLAG: --container-log-max-size="10Mi" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.249640 4902 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.249649 4902 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.249662 4902 flags.go:64] FLAG: --containerd-namespace="k8s.io" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.249686 4902 flags.go:64] FLAG: --contention-profiling="false" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.249702 4902 flags.go:64] FLAG: --cpu-cfs-quota="true" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.249713 4902 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.249726 4902 flags.go:64] FLAG: --cpu-manager-policy="none" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.249737 4902 flags.go:64] FLAG: --cpu-manager-policy-options="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.249754 4902 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.249766 4902 flags.go:64] FLAG: --enable-controller-attach-detach="true" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.249778 4902 flags.go:64] FLAG: --enable-debugging-handlers="true" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.249789 4902 flags.go:64] FLAG: --enable-load-reader="false" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.249804 4902 flags.go:64] FLAG: --enable-server="true" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.249816 4902 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.249830 4902 flags.go:64] FLAG: --event-burst="100" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.249842 4902 flags.go:64] FLAG: --event-qps="50" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.249853 4902 flags.go:64] FLAG: --event-storage-age-limit="default=0" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.249865 4902 flags.go:64] FLAG: --event-storage-event-limit="default=0" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.249877 4902 flags.go:64] FLAG: --eviction-hard="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.249893 4902 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.249904 4902 flags.go:64] FLAG: --eviction-minimum-reclaim="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.249917 4902 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.249928 4902 
flags.go:64] FLAG: --eviction-soft="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.249941 4902 flags.go:64] FLAG: --eviction-soft-grace-period="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.249952 4902 flags.go:64] FLAG: --exit-on-lock-contention="false" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.249963 4902 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.249975 4902 flags.go:64] FLAG: --experimental-mounter-path="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.249986 4902 flags.go:64] FLAG: --fail-cgroupv1="false" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.249998 4902 flags.go:64] FLAG: --fail-swap-on="true" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.250009 4902 flags.go:64] FLAG: --feature-gates="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.250022 4902 flags.go:64] FLAG: --file-check-frequency="20s" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.250034 4902 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.250046 4902 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.250058 4902 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.250068 4902 flags.go:64] FLAG: --healthz-port="10248" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.250078 4902 flags.go:64] FLAG: --help="false" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.250087 4902 flags.go:64] FLAG: --hostname-override="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.250095 4902 flags.go:64] FLAG: --housekeeping-interval="10s" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.250104 4902 flags.go:64] FLAG: --http-check-frequency="20s" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.250114 4902 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.250122 4902 flags.go:64] FLAG: --image-credential-provider-config="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.250132 4902 flags.go:64] FLAG: --image-gc-high-threshold="85" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.250141 4902 flags.go:64] FLAG: --image-gc-low-threshold="80" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.250150 4902 flags.go:64] FLAG: --image-service-endpoint="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.250159 4902 flags.go:64] FLAG: --kernel-memcg-notification="false" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.250167 4902 flags.go:64] FLAG: --kube-api-burst="100" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.250176 4902 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.250186 4902 flags.go:64] FLAG: --kube-api-qps="50" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.250206 4902 flags.go:64] FLAG: --kube-reserved="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.250218 4902 flags.go:64] FLAG: --kube-reserved-cgroup="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.250229 4902 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.250241 4902 flags.go:64] FLAG: --kubelet-cgroups="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.250251 4902 flags.go:64] FLAG: 
--local-storage-capacity-isolation="true" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.250263 4902 flags.go:64] FLAG: --lock-file="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.250272 4902 flags.go:64] FLAG: --log-cadvisor-usage="false" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.250281 4902 flags.go:64] FLAG: --log-flush-frequency="5s" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.250290 4902 flags.go:64] FLAG: --log-json-info-buffer-size="0" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.250304 4902 flags.go:64] FLAG: --log-json-split-stream="false" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.250313 4902 flags.go:64] FLAG: --log-text-info-buffer-size="0" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.250322 4902 flags.go:64] FLAG: --log-text-split-stream="false" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.250331 4902 flags.go:64] FLAG: --logging-format="text" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.250340 4902 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.250349 4902 flags.go:64] FLAG: --make-iptables-util-chains="true" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.250358 4902 flags.go:64] FLAG: --manifest-url="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.250366 4902 flags.go:64] FLAG: --manifest-url-header="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.250378 4902 flags.go:64] FLAG: --max-housekeeping-interval="15s" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.250387 4902 flags.go:64] FLAG: --max-open-files="1000000" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.250398 4902 flags.go:64] FLAG: --max-pods="110" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.250440 4902 flags.go:64] FLAG: --maximum-dead-containers="-1" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.250450 4902 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.250459 4902 flags.go:64] FLAG: --memory-manager-policy="None" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.250469 4902 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.250481 4902 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.250490 4902 flags.go:64] FLAG: --node-ip="192.168.126.11" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.250499 4902 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.250519 4902 flags.go:64] FLAG: --node-status-max-images="50" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.250528 4902 flags.go:64] FLAG: --node-status-update-frequency="10s" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.250538 4902 flags.go:64] FLAG: --oom-score-adj="-999" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.250547 4902 flags.go:64] FLAG: --pod-cidr="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.250555 4902 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.250568 4902 flags.go:64] FLAG: --pod-manifest-path="" Oct 09 13:50:53 crc 
kubenswrapper[4902]: I1009 13:50:53.250577 4902 flags.go:64] FLAG: --pod-max-pids="-1" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.250586 4902 flags.go:64] FLAG: --pods-per-core="0" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.250595 4902 flags.go:64] FLAG: --port="10250" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.250606 4902 flags.go:64] FLAG: --protect-kernel-defaults="false" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.250615 4902 flags.go:64] FLAG: --provider-id="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.250624 4902 flags.go:64] FLAG: --qos-reserved="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.250632 4902 flags.go:64] FLAG: --read-only-port="10255" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.250641 4902 flags.go:64] FLAG: --register-node="true" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.250650 4902 flags.go:64] FLAG: --register-schedulable="true" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.250658 4902 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.250673 4902 flags.go:64] FLAG: --registry-burst="10" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.250682 4902 flags.go:64] FLAG: --registry-qps="5" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.250691 4902 flags.go:64] FLAG: --reserved-cpus="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.250699 4902 flags.go:64] FLAG: --reserved-memory="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.250710 4902 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.250720 4902 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.250729 4902 flags.go:64] FLAG: --rotate-certificates="false" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.250738 4902 flags.go:64] FLAG: --rotate-server-certificates="false" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.250747 4902 flags.go:64] FLAG: --runonce="false" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.250755 4902 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.250764 4902 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.250774 4902 flags.go:64] FLAG: --seccomp-default="false" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.250783 4902 flags.go:64] FLAG: --serialize-image-pulls="true" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.250792 4902 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.250801 4902 flags.go:64] FLAG: --storage-driver-db="cadvisor" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.250810 4902 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.250819 4902 flags.go:64] FLAG: --storage-driver-password="root" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.250828 4902 flags.go:64] FLAG: --storage-driver-secure="false" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.250837 4902 flags.go:64] FLAG: --storage-driver-table="stats" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.250846 4902 flags.go:64] FLAG: --storage-driver-user="root" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.250855 4902 flags.go:64] FLAG: 
--streaming-connection-idle-timeout="4h0m0s" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.250864 4902 flags.go:64] FLAG: --sync-frequency="1m0s" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.250873 4902 flags.go:64] FLAG: --system-cgroups="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.250881 4902 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.250895 4902 flags.go:64] FLAG: --system-reserved-cgroup="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.250903 4902 flags.go:64] FLAG: --tls-cert-file="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.250912 4902 flags.go:64] FLAG: --tls-cipher-suites="[]" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.250923 4902 flags.go:64] FLAG: --tls-min-version="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.250932 4902 flags.go:64] FLAG: --tls-private-key-file="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.250942 4902 flags.go:64] FLAG: --topology-manager-policy="none" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.250950 4902 flags.go:64] FLAG: --topology-manager-policy-options="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.250960 4902 flags.go:64] FLAG: --topology-manager-scope="container" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.250968 4902 flags.go:64] FLAG: --v="2" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.250979 4902 flags.go:64] FLAG: --version="false" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.250991 4902 flags.go:64] FLAG: --vmodule="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.251001 4902 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.251010 4902 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.251247 4902 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.251258 4902 feature_gate.go:330] unrecognized feature gate: PlatformOperators Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.251266 4902 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.251275 4902 feature_gate.go:330] unrecognized feature gate: Example Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.251285 4902 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.251294 4902 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.251302 4902 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.251310 4902 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.251318 4902 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.251326 4902 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.251335 4902 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.251342 4902 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet 
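The --system-reserved value in the flag dump above feeds the node-allocatable calculation: allocatable = capacity − kube-reserved − system-reserved − hard-eviction thresholds. A rough memory-only sketch against the 33654128640-byte MemoryCapacity reported further down in the cAdvisor machine info; kube-reserved is empty on the command line and the eviction thresholds are set via the config file here (--eviction-hard is empty), so both are left out:

package main

// Rough node-allocatable arithmetic for memory, using values visible in this
// log: MemoryCapacity from the machine info and memory=350Mi from
// --system-reserved. Hard-eviction thresholds (configured in the kubelet
// config file on this node, not via flags) would also be subtracted.

import "fmt"

func main() {
	const (
		capacityBytes       int64 = 33654128640       // MemoryCapacity from the machine info
		systemReservedBytes int64 = 350 * 1024 * 1024 // memory=350Mi from --system-reserved
	)
	allocatable := capacityBytes - systemReservedBytes
	fmt.Printf("capacity:        %d bytes (%.1f GiB)\n", capacityBytes, float64(capacityBytes)/(1<<30))
	fmt.Printf("system-reserved: %d bytes\n", systemReservedBytes)
	fmt.Printf("allocatable (before eviction thresholds): %d bytes (%.1f GiB)\n",
		allocatable, float64(allocatable)/(1<<30))
}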
Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.251350 4902 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.251358 4902 feature_gate.go:330] unrecognized feature gate: NewOLM Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.251366 4902 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.251374 4902 feature_gate.go:330] unrecognized feature gate: SignatureStores Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.251381 4902 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.251389 4902 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.251397 4902 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.251432 4902 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.251443 4902 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.251454 4902 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.251463 4902 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.251473 4902 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.251482 4902 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.251491 4902 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.251500 4902 feature_gate.go:330] unrecognized feature gate: GatewayAPI Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.251508 4902 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.251517 4902 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.251524 4902 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.251532 4902 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.251542 4902 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.251552 4902 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.251560 4902 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.251568 4902 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.251576 4902 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.251584 4902 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.251592 4902 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.251600 4902 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.251608 4902 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.251616 4902 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.251624 4902 feature_gate.go:330] unrecognized feature gate: InsightsConfig Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.251633 4902 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.251641 4902 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.251650 4902 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.251657 4902 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.251665 4902 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.251673 4902 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.251680 4902 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.251688 4902 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.251696 4902 feature_gate.go:330] unrecognized feature gate: OVNObservability Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.251703 4902 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.251711 4902 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.251719 4902 feature_gate.go:330] unrecognized feature gate: PinnedImages Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.251727 4902 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.251734 4902 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.251741 4902 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.251750 4902 feature_gate.go:330] 
unrecognized feature gate: AWSEFSDriverVolumeMetrics Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.251757 4902 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.251765 4902 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.251772 4902 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.251780 4902 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.251788 4902 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.251796 4902 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.251804 4902 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.251812 4902 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.251819 4902 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.251827 4902 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.251836 4902 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.251844 4902 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.251852 4902 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.251875 4902 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.262822 4902 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.262870 4902 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.262958 4902 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.262966 4902 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.262971 4902 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.262975 4902 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.262980 4902 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.262985 4902 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 
13:50:53.262990 4902 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.262994 4902 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.262998 4902 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.263002 4902 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.263007 4902 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.263011 4902 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.263015 4902 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.263019 4902 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.263024 4902 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.263029 4902 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.263034 4902 feature_gate.go:330] unrecognized feature gate: InsightsConfig Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.263038 4902 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.263042 4902 feature_gate.go:330] unrecognized feature gate: Example Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.263046 4902 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.263050 4902 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.263055 4902 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.263059 4902 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.263063 4902 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.263067 4902 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.263072 4902 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.263076 4902 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.263080 4902 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.263085 4902 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.263089 4902 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.263094 4902 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.263100 4902 feature_gate.go:330] unrecognized feature gate: 
AWSEFSDriverVolumeMetrics Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.263107 4902 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.263112 4902 feature_gate.go:330] unrecognized feature gate: NewOLM Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.263119 4902 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.263128 4902 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.263134 4902 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.263139 4902 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.263143 4902 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.263148 4902 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.263153 4902 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.263157 4902 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.263163 4902 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.263169 4902 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.263174 4902 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.263178 4902 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.263183 4902 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.263188 4902 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.263192 4902 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.263197 4902 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.263201 4902 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.263206 4902 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.263210 4902 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.263215 4902 feature_gate.go:330] unrecognized feature gate: OVNObservability Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.263219 4902 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.263223 4902 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.263228 4902 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Oct 09 
13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.263232 4902 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.263237 4902 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.263241 4902 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.263246 4902 feature_gate.go:330] unrecognized feature gate: GatewayAPI Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.263252 4902 feature_gate.go:330] unrecognized feature gate: PlatformOperators Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.263257 4902 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.263263 4902 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.263267 4902 feature_gate.go:330] unrecognized feature gate: SignatureStores Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.263272 4902 feature_gate.go:330] unrecognized feature gate: PinnedImages Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.263279 4902 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.263286 4902 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.263292 4902 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.263298 4902 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.263302 4902 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.263312 4902 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.263502 4902 feature_gate.go:330] unrecognized feature gate: NewOLM Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.263513 4902 feature_gate.go:330] unrecognized feature gate: Example Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.263519 4902 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.263524 4902 feature_gate.go:330] unrecognized feature gate: PlatformOperators Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.263529 4902 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.263533 4902 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.263538 4902 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.263542 4902 
feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.263547 4902 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.263552 4902 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.263558 4902 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.263585 4902 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.263591 4902 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.263596 4902 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.263603 4902 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.263608 4902 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.263615 4902 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.263620 4902 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.263625 4902 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.263630 4902 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.263636 4902 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.263641 4902 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.263648 4902 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.263654 4902 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.263661 4902 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.263666 4902 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.263671 4902 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.263675 4902 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.263680 4902 feature_gate.go:330] unrecognized feature gate: PinnedImages Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.263684 4902 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.263689 4902 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.263694 4902 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.263698 4902 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.263703 4902 feature_gate.go:330] unrecognized feature gate: GatewayAPI Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.263707 4902 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.263712 4902 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.263717 4902 feature_gate.go:330] unrecognized feature gate: SignatureStores Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.263722 4902 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.263727 4902 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.263731 4902 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.263735 4902 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.263740 4902 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.263746 4902 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.263752 4902 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.263757 4902 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.263761 4902 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.263766 4902 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.263770 4902 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.263775 4902 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.263779 4902 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.263783 4902 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.263789 4902 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.263794 4902 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.263799 4902 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.263805 4902 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.263810 4902 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.263814 4902 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.263819 4902 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.263824 4902 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.263830 4902 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.263836 4902 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.263840 4902 feature_gate.go:330] unrecognized feature gate: OVNObservability Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.263845 4902 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.263850 4902 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.263854 4902 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.263859 4902 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.263864 4902 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.263869 4902 feature_gate.go:330] unrecognized feature gate: InsightsConfig Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.263873 4902 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.263878 4902 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.263883 4902 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.263890 4902 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.264098 4902 server.go:940] "Client rotation is on, will bootstrap in background" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.268942 4902 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.269054 4902 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". 
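The several near-identical runs of "unrecognized feature gate" warnings above come from the kubelet evaluating the same OpenShift-specific gate list at multiple points during startup; only the "feature gates: {map[...]}" lines show what was actually applied. When triaging a dump like this it helps to collapse the noise into one line per gate. A small stdlib-only sketch (a hypothetical triage helper, not part of any OpenShift tooling) that reads journal text on stdin, e.g. journalctl -u kubelet | go run gates.go (unit name assumed):

package main

// Collapse repeated "unrecognized feature gate: <Name>" warnings from a
// kubelet journal dump into one sorted line per gate, with a count.

import (
	"bufio"
	"fmt"
	"os"
	"regexp"
	"sort"
)

func main() {
	re := regexp.MustCompile(`unrecognized feature gate: (\S+)`)
	counts := map[string]int{}
	sc := bufio.NewScanner(os.Stdin)
	sc.Buffer(make([]byte, 0, 1024*1024), 1024*1024) // journal lines can be very long
	for sc.Scan() {
		for _, m := range re.FindAllStringSubmatch(sc.Text(), -1) {
			counts[m[1]]++
		}
	}
	names := make([]string, 0, len(counts))
	for n := range counts {
		names = append(names, n)
	}
	sort.Strings(names)
	for _, n := range names {
		fmt.Printf("%-55s seen %d times\n", n, counts[n])
	}
}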
Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.270708 4902 server.go:997] "Starting client certificate rotation" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.270732 4902 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.272627 4902 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2025-11-30 02:39:06.459029653 +0000 UTC Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.272713 4902 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 1236h48m13.186318513s for next certificate rotation Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.293140 4902 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.296039 4902 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.315701 4902 log.go:25] "Validated CRI v1 runtime API" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.353559 4902 log.go:25] "Validated CRI v1 image API" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.355816 4902 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.365244 4902 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2025-10-09-13-46-07-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.365301 4902 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:41 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}] Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.394684 4902 manager.go:217] Machine: {Timestamp:2025-10-09 13:50:53.390622307 +0000 UTC m=+0.588481391 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:cdf8ab83-9686-4956-8863-5dee71665ef5 BootID:0f81a8ab-8cc3-4aca-b5f1-97aedf97ca77 Filesystems:[{Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 
DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:41 Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:74:59:4f Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:74:59:4f Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:90:40:f9 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:3b:c2:4c Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:68:77:7b Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:71:95:49 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:a6:b1:2d:04:13:a5 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:12:31:04:26:87:2a Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] 
SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.395372 4902 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.395515 4902 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.396288 4902 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.396479 4902 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.396543 4902 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.398061 4902 topology_manager.go:138] "Creating topology manager with none policy" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.398089 4902 container_manager_linux.go:303] "Creating device plugin manager" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.398660 4902 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.398726 4902 server.go:66] "Creating device plugin registration server" version="v1beta1" 
socket="/var/lib/kubelet/device-plugins/kubelet.sock" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.399767 4902 state_mem.go:36] "Initialized new in-memory state store" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.399916 4902 server.go:1245] "Using root directory" path="/var/lib/kubelet" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.404332 4902 kubelet.go:418] "Attempting to sync node with API server" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.404380 4902 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.404521 4902 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.404555 4902 kubelet.go:324] "Adding apiserver pod source" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.404584 4902 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.410941 4902 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.234:6443: connect: connection refused Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.411176 4902 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1" Oct 09 13:50:53 crc kubenswrapper[4902]: E1009 13:50:53.411169 4902 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.234:6443: connect: connection refused" logger="UnhandledError" Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.410950 4902 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.234:6443: connect: connection refused Oct 09 13:50:53 crc kubenswrapper[4902]: E1009 13:50:53.411314 4902 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.234:6443: connect: connection refused" logger="UnhandledError" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.412402 4902 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". 
Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.414871 4902 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.416792 4902 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.416840 4902 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.416865 4902 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.416900 4902 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.416934 4902 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.416956 4902 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.416971 4902 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.416994 4902 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.417012 4902 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.417028 4902 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.417048 4902 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.417066 4902 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.419376 4902 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.419848 4902 server.go:1280] "Started kubelet" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.420430 4902 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.234:6443: connect: connection refused Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.420701 4902 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.421286 4902 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Oct 09 13:50:53 crc systemd[1]: Started Kubernetes Kubelet. 
Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.421634 4902 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.423270 4902 server.go:460] "Adding debug handlers to kubelet server" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.427378 4902 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.427505 4902 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.427712 4902 volume_manager.go:287] "The desired_state_of_world populator starts" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.427796 4902 volume_manager.go:289] "Starting Kubelet Volume Manager" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.428111 4902 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.428380 4902 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-04 09:19:21.327148072 +0000 UTC Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.428472 4902 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 1339h28m27.898679404s for next certificate rotation Oct 09 13:50:53 crc kubenswrapper[4902]: E1009 13:50:53.428522 4902 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.430073 4902 factory.go:55] Registering systemd factory Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.430512 4902 factory.go:221] Registration of the systemd container factory successfully Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.430558 4902 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.234:6443: connect: connection refused Oct 09 13:50:53 crc kubenswrapper[4902]: E1009 13:50:53.430750 4902 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.234:6443: connect: connection refused" logger="UnhandledError" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.430953 4902 factory.go:153] Registering CRI-O factory Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.430968 4902 factory.go:221] Registration of the crio container factory successfully Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.431041 4902 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.431080 4902 factory.go:103] Registering Raw factory Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.431105 4902 manager.go:1196] Started watching for new ooms in manager Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.431839 4902 manager.go:319] Starting recovery of all containers Oct 09 13:50:53 crc kubenswrapper[4902]: E1009 13:50:53.429350 
4902 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.234:6443: connect: connection refused" interval="200ms" Oct 09 13:50:53 crc kubenswrapper[4902]: E1009 13:50:53.430558 4902 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.234:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.186cd6e9a76efc12 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-10-09 13:50:53.419813906 +0000 UTC m=+0.617672980,LastTimestamp:2025-10-09 13:50:53.419813906 +0000 UTC m=+0.617672980,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.461655 4902 manager.go:324] Recovery completed Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.463205 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.465685 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.465716 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.465736 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.465802 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.465823 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.465841 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.465864 4902 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.465888 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.465909 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.465929 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.465948 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.465967 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.465990 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.466008 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.466029 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.466070 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.466097 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.466173 4902 reconstruct.go:130] "Volume is marked as uncertain and 
added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.466199 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.466224 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.466251 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.466277 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.466304 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.466352 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.466380 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.466450 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.468553 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.468631 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.468658 4902 reconstruct.go:130] "Volume is marked as uncertain and added 
into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.468678 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.468698 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.468751 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.468769 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.468789 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.468845 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.468865 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.468884 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.468904 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.468923 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.469023 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" 
pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.469042 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.469063 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.469081 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.469120 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.469138 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.469154 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.469173 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.469208 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.469229 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.469263 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.469312 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the 
actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.469370 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.469392 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.469467 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.472039 4902 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.472122 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.472157 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.472187 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.472211 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.472230 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.472250 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Oct 09 
13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.472288 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.472315 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.472359 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.472386 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.472443 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.472471 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.472496 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.472521 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.472597 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.472658 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.472790 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.472824 4902 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.472850 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.472921 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.472949 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.472972 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.473002 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.473027 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.473089 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.473142 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.473192 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.473222 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.473247 4902 reconstruct.go:130] 
"Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.473313 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.473396 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.473488 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.473542 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.473561 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.473580 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.473598 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.473617 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.473636 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.473654 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.473673 4902 reconstruct.go:130] "Volume is marked as uncertain and 
added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.473712 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.473730 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.473748 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.473835 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.473855 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.473875 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.473894 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.473912 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.473946 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.474028 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.474074 4902 reconstruct.go:130] "Volume is marked as uncertain and added into 
the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.474148 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.474209 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.474239 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.474252 4902 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.474260 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.474447 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.474517 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.474533 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.474545 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.474557 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.474568 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 
13:50:53.474578 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.474616 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.474627 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.474638 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.474650 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.474660 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.474670 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.474679 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.474705 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.474725 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.474734 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.474771 4902 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.474781 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.474818 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.474828 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.474837 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.474845 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.474866 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.474878 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.474890 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.474902 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.474912 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.474952 4902 reconstruct.go:130] "Volume is marked as uncertain and 
added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.474964 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.474995 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.475021 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.475033 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.475072 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.475097 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.475108 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.475120 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.475131 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.475143 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.475170 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.475180 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.475188 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.475197 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.475234 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.475244 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.475253 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.475265 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.475309 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.475320 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.475342 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.475371 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.475381 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.475390 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.475400 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.475419 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.475443 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.475452 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.475463 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.475472 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.475481 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.475508 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.475543 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.475553 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.475596 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.475605 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.475614 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.475623 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.475632 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.475650 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.475659 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.475668 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.475714 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.475730 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" 
volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.475739 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.475748 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.475785 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.475830 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.475849 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.475864 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.475882 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.475893 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.475907 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.475921 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.475932 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" 
volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.475953 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.475964 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.475973 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.475987 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.475997 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.476005 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.476015 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.476027 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.476045 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.476079 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.476090 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.476118 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.476126 4902 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.476134 4902 reconstruct.go:97] "Volume reconstruction finished" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.476142 4902 reconciler.go:26] "Reconciler: start to sync state" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.477222 4902 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.477256 4902 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.477292 4902 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.478305 4902 cpu_manager.go:225] "Starting CPU manager" policy="none" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.478314 4902 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.478341 4902 state_mem.go:36] "Initialized new in-memory state store" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.499844 4902 policy_none.go:49] "None policy: Start" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.503420 4902 memory_manager.go:170] "Starting memorymanager" policy="None" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.503461 4902 state_mem.go:35] "Initializing new in-memory state store" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.508742 4902 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.510771 4902 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.510804 4902 status_manager.go:217] "Starting to sync pod status with apiserver" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.511585 4902 kubelet.go:2335] "Starting kubelet main sync loop" Oct 09 13:50:53 crc kubenswrapper[4902]: E1009 13:50:53.511634 4902 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.514826 4902 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.234:6443: connect: connection refused Oct 09 13:50:53 crc kubenswrapper[4902]: E1009 13:50:53.515498 4902 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.234:6443: connect: connection refused" logger="UnhandledError" Oct 09 13:50:53 crc kubenswrapper[4902]: E1009 13:50:53.530088 4902 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.555342 4902 manager.go:334] "Starting Device Plugin manager" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.555490 4902 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.555506 4902 server.go:79] "Starting device plugin registration server" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.555965 4902 eviction_manager.go:189] "Eviction manager: starting control loop" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.555985 4902 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.556231 4902 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.556322 4902 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.556337 4902 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Oct 09 13:50:53 crc kubenswrapper[4902]: E1009 13:50:53.566880 4902 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.612482 4902 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc"] Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.612614 4902 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.613560 4902 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.613596 4902 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.613608 4902 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.613750 4902 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.614053 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.614108 4902 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.614571 4902 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.614618 4902 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.614630 4902 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.614792 4902 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.614902 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.614950 4902 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.614968 4902 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.615010 4902 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.615025 4902 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.616105 4902 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.616176 4902 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.616193 4902 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.616318 4902 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.616364 4902 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.616377 4902 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.616490 4902 kubelet_node_status.go:401] "Setting node annotation to enable volume controller 
attach/detach" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.616583 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.616623 4902 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.617610 4902 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.617643 4902 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.617653 4902 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.617743 4902 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.617789 4902 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.617804 4902 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.617998 4902 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.618090 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.618134 4902 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.619752 4902 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.619778 4902 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.619790 4902 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.619971 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.620006 4902 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.620527 4902 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.620558 4902 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.620573 4902 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.620762 4902 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.620799 4902 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.620814 4902 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 13:50:53 crc kubenswrapper[4902]: E1009 13:50:53.635130 4902 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.234:6443: connect: connection refused" interval="400ms" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.656109 4902 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.657731 4902 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.657799 4902 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.657818 4902 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.657864 4902 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 09 13:50:53 crc kubenswrapper[4902]: E1009 13:50:53.659229 4902 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.234:6443: connect: connection refused" node="crc" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.677861 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.677972 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.678035 4902 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.678131 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.678181 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.678250 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.678310 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.678341 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.678395 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.678467 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.678510 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.678544 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod 
\"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.678578 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.678605 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.678636 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.780679 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.780366 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.780946 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.781027 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.781345 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.781516 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" 
(UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.781572 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.781642 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.781607 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.781770 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.781816 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.781846 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.781858 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.781866 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.781906 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.781915 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: 
\"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.781942 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.781669 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.781985 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.782014 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.782019 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.782049 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.782077 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.781990 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.782079 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.782112 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.782122 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.782155 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.782051 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.782296 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.859471 4902 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.860968 4902 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.861042 4902 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.861080 4902 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.861162 4902 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 09 13:50:53 crc kubenswrapper[4902]: E1009 13:50:53.861929 4902 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.234:6443: connect: connection refused" node="crc" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.933556 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.958969 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.963031 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.982268 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 09 13:50:53 crc kubenswrapper[4902]: I1009 13:50:53.988965 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Oct 09 13:50:53 crc kubenswrapper[4902]: W1009 13:50:53.995449 4902 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-993d829c2bb599e52ca97420fc9f60bd36d3b033a592a5bcdeec2e0646899e3f WatchSource:0}: Error finding container 993d829c2bb599e52ca97420fc9f60bd36d3b033a592a5bcdeec2e0646899e3f: Status 404 returned error can't find the container with id 993d829c2bb599e52ca97420fc9f60bd36d3b033a592a5bcdeec2e0646899e3f Oct 09 13:50:54 crc kubenswrapper[4902]: E1009 13:50:54.036444 4902 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.234:6443: connect: connection refused" interval="800ms" Oct 09 13:50:54 crc kubenswrapper[4902]: I1009 13:50:54.262225 4902 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 13:50:54 crc kubenswrapper[4902]: I1009 13:50:54.264038 4902 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 13:50:54 crc kubenswrapper[4902]: I1009 13:50:54.264092 4902 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 13:50:54 crc kubenswrapper[4902]: I1009 13:50:54.264105 4902 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 13:50:54 crc kubenswrapper[4902]: I1009 13:50:54.264134 4902 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 09 13:50:54 crc kubenswrapper[4902]: E1009 13:50:54.264671 4902 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.234:6443: connect: connection refused" node="crc" Oct 09 13:50:54 crc kubenswrapper[4902]: W1009 13:50:54.283570 4902 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.234:6443: connect: connection refused Oct 09 13:50:54 crc kubenswrapper[4902]: E1009 13:50:54.283677 4902 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.234:6443: connect: connection refused" logger="UnhandledError" Oct 09 13:50:54 crc kubenswrapper[4902]: I1009 13:50:54.421869 4902 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.234:6443: connect: connection refused Oct 09 13:50:54 crc kubenswrapper[4902]: W1009 13:50:54.436467 4902 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.234:6443: connect: connection refused Oct 09 13:50:54 crc kubenswrapper[4902]: E1009 13:50:54.436537 4902 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list 
*v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.234:6443: connect: connection refused" logger="UnhandledError" Oct 09 13:50:54 crc kubenswrapper[4902]: I1009 13:50:54.517158 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"f3c785ec6fcc1743204d4be95b7fb73929f09e3c7ca896966ba0f4b5eaca269c"} Oct 09 13:50:54 crc kubenswrapper[4902]: I1009 13:50:54.518516 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"41b164ccf89a04a8077683095833186515ee061427f0a406f73cb6284c4c84ea"} Oct 09 13:50:54 crc kubenswrapper[4902]: I1009 13:50:54.523006 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"2d48faccbd2ca87fd6890ea0f55ec3cb07c842018a153c77c16e67f309c77fe0"} Oct 09 13:50:54 crc kubenswrapper[4902]: I1009 13:50:54.524286 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"3324cf1c0cddf746a84b4d834616daf7255099486feecd0425c019486c2be020"} Oct 09 13:50:54 crc kubenswrapper[4902]: I1009 13:50:54.525109 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"993d829c2bb599e52ca97420fc9f60bd36d3b033a592a5bcdeec2e0646899e3f"} Oct 09 13:50:54 crc kubenswrapper[4902]: W1009 13:50:54.549703 4902 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.234:6443: connect: connection refused Oct 09 13:50:54 crc kubenswrapper[4902]: E1009 13:50:54.549844 4902 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.234:6443: connect: connection refused" logger="UnhandledError" Oct 09 13:50:54 crc kubenswrapper[4902]: W1009 13:50:54.668670 4902 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.234:6443: connect: connection refused Oct 09 13:50:54 crc kubenswrapper[4902]: E1009 13:50:54.668853 4902 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.234:6443: connect: connection refused" logger="UnhandledError" Oct 09 13:50:54 crc kubenswrapper[4902]: E1009 13:50:54.838231 4902 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 
38.102.83.234:6443: connect: connection refused" interval="1.6s" Oct 09 13:50:55 crc kubenswrapper[4902]: I1009 13:50:55.065003 4902 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 13:50:55 crc kubenswrapper[4902]: I1009 13:50:55.066569 4902 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 13:50:55 crc kubenswrapper[4902]: I1009 13:50:55.066652 4902 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 13:50:55 crc kubenswrapper[4902]: I1009 13:50:55.066671 4902 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 13:50:55 crc kubenswrapper[4902]: I1009 13:50:55.066710 4902 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 09 13:50:55 crc kubenswrapper[4902]: E1009 13:50:55.067376 4902 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.234:6443: connect: connection refused" node="crc" Oct 09 13:50:55 crc kubenswrapper[4902]: I1009 13:50:55.421846 4902 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.234:6443: connect: connection refused Oct 09 13:50:55 crc kubenswrapper[4902]: I1009 13:50:55.531779 4902 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="e7cf41ee52bad91ad430c8c1b6c7a32a0484acc01e0d4f5f33dc8abdd36dbc83" exitCode=0 Oct 09 13:50:55 crc kubenswrapper[4902]: I1009 13:50:55.531906 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"e7cf41ee52bad91ad430c8c1b6c7a32a0484acc01e0d4f5f33dc8abdd36dbc83"} Oct 09 13:50:55 crc kubenswrapper[4902]: I1009 13:50:55.532069 4902 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 13:50:55 crc kubenswrapper[4902]: I1009 13:50:55.534995 4902 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 13:50:55 crc kubenswrapper[4902]: I1009 13:50:55.535053 4902 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 13:50:55 crc kubenswrapper[4902]: I1009 13:50:55.535067 4902 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 13:50:55 crc kubenswrapper[4902]: I1009 13:50:55.536936 4902 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="98bc8a281fa87260bb1c9c241491128c2a8a9a044771965f840006de53e5658f" exitCode=0 Oct 09 13:50:55 crc kubenswrapper[4902]: I1009 13:50:55.537035 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"98bc8a281fa87260bb1c9c241491128c2a8a9a044771965f840006de53e5658f"} Oct 09 13:50:55 crc kubenswrapper[4902]: I1009 13:50:55.537092 4902 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 13:50:55 crc kubenswrapper[4902]: I1009 13:50:55.538622 4902 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientMemory" Oct 09 13:50:55 crc kubenswrapper[4902]: I1009 13:50:55.538673 4902 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 13:50:55 crc kubenswrapper[4902]: I1009 13:50:55.538692 4902 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 13:50:55 crc kubenswrapper[4902]: I1009 13:50:55.540318 4902 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="336328529b573fa620d89dd9b35876c441780b4ad399065ff7be3cbfd2f41e66" exitCode=0 Oct 09 13:50:55 crc kubenswrapper[4902]: I1009 13:50:55.540608 4902 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 13:50:55 crc kubenswrapper[4902]: I1009 13:50:55.540928 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"336328529b573fa620d89dd9b35876c441780b4ad399065ff7be3cbfd2f41e66"} Oct 09 13:50:55 crc kubenswrapper[4902]: I1009 13:50:55.544016 4902 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 13:50:55 crc kubenswrapper[4902]: I1009 13:50:55.544063 4902 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 13:50:55 crc kubenswrapper[4902]: I1009 13:50:55.544075 4902 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 13:50:55 crc kubenswrapper[4902]: I1009 13:50:55.547624 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"d565a380c28c84e49e3d59625b61ff4c551c11ee4b137b1447005a1127b527cf"} Oct 09 13:50:55 crc kubenswrapper[4902]: I1009 13:50:55.547675 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"0d34afa6885744750f7be2aca8cac77dd920a434dfd577bbbcd961a4c45595a1"} Oct 09 13:50:55 crc kubenswrapper[4902]: I1009 13:50:55.547689 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"3eade4433c62920ab3fd348633aee95ebe54a5f547b05898dbfeb09da6216ff9"} Oct 09 13:50:55 crc kubenswrapper[4902]: I1009 13:50:55.547704 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"62c6e61d996fc57028ccd9f85284b9cee1b860614daa4255076c11de63811e07"} Oct 09 13:50:55 crc kubenswrapper[4902]: I1009 13:50:55.547690 4902 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 13:50:55 crc kubenswrapper[4902]: I1009 13:50:55.549622 4902 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 13:50:55 crc kubenswrapper[4902]: I1009 13:50:55.549678 4902 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 13:50:55 crc kubenswrapper[4902]: I1009 13:50:55.549693 
4902 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 13:50:55 crc kubenswrapper[4902]: I1009 13:50:55.551357 4902 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="92329c4c1ce95b34b952697f3a48a0da94ceff26f0bf610b923b00b4e6c632cf" exitCode=0 Oct 09 13:50:55 crc kubenswrapper[4902]: I1009 13:50:55.551472 4902 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 13:50:55 crc kubenswrapper[4902]: I1009 13:50:55.551466 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"92329c4c1ce95b34b952697f3a48a0da94ceff26f0bf610b923b00b4e6c632cf"} Oct 09 13:50:55 crc kubenswrapper[4902]: I1009 13:50:55.552676 4902 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 13:50:55 crc kubenswrapper[4902]: I1009 13:50:55.552698 4902 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 13:50:55 crc kubenswrapper[4902]: I1009 13:50:55.552708 4902 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 13:50:55 crc kubenswrapper[4902]: I1009 13:50:55.554679 4902 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 13:50:55 crc kubenswrapper[4902]: I1009 13:50:55.555452 4902 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 13:50:55 crc kubenswrapper[4902]: I1009 13:50:55.555479 4902 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 13:50:55 crc kubenswrapper[4902]: I1009 13:50:55.555489 4902 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 13:50:56 crc kubenswrapper[4902]: W1009 13:50:56.376787 4902 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.234:6443: connect: connection refused Oct 09 13:50:56 crc kubenswrapper[4902]: E1009 13:50:56.376891 4902 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.234:6443: connect: connection refused" logger="UnhandledError" Oct 09 13:50:56 crc kubenswrapper[4902]: W1009 13:50:56.406574 4902 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.234:6443: connect: connection refused Oct 09 13:50:56 crc kubenswrapper[4902]: E1009 13:50:56.406696 4902 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.234:6443: connect: connection refused" logger="UnhandledError" Oct 09 13:50:56 crc 
kubenswrapper[4902]: I1009 13:50:56.421455 4902 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.234:6443: connect: connection refused Oct 09 13:50:56 crc kubenswrapper[4902]: E1009 13:50:56.439610 4902 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.234:6443: connect: connection refused" interval="3.2s" Oct 09 13:50:56 crc kubenswrapper[4902]: I1009 13:50:56.563770 4902 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="d6c2214871f6eecfacd86a956b2bb962815374eb365e6dc4ff4c053d716ac33b" exitCode=0 Oct 09 13:50:56 crc kubenswrapper[4902]: I1009 13:50:56.563813 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"d6c2214871f6eecfacd86a956b2bb962815374eb365e6dc4ff4c053d716ac33b"} Oct 09 13:50:56 crc kubenswrapper[4902]: I1009 13:50:56.563950 4902 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 13:50:56 crc kubenswrapper[4902]: I1009 13:50:56.565006 4902 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 13:50:56 crc kubenswrapper[4902]: I1009 13:50:56.565036 4902 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 13:50:56 crc kubenswrapper[4902]: I1009 13:50:56.565047 4902 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 13:50:56 crc kubenswrapper[4902]: I1009 13:50:56.567467 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"dc21f0198c55aec94875d28a30260d70eb1d824135b17c7181a54d0eee5aa696"} Oct 09 13:50:56 crc kubenswrapper[4902]: I1009 13:50:56.567562 4902 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 13:50:56 crc kubenswrapper[4902]: I1009 13:50:56.568269 4902 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 13:50:56 crc kubenswrapper[4902]: I1009 13:50:56.568290 4902 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 13:50:56 crc kubenswrapper[4902]: I1009 13:50:56.568299 4902 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 13:50:56 crc kubenswrapper[4902]: I1009 13:50:56.572873 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"82ea3fc83562759194757ca7a37b00885ad1b51de863f3c7d8dca1dd249dcd57"} Oct 09 13:50:56 crc kubenswrapper[4902]: I1009 13:50:56.572905 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"c6080ba67fd4a5766831c0dad35ff01eff3f7bee94822170a4fb9cc578f1719f"} Oct 09 13:50:56 crc kubenswrapper[4902]: I1009 
13:50:56.572915 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"71c442146925badd786d4ea43167eeb0a6475cd3be09a32f512dc4b5a691c7e5"} Oct 09 13:50:56 crc kubenswrapper[4902]: I1009 13:50:56.572982 4902 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 13:50:56 crc kubenswrapper[4902]: I1009 13:50:56.574950 4902 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 13:50:56 crc kubenswrapper[4902]: I1009 13:50:56.574975 4902 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 13:50:56 crc kubenswrapper[4902]: I1009 13:50:56.574983 4902 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 13:50:56 crc kubenswrapper[4902]: I1009 13:50:56.582460 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"068eb68236db7e1816c9050c84c73b443303d5b33aeb3903d535713e206ea81b"} Oct 09 13:50:56 crc kubenswrapper[4902]: I1009 13:50:56.582499 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"2ef6f7109f7fbc85c4ca76ab10ab6bd6d1e468603d4a9e574860cf87c9131516"} Oct 09 13:50:56 crc kubenswrapper[4902]: I1009 13:50:56.582512 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"33afc13cb49c16b4bb2da343a51e2f2a2bea347937b5406853bc668f5e7bebf4"} Oct 09 13:50:56 crc kubenswrapper[4902]: I1009 13:50:56.582521 4902 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 13:50:56 crc kubenswrapper[4902]: I1009 13:50:56.582526 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"ee1d2a158f73255d3f748df9054d82134fac0cf488214dd4f6356a2f1735fa98"} Oct 09 13:50:56 crc kubenswrapper[4902]: I1009 13:50:56.583707 4902 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 13:50:56 crc kubenswrapper[4902]: I1009 13:50:56.583767 4902 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 13:50:56 crc kubenswrapper[4902]: I1009 13:50:56.583783 4902 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 13:50:56 crc kubenswrapper[4902]: I1009 13:50:56.668088 4902 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 13:50:56 crc kubenswrapper[4902]: I1009 13:50:56.673742 4902 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 13:50:56 crc kubenswrapper[4902]: I1009 13:50:56.673788 4902 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 13:50:56 crc kubenswrapper[4902]: I1009 13:50:56.673806 4902 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Oct 09 13:50:56 crc kubenswrapper[4902]: I1009 13:50:56.673833 4902 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 09 13:50:56 crc kubenswrapper[4902]: E1009 13:50:56.674353 4902 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.234:6443: connect: connection refused" node="crc" Oct 09 13:50:56 crc kubenswrapper[4902]: W1009 13:50:56.903714 4902 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.234:6443: connect: connection refused Oct 09 13:50:56 crc kubenswrapper[4902]: E1009 13:50:56.903804 4902 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.234:6443: connect: connection refused" logger="UnhandledError" Oct 09 13:50:57 crc kubenswrapper[4902]: W1009 13:50:57.092020 4902 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.234:6443: connect: connection refused Oct 09 13:50:57 crc kubenswrapper[4902]: E1009 13:50:57.092139 4902 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.234:6443: connect: connection refused" logger="UnhandledError" Oct 09 13:50:57 crc kubenswrapper[4902]: I1009 13:50:57.586698 4902 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="7be9fb4fa25ebe19bbc13a915cb004341bb51547fe20a014c59da04431e25484" exitCode=0 Oct 09 13:50:57 crc kubenswrapper[4902]: I1009 13:50:57.586773 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"7be9fb4fa25ebe19bbc13a915cb004341bb51547fe20a014c59da04431e25484"} Oct 09 13:50:57 crc kubenswrapper[4902]: I1009 13:50:57.586792 4902 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 13:50:57 crc kubenswrapper[4902]: I1009 13:50:57.587955 4902 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 13:50:57 crc kubenswrapper[4902]: I1009 13:50:57.587990 4902 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 13:50:57 crc kubenswrapper[4902]: I1009 13:50:57.588002 4902 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 13:50:57 crc kubenswrapper[4902]: I1009 13:50:57.591119 4902 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 09 13:50:57 crc kubenswrapper[4902]: I1009 13:50:57.591134 4902 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 13:50:57 crc kubenswrapper[4902]: I1009 13:50:57.591150 4902 kubelet_node_status.go:401] "Setting node annotation to enable 
volume controller attach/detach" Oct 09 13:50:57 crc kubenswrapper[4902]: I1009 13:50:57.591152 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"7fc4f1f35b8ea736948d193ba4cd302f544ee0a946080a37b1ac68a8256c71f9"} Oct 09 13:50:57 crc kubenswrapper[4902]: I1009 13:50:57.591300 4902 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 13:50:57 crc kubenswrapper[4902]: I1009 13:50:57.591977 4902 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 13:50:57 crc kubenswrapper[4902]: I1009 13:50:57.592005 4902 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 13:50:57 crc kubenswrapper[4902]: I1009 13:50:57.592015 4902 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 13:50:57 crc kubenswrapper[4902]: I1009 13:50:57.592660 4902 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 13:50:57 crc kubenswrapper[4902]: I1009 13:50:57.592685 4902 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 13:50:57 crc kubenswrapper[4902]: I1009 13:50:57.592716 4902 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 13:50:57 crc kubenswrapper[4902]: I1009 13:50:57.592734 4902 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 13:50:57 crc kubenswrapper[4902]: I1009 13:50:57.592801 4902 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 13:50:57 crc kubenswrapper[4902]: I1009 13:50:57.592816 4902 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 13:50:58 crc kubenswrapper[4902]: I1009 13:50:58.600113 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"a02c67b01c587d22709968939f075c03cade5a06e955b7930826811cd10452dc"} Oct 09 13:50:58 crc kubenswrapper[4902]: I1009 13:50:58.600172 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"c4affb65427006cf0c0f0f13bd35be9ae8fd913a866fef32f96db408559b1709"} Oct 09 13:50:58 crc kubenswrapper[4902]: I1009 13:50:58.600193 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"98dc7fe9418570351647255f78a18bc6f72da3a8edb045968b3adcd91de76eef"} Oct 09 13:50:58 crc kubenswrapper[4902]: I1009 13:50:58.600211 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"55f958eb865e1564197d77e4d3114018876a8dc0a94b1a18ef9c223b08372b20"} Oct 09 13:50:58 crc kubenswrapper[4902]: I1009 13:50:58.600219 4902 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 13:50:58 crc kubenswrapper[4902]: I1009 13:50:58.600247 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 09 13:50:58 crc kubenswrapper[4902]: I1009 13:50:58.601124 4902 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 13:50:58 crc kubenswrapper[4902]: I1009 13:50:58.601162 4902 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 13:50:58 crc kubenswrapper[4902]: I1009 13:50:58.601176 4902 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 13:50:59 crc kubenswrapper[4902]: I1009 13:50:59.355436 4902 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 09 13:50:59 crc kubenswrapper[4902]: I1009 13:50:59.609877 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"1fea213c0d33a5605cbec42814ccf493a61a06690756ff43e87adf27556456b7"} Oct 09 13:50:59 crc kubenswrapper[4902]: I1009 13:50:59.610012 4902 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 13:50:59 crc kubenswrapper[4902]: I1009 13:50:59.610136 4902 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 13:50:59 crc kubenswrapper[4902]: I1009 13:50:59.611250 4902 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 13:50:59 crc kubenswrapper[4902]: I1009 13:50:59.611329 4902 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 13:50:59 crc kubenswrapper[4902]: I1009 13:50:59.611347 4902 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 13:50:59 crc kubenswrapper[4902]: I1009 13:50:59.612377 4902 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 13:50:59 crc kubenswrapper[4902]: I1009 13:50:59.612448 4902 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 13:50:59 crc kubenswrapper[4902]: I1009 13:50:59.612466 4902 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 13:50:59 crc kubenswrapper[4902]: I1009 13:50:59.874959 4902 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 13:50:59 crc kubenswrapper[4902]: I1009 13:50:59.876944 4902 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 13:50:59 crc kubenswrapper[4902]: I1009 13:50:59.877008 4902 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 13:50:59 crc kubenswrapper[4902]: I1009 13:50:59.877033 4902 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 13:50:59 crc kubenswrapper[4902]: I1009 13:50:59.877081 4902 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 09 13:51:00 crc kubenswrapper[4902]: I1009 13:51:00.058770 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 09 13:51:00 crc kubenswrapper[4902]: I1009 13:51:00.177455 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 09 13:51:00 crc kubenswrapper[4902]: I1009 13:51:00.177634 4902 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 13:51:00 crc kubenswrapper[4902]: I1009 13:51:00.178795 4902 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 13:51:00 crc kubenswrapper[4902]: I1009 13:51:00.178826 4902 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 13:51:00 crc kubenswrapper[4902]: I1009 13:51:00.178835 4902 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 13:51:00 crc kubenswrapper[4902]: I1009 13:51:00.613273 4902 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 13:51:00 crc kubenswrapper[4902]: I1009 13:51:00.613297 4902 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 13:51:00 crc kubenswrapper[4902]: I1009 13:51:00.614937 4902 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 13:51:00 crc kubenswrapper[4902]: I1009 13:51:00.615006 4902 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 13:51:00 crc kubenswrapper[4902]: I1009 13:51:00.615023 4902 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 13:51:00 crc kubenswrapper[4902]: I1009 13:51:00.615279 4902 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 13:51:00 crc kubenswrapper[4902]: I1009 13:51:00.615331 4902 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 13:51:00 crc kubenswrapper[4902]: I1009 13:51:00.615345 4902 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 13:51:01 crc kubenswrapper[4902]: I1009 13:51:01.615178 4902 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 13:51:01 crc kubenswrapper[4902]: I1009 13:51:01.616283 4902 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 13:51:01 crc kubenswrapper[4902]: I1009 13:51:01.616317 4902 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 13:51:01 crc kubenswrapper[4902]: I1009 13:51:01.616325 4902 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 13:51:01 crc kubenswrapper[4902]: I1009 13:51:01.869481 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 09 13:51:01 crc kubenswrapper[4902]: I1009 13:51:01.869652 4902 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 13:51:01 crc kubenswrapper[4902]: I1009 13:51:01.871116 4902 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 13:51:01 crc kubenswrapper[4902]: I1009 13:51:01.871184 4902 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 13:51:01 crc kubenswrapper[4902]: I1009 
13:51:01.871203 4902 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 13:51:02 crc kubenswrapper[4902]: I1009 13:51:02.086351 4902 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 09 13:51:02 crc kubenswrapper[4902]: I1009 13:51:02.511235 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 09 13:51:02 crc kubenswrapper[4902]: I1009 13:51:02.511540 4902 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 13:51:02 crc kubenswrapper[4902]: I1009 13:51:02.512812 4902 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 13:51:02 crc kubenswrapper[4902]: I1009 13:51:02.512875 4902 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 13:51:02 crc kubenswrapper[4902]: I1009 13:51:02.512896 4902 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 13:51:02 crc kubenswrapper[4902]: I1009 13:51:02.617214 4902 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 13:51:02 crc kubenswrapper[4902]: I1009 13:51:02.617945 4902 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 13:51:02 crc kubenswrapper[4902]: I1009 13:51:02.617985 4902 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 13:51:02 crc kubenswrapper[4902]: I1009 13:51:02.618001 4902 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 13:51:02 crc kubenswrapper[4902]: I1009 13:51:02.678525 4902 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Oct 09 13:51:02 crc kubenswrapper[4902]: I1009 13:51:02.678819 4902 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 13:51:02 crc kubenswrapper[4902]: I1009 13:51:02.679878 4902 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 13:51:02 crc kubenswrapper[4902]: I1009 13:51:02.679907 4902 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 13:51:02 crc kubenswrapper[4902]: I1009 13:51:02.679916 4902 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 13:51:03 crc kubenswrapper[4902]: I1009 13:51:03.359536 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Oct 09 13:51:03 crc kubenswrapper[4902]: E1009 13:51:03.567099 4902 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Oct 09 13:51:03 crc kubenswrapper[4902]: I1009 13:51:03.619865 4902 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 13:51:03 crc kubenswrapper[4902]: I1009 13:51:03.621127 4902 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 13:51:03 crc kubenswrapper[4902]: I1009 13:51:03.621171 4902 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasNoDiskPressure" Oct 09 13:51:03 crc kubenswrapper[4902]: I1009 13:51:03.621181 4902 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 13:51:04 crc kubenswrapper[4902]: I1009 13:51:04.191736 4902 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 09 13:51:04 crc kubenswrapper[4902]: I1009 13:51:04.192345 4902 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 13:51:04 crc kubenswrapper[4902]: I1009 13:51:04.194122 4902 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 13:51:04 crc kubenswrapper[4902]: I1009 13:51:04.194179 4902 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 13:51:04 crc kubenswrapper[4902]: I1009 13:51:04.194191 4902 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 13:51:04 crc kubenswrapper[4902]: I1009 13:51:04.197754 4902 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 09 13:51:04 crc kubenswrapper[4902]: I1009 13:51:04.623054 4902 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 13:51:04 crc kubenswrapper[4902]: I1009 13:51:04.624482 4902 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 13:51:04 crc kubenswrapper[4902]: I1009 13:51:04.624555 4902 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 13:51:04 crc kubenswrapper[4902]: I1009 13:51:04.624566 4902 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 13:51:04 crc kubenswrapper[4902]: I1009 13:51:04.629611 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 09 13:51:05 crc kubenswrapper[4902]: I1009 13:51:05.086688 4902 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Oct 09 13:51:05 crc kubenswrapper[4902]: I1009 13:51:05.086806 4902 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Oct 09 13:51:05 crc kubenswrapper[4902]: I1009 13:51:05.628527 4902 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 13:51:05 crc kubenswrapper[4902]: I1009 13:51:05.629801 4902 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 13:51:05 crc kubenswrapper[4902]: I1009 13:51:05.629869 4902 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 09 13:51:05 crc kubenswrapper[4902]: I1009 13:51:05.629892 4902 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 13:51:07 crc kubenswrapper[4902]: I1009 13:51:07.422784 4902 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Oct 09 13:51:07 crc kubenswrapper[4902]: I1009 13:51:07.635485 4902 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Oct 09 13:51:07 crc kubenswrapper[4902]: I1009 13:51:07.635585 4902 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Oct 09 13:51:07 crc kubenswrapper[4902]: I1009 13:51:07.636176 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Oct 09 13:51:07 crc kubenswrapper[4902]: I1009 13:51:07.638013 4902 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="7fc4f1f35b8ea736948d193ba4cd302f544ee0a946080a37b1ac68a8256c71f9" exitCode=255 Oct 09 13:51:07 crc kubenswrapper[4902]: I1009 13:51:07.638065 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"7fc4f1f35b8ea736948d193ba4cd302f544ee0a946080a37b1ac68a8256c71f9"} Oct 09 13:51:07 crc kubenswrapper[4902]: I1009 13:51:07.638209 4902 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 13:51:07 crc kubenswrapper[4902]: I1009 13:51:07.639067 4902 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 13:51:07 crc kubenswrapper[4902]: I1009 13:51:07.639110 4902 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 13:51:07 crc kubenswrapper[4902]: I1009 13:51:07.639122 4902 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 13:51:07 crc kubenswrapper[4902]: I1009 13:51:07.639749 4902 scope.go:117] "RemoveContainer" containerID="7fc4f1f35b8ea736948d193ba4cd302f544ee0a946080a37b1ac68a8256c71f9" Oct 09 13:51:07 crc kubenswrapper[4902]: I1009 13:51:07.648711 4902 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Oct 09 13:51:07 crc kubenswrapper[4902]: I1009 13:51:07.648789 4902 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Oct 09 13:51:08 crc kubenswrapper[4902]: I1009 13:51:08.644299 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Oct 09 13:51:08 crc kubenswrapper[4902]: I1009 13:51:08.646820 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"801afd8fea3db4a9b2864ac031b128c77ffb050da8e948d01378e85a22f075a0"} Oct 09 13:51:08 crc kubenswrapper[4902]: I1009 13:51:08.646999 4902 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 13:51:08 crc kubenswrapper[4902]: I1009 13:51:08.648224 4902 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 13:51:08 crc kubenswrapper[4902]: I1009 13:51:08.648308 4902 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 13:51:08 crc kubenswrapper[4902]: I1009 13:51:08.648336 4902 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 13:51:09 crc kubenswrapper[4902]: I1009 13:51:09.364854 4902 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 09 13:51:09 crc kubenswrapper[4902]: I1009 13:51:09.650967 4902 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 13:51:09 crc kubenswrapper[4902]: I1009 13:51:09.651370 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 09 13:51:09 crc kubenswrapper[4902]: I1009 13:51:09.656045 4902 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 13:51:09 crc kubenswrapper[4902]: I1009 13:51:09.657318 4902 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 13:51:09 crc kubenswrapper[4902]: I1009 13:51:09.657355 4902 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 13:51:09 crc kubenswrapper[4902]: I1009 13:51:09.661819 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 09 13:51:10 crc kubenswrapper[4902]: I1009 13:51:10.653865 4902 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 13:51:10 crc kubenswrapper[4902]: I1009 13:51:10.655115 4902 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 13:51:10 crc kubenswrapper[4902]: I1009 13:51:10.655225 4902 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 13:51:10 crc kubenswrapper[4902]: I1009 13:51:10.655299 4902 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 13:51:11 crc kubenswrapper[4902]: I1009 13:51:11.657732 4902 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 13:51:11 crc kubenswrapper[4902]: 
I1009 13:51:11.658929 4902 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 13:51:11 crc kubenswrapper[4902]: I1009 13:51:11.658966 4902 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 13:51:11 crc kubenswrapper[4902]: I1009 13:51:11.658980 4902 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 13:51:12 crc kubenswrapper[4902]: E1009 13:51:12.651962 4902 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="6.4s" Oct 09 13:51:12 crc kubenswrapper[4902]: I1009 13:51:12.659713 4902 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Oct 09 13:51:12 crc kubenswrapper[4902]: I1009 13:51:12.659763 4902 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Oct 09 13:51:12 crc kubenswrapper[4902]: I1009 13:51:12.661335 4902 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Oct 09 13:51:12 crc kubenswrapper[4902]: I1009 13:51:12.661568 4902 trace.go:236] Trace[1173631917]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (09-Oct-2025 13:51:00.414) (total time: 12246ms): Oct 09 13:51:12 crc kubenswrapper[4902]: Trace[1173631917]: ---"Objects listed" error: 12246ms (13:51:12.661) Oct 09 13:51:12 crc kubenswrapper[4902]: Trace[1173631917]: [12.246492963s] [12.246492963s] END Oct 09 13:51:12 crc kubenswrapper[4902]: I1009 13:51:12.661705 4902 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Oct 09 13:51:12 crc kubenswrapper[4902]: I1009 13:51:12.661873 4902 trace.go:236] Trace[1046592579]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (09-Oct-2025 13:51:01.761) (total time: 10900ms): Oct 09 13:51:12 crc kubenswrapper[4902]: Trace[1046592579]: ---"Objects listed" error: 10900ms (13:51:12.661) Oct 09 13:51:12 crc kubenswrapper[4902]: Trace[1046592579]: [10.900216365s] [10.900216365s] END Oct 09 13:51:12 crc kubenswrapper[4902]: I1009 13:51:12.661904 4902 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Oct 09 13:51:12 crc kubenswrapper[4902]: E1009 13:51:12.664633 4902 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc" Oct 09 13:51:12 crc kubenswrapper[4902]: I1009 13:51:12.713559 4902 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Oct 09 13:51:12 crc kubenswrapper[4902]: I1009 13:51:12.731395 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Oct 09 13:51:12 crc kubenswrapper[4902]: I1009 13:51:12.731512 4902 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 09 13:51:12 crc kubenswrapper[4902]: I1009 13:51:12.745092 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.416733 4902 apiserver.go:52] "Watching apiserver" Oct 09 
13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.420001 4902 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.420437 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-etcd/etcd-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c"] Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.420774 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.420981 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.420998 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 13:51:13 crc kubenswrapper[4902]: E1009 13:51:13.421052 4902 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.421143 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 09 13:51:13 crc kubenswrapper[4902]: E1009 13:51:13.421263 4902 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.421435 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.421615 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 09 13:51:13 crc kubenswrapper[4902]: E1009 13:51:13.421625 4902 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.422427 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.422520 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.423426 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.423792 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.424116 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.424151 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.424191 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.424574 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.426318 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.428962 4902 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.446770 4902 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.459214 4902 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.466192 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.466230 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.466251 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.466267 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.466298 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.466342 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.466366 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Oct 09 13:51:13 crc 
kubenswrapper[4902]: I1009 13:51:13.466381 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.466400 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.466449 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.466468 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.466485 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.466501 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.466566 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.466581 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.466597 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.466614 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: 
\"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.466633 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.466649 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.466683 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.466700 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.466734 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.466753 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.466770 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.466786 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.466804 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.466826 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: 
\"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.466847 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.466869 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.466892 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.466914 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.466936 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.466963 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.466988 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.467010 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.467032 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.467050 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.467099 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.467090 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.467133 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.467119 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.467224 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.467261 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.467278 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.467323 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.467329 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.467351 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.467375 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.467399 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.467419 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.467441 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.467467 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.467492 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.467520 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.467552 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.467482 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.467581 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.467562 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.467606 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.467630 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.467657 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.467681 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.467686 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.467705 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.467729 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.467751 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.467773 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.467798 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.467820 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.467845 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.467848 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.467868 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.467875 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.467893 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.467917 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.467922 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.467943 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.467949 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.467949 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.467993 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.468083 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.468224 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.468258 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.468397 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.468481 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.468509 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.468564 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.468585 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.468608 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.468628 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.468678 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.468701 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.468747 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.468221 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.468765 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.468293 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.468439 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.468419 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.468467 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.468717 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.468864 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.468743 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.468758 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.468789 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.468788 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.468886 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.468975 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.469029 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.469038 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.469070 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.469075 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.469075 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.469128 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.469149 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.469169 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.469247 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.470170 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.470300 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.470430 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.470469 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.470502 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.470536 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.470548 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.470563 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.470587 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.470592 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.470750 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.471419 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.471489 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.471528 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.471559 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.471594 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.471634 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.471663 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.471699 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: 
\"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.471731 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.471800 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.471832 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.471865 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.471895 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.471920 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.471952 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.471982 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.472006 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.472035 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") 
" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.472065 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.472095 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.472120 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.472153 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.472188 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.472218 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.472253 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.472282 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.472312 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.472335 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" 
(UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.472357 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.472386 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.472453 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.472482 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.472516 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.472543 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.472570 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.472596 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.472624 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.472662 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.472688 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.472723 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.472751 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.472779 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.472892 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.472927 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.472962 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.472999 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.473021 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.473054 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: 
\"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.473086 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.473115 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.473148 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.473184 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.473218 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.473250 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.473280 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.473304 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.473321 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.473343 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" 
(UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.473365 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.473389 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.473425 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.473450 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.473470 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.473505 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.473526 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.473547 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.473602 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.473626 4902 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.473647 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.473913 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.473977 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.474028 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.474052 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.474119 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.474145 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.474203 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.474301 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.474373 4902 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.474432 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.474460 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.474510 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.474546 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.475018 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.475082 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.475132 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.475179 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.475211 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Oct 09 13:51:13 crc 
kubenswrapper[4902]: I1009 13:51:13.475256 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.475297 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.475338 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.475368 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.475424 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.475459 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.475491 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.475529 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.475565 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.475597 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Oct 09 13:51:13 crc 
kubenswrapper[4902]: I1009 13:51:13.475621 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.475653 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.475684 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.475893 4902 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.470671 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.479551 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.470749 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.471635 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.471809 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.471854 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.472161 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.472066 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.472192 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.472542 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.472599 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.472780 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.473035 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.473157 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.473323 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.473342 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.473545 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.473671 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.473742 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.479817 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.473851 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.474066 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.474502 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.474895 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.474886 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.475263 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.475339 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.475382 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). 
InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.475681 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.475699 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.476149 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.476316 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.476283 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.469316 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.476458 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.476786 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). 
InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.476803 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.476887 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.477049 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.477280 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.477289 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.477295 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.477602 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.477816 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.478311 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.478522 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.478730 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.478750 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.478883 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.479155 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.479463 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.480403 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.481032 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.481090 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.481098 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.481110 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.481882 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.481924 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.481952 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.482106 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.482130 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.482149 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.482844 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.484523 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.483175 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.483156 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.483315 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.483509 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.483545 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.483588 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.483610 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.483761 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.483725 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.484778 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.483825 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.483863 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.484068 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.484798 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.484395 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.484692 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.484923 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.482297 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.485055 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.485133 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.482945 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.485292 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.485736 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.485754 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.485944 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.485971 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.484493 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.486073 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.486254 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.486304 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.486364 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.486664 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.486664 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.486765 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.487055 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.487060 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.487215 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.487355 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.487381 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.487450 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.487546 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.487464 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.487589 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.487619 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.487641 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: E1009 13:51:13.487655 4902 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 13:51:13.987627272 +0000 UTC m=+21.185486426 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.487685 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.487768 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.487793 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.487805 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.487875 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.487887 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.487928 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.488175 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.488192 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.488212 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.488236 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.488399 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.488461 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.488605 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.488859 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.489021 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.489064 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.489095 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.489145 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.489177 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.489369 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.489503 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.489859 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.489903 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.489932 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.489963 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.489995 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.490020 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.490043 4902 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.490143 4902 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.490160 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.490177 4902 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.490194 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.490208 4902 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.490228 4902 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.489681 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.490266 4902 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.490619 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.489824 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 09 13:51:13 crc kubenswrapper[4902]: E1009 13:51:13.490704 4902 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.490740 4902 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: E1009 13:51:13.490789 4902 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-09 13:51:13.990762984 +0000 UTC m=+21.188622048 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.490811 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.490846 4902 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.490868 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.490886 4902 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.490905 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.490931 4902 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.490950 4902 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.490965 4902 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.490981 4902 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.490995 4902 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.491026 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.491051 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.491070 4902 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.491090 4902 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.491109 4902 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.491129 4902 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.491148 4902 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.491167 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.491184 4902 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: 
I1009 13:51:13.491214 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.491245 4902 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.491262 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.491277 4902 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.491295 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.491311 4902 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.491326 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.491342 4902 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.491356 4902 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.491368 4902 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.491379 4902 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.491392 4902 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.491403 4902 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 
13:51:13.491435 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.491450 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.491462 4902 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.491474 4902 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.491486 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.491498 4902 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.491510 4902 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.491522 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.491534 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.491551 4902 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.491563 4902 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.491576 4902 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.491588 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.491601 4902 
reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.491614 4902 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.491626 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.491642 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.491655 4902 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.491666 4902 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.491677 4902 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.491689 4902 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.491700 4902 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.491711 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.491723 4902 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.491734 4902 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.491746 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.491758 4902 reconciler_common.go:293] "Volume detached for 
volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.491769 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.491781 4902 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.491792 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.491804 4902 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.491815 4902 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.491826 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.491838 4902 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.491849 4902 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.491860 4902 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.491873 4902 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.491883 4902 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.491895 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.491906 4902 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" 
(UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.491917 4902 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.491929 4902 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.491942 4902 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.491953 4902 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.491963 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.491977 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.491990 4902 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.492001 4902 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.492013 4902 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.492023 4902 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.492035 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.492047 4902 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.492058 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" 
(UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.492070 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.492080 4902 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.492092 4902 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.492103 4902 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.492114 4902 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.492126 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.492138 4902 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.492149 4902 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.492159 4902 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.492169 4902 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.492181 4902 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.492193 4902 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.492205 4902 reconciler_common.go:293] "Volume 
detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.492219 4902 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.492230 4902 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.492240 4902 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.492251 4902 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.492265 4902 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.492277 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.492287 4902 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.492299 4902 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.492311 4902 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.492322 4902 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.492333 4902 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.492344 4902 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.492363 
4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.492375 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.492386 4902 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.492397 4902 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.492439 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.492454 4902 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.492466 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.492479 4902 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.492490 4902 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.492501 4902 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.492512 4902 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.492526 4902 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.492538 4902 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.492551 4902 reconciler_common.go:293] "Volume detached for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.492562 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.492573 4902 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.492584 4902 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.492595 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.492605 4902 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.492617 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.492628 4902 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.492638 4902 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.492650 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.492661 4902 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.492673 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.492687 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.492697 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" 
(UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.492709 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.492719 4902 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.492732 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.492746 4902 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.492756 4902 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.492772 4902 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.492783 4902 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.492797 4902 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.492809 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.492884 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 09 13:51:13 crc kubenswrapper[4902]: E1009 13:51:13.493023 4902 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 09 13:51:13 crc kubenswrapper[4902]: E1009 13:51:13.493074 4902 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2025-10-09 13:51:13.993055421 +0000 UTC m=+21.190914555 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.493165 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.493298 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.493338 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.493427 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.493810 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.493893 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.495495 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.496042 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.496246 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.496247 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.496515 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.499260 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.499569 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.500067 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.500085 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). 
InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.500728 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.501178 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.501436 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.501670 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.501871 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.502329 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.503435 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.505142 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.506851 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.506958 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.507167 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: E1009 13:51:13.514555 4902 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 09 13:51:13 crc kubenswrapper[4902]: E1009 13:51:13.514613 4902 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 09 13:51:13 crc kubenswrapper[4902]: E1009 13:51:13.514627 4902 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 09 13:51:13 crc kubenswrapper[4902]: E1009 13:51:13.514651 4902 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 09 13:51:13 crc kubenswrapper[4902]: E1009 13:51:13.514699 4902 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-09 13:51:14.014678345 +0000 UTC m=+21.212537679 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 09 13:51:13 crc kubenswrapper[4902]: E1009 13:51:13.514705 4902 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 09 13:51:13 crc kubenswrapper[4902]: E1009 13:51:13.514719 4902 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 09 13:51:13 crc kubenswrapper[4902]: E1009 13:51:13.517488 4902 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-09 13:51:14.017470887 +0000 UTC m=+21.215329951 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.517629 4902 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4cea21e6-d0ed-4fae-b4ac-38001260cc01\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T13:50:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T13:50:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T13:50:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3eade4433c62920ab3fd348633aee95ebe54a5f547b05898dbfeb09da6216ff9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T13:50:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62c6e61d996fc57028ccd9f85284b9cee1b860614daa4255076c11de63811e07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T13:50:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d34afa6885744750f7be2aca8cac77dd920a434dfd577bbbcd961a4c45595a1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T13:50:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d565a380c28c84e49e3d59625b61ff4c551c11ee4b137b1447005a1127b527cf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T13:50:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T13:50:53Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.519730 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.519808 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.520044 4902 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.520115 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.520204 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.520377 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.520512 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.520530 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.520586 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.521435 4902 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.523692 4902 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.523912 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.524315 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.525556 4902 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.527306 4902 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.527694 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.527822 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.528258 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.535080 4902 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.535985 4902 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.539194 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.539705 4902 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.542130 4902 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.543971 4902 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.544956 4902 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.548237 4902 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.549096 4902 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.554508 4902 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4254914b-a289-4953-a5c4-4a33bf1eda40\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T13:50:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T13:50:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T13:50:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98dc7fe9418570351647255f78a18bc6f72da3a8edb045968b3adcd91de76eef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T13:50:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\
\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4affb65427006cf0c0f0f13bd35be9ae8fd913a866fef32f96db408559b1709\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T13:50:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a02c67b01c587d22709968939f075c03cade5a06e955b7930826811cd10452dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T13:50:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fea213c0d33a5605cbec42814ccf493a61a06690756ff43e87adf27556456b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T13:50:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55f958eb865e1564197d77e4d3114018876a8dc0a94b1a18ef9c223b08372b20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T13:50:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7cf41
ee52bad91ad430c8c1b6c7a32a0484acc01e0d4f5f33dc8abdd36dbc83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7cf41ee52bad91ad430c8c1b6c7a32a0484acc01e0d4f5f33dc8abdd36dbc83\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T13:50:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T13:50:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6c2214871f6eecfacd86a956b2bb962815374eb365e6dc4ff4c053d716ac33b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6c2214871f6eecfacd86a956b2bb962815374eb365e6dc4ff4c053d716ac33b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T13:50:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T13:50:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7be9fb4fa25ebe19bbc13a915cb004341bb51547fe20a014c59da04431e25484\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7be9fb4fa25ebe19bbc13a915cb004341bb51547fe20a014c59da04431e25484\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T13:50:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T13:50:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T13:50:53Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.555616 4902 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.561099 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.561910 4902 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.565741 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.571422 4902 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.582318 4902 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.582991 4902 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.583398 4902 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.584492 4902 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.585027 4902 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.588713 4902 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.589441 4902 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.589884 4902 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.590945 4902 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.591323 4902 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.592379 4902 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" 
path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.593179 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.593241 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.593302 4902 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.593314 4902 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.593323 4902 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.593333 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.593342 4902 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.593351 4902 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.593360 4902 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.593369 4902 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.593377 4902 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.593386 4902 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" 
DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.593395 4902 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.593403 4902 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.593427 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.593443 4902 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.593451 4902 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.593460 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.593469 4902 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.593477 4902 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.593485 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.593494 4902 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.593502 4902 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.593510 4902 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.593521 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node 
\"crc\" DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.593530 4902 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.593539 4902 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.593549 4902 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.593558 4902 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.593597 4902 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.593608 4902 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.593617 4902 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.593626 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.593636 4902 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.593646 4902 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.593654 4902 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.593683 4902 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.593702 4902 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: 
I1009 13:51:13.593711 4902 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.593721 4902 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.593730 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.593738 4902 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.593748 4902 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.594395 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.594617 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.612893 4902 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:13Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.614312 4902 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.614788 4902 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.626965 4902 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.627596 4902 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.628546 4902 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.628649 4902 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.630261 4902 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.645668 4902 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.646292 4902 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.648002 4902 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" 
Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.648824 4902 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.649737 4902 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.650333 4902 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.651350 4902 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.651786 4902 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.652751 4902 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.653743 4902 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.654325 4902 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.654773 4902 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.655737 4902 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.656599 4902 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.657372 4902 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.657866 4902 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.659721 4902 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.662060 4902 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.662874 4902 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.663979 4902 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.664673 4902 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.665116 4902 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 
13:51:13.666152 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-4d229"] Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.666527 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-4d229" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.672309 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.676752 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.676935 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.677072 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Oct 09 13:51:13 crc kubenswrapper[4902]: E1009 13:51:13.702722 4902 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-controller-manager-crc\" already exists" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 09 13:51:13 crc kubenswrapper[4902]: E1009 13:51:13.706202 4902 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"etcd-crc\" already exists" pod="openshift-etcd/etcd-crc" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.711699 4902 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.733562 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.742998 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.747628 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.747831 4902 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4254914b-a289-4953-a5c4-4a33bf1eda40\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T13:50:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T13:50:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T13:50:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98dc7fe9418570351647255f78a18bc6f72da3a8edb045968b3adcd91de76eef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T13:50:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4affb65427006cf0c0f0f13bd35be9ae8fd913a866fef32f9
6db408559b1709\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T13:50:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a02c67b01c587d22709968939f075c03cade5a06e955b7930826811cd10452dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T13:50:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fea213c0d33a5605cbec42814ccf493a61a06690756ff43e87adf27556456b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T13:50:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55f958eb865e1564197d77e4d3114018876a8dc0a94b1a18ef9c223b08372b20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T13:50:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7cf41ee52bad91ad430c8c1b6c7a32a0484acc01e0d4f5f33dc8abdd36dbc83\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7cf41ee52bad91ad430c8c1b6c7a32a0484acc01e0d4f5f33dc8abdd36dbc83\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T13:50:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T13:50:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6c2214871f6eecfacd86a956b2bb962815374eb365e6dc4ff4c053d716ac33b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6c2214871f6eecfacd86a956b2bb962815374eb365e6dc4ff4c053d716ac33b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T13:50:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T13:50:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7be9fb4fa25ebe19bbc13a915cb004341bb51547fe20a014c59da04431e25484\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7be9fb4fa25ebe19bbc13a915cb004341bb51547fe20a014c59da04431e25484\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T13:50:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T13:50:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T13:50:53Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 13:51:13 crc kubenswrapper[4902]: W1009 13:51:13.750992 4902 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37a5e44f_9a88_4405_be8a_b645485e7312.slice/crio-c3c4d1a88fcda6aa3617c9b33fe5d94d839ce41581677077fb46155da988aaa9 WatchSource:0}: Error finding container c3c4d1a88fcda6aa3617c9b33fe5d94d839ce41581677077fb46155da988aaa9: Status 404 returned error can't find the container with id 
c3c4d1a88fcda6aa3617c9b33fe5d94d839ce41581677077fb46155da988aaa9 Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.775218 4902 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.787997 4902 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:13Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.798688 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r4jdd\" (UniqueName: \"kubernetes.io/projected/51bd475b-561d-4140-bea0-6c808e7820e6-kube-api-access-r4jdd\") pod \"node-ca-4d229\" (UID: \"51bd475b-561d-4140-bea0-6c808e7820e6\") " pod="openshift-image-registry/node-ca-4d229" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.798734 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/51bd475b-561d-4140-bea0-6c808e7820e6-serviceca\") pod \"node-ca-4d229\" (UID: \"51bd475b-561d-4140-bea0-6c808e7820e6\") " pod="openshift-image-registry/node-ca-4d229" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.798788 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/51bd475b-561d-4140-bea0-6c808e7820e6-host\") pod 
\"node-ca-4d229\" (UID: \"51bd475b-561d-4140-bea0-6c808e7820e6\") " pod="openshift-image-registry/node-ca-4d229" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.805206 4902 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4cea21e6-d0ed-4fae-b4ac-38001260cc01\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T13:50:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T13:50:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T13:50:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3eade4433c62920ab3fd348633aee95ebe54a5f547b05898dbfeb09da6216ff9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T13:50:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62c6e61d996fc57028ccd9f85284b9cee1b860614daa4255076c11de63811e07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T13:50:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d34afa6885744750f7be2aca8cac77dd920a434dfd577bbbcd961a4c45595a1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T13:50:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernete
s/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d565a380c28c84e49e3d59625b61ff4c551c11ee4b137b1447005a1127b527cf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T13:50:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T13:50:53Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.826451 4902 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.840052 4902 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.848649 4902 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.872891 4902 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:13Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.881334 4902 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4d229" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"51bd475b-561d-4140-bea0-6c808e7820e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:13Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:13Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4jdd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T13:51:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4d229\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": 
dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.897384 4902 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4cea21e6-d0ed-4fae-b4ac-38001260cc01\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T13:50:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T13:50:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T13:50:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3eade4433c62920ab3fd348633aee95ebe54a5f547b05898dbfeb09da6216ff9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T13:50:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62c6e61d996fc57028ccd9f85284b9cee1b860614daa4255076c11de63811e07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T13:50:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d34afa6885744750f7be2aca8cac77dd920a434dfd577bbbcd961a4c45595a1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T13:50:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\
"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d565a380c28c84e49e3d59625b61ff4c551c11ee4b137b1447005a1127b527cf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T13:50:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T13:50:53Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.899517 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/51bd475b-561d-4140-bea0-6c808e7820e6-host\") pod \"node-ca-4d229\" (UID: \"51bd475b-561d-4140-bea0-6c808e7820e6\") " pod="openshift-image-registry/node-ca-4d229" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.899568 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r4jdd\" (UniqueName: \"kubernetes.io/projected/51bd475b-561d-4140-bea0-6c808e7820e6-kube-api-access-r4jdd\") pod \"node-ca-4d229\" (UID: \"51bd475b-561d-4140-bea0-6c808e7820e6\") " pod="openshift-image-registry/node-ca-4d229" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.899589 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/51bd475b-561d-4140-bea0-6c808e7820e6-serviceca\") pod \"node-ca-4d229\" (UID: \"51bd475b-561d-4140-bea0-6c808e7820e6\") " pod="openshift-image-registry/node-ca-4d229" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.899629 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/51bd475b-561d-4140-bea0-6c808e7820e6-host\") pod \"node-ca-4d229\" (UID: \"51bd475b-561d-4140-bea0-6c808e7820e6\") " pod="openshift-image-registry/node-ca-4d229" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.900751 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/51bd475b-561d-4140-bea0-6c808e7820e6-serviceca\") pod \"node-ca-4d229\" (UID: \"51bd475b-561d-4140-bea0-6c808e7820e6\") " pod="openshift-image-registry/node-ca-4d229" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.913227 4902 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4254914b-a289-4953-a5c4-4a33bf1eda40\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T13:50:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T13:50:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T13:50:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98dc7fe9418570351647255f78a18bc6f72da3a8edb045968b3adcd91de76eef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T13:50:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4affb65427006cf0c0f0f13bd35be9ae8fd913a866fef32f96db408559b1709\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T13:50:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a02c67b01c587d22709968939f075c03cade5a06e955b7930826811cd10452dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T13:50:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fea213c0d33a5605cbec42814ccf493a61a066
90756ff43e87adf27556456b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T13:50:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55f958eb865e1564197d77e4d3114018876a8dc0a94b1a18ef9c223b08372b20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T13:50:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7cf41ee52bad91ad430c8c1b6c7a32a0484acc01e0d4f5f33dc8abdd36dbc83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7cf41ee52bad91ad430c8c1b6c7a32a0484acc01e0d4f5f33dc8abdd36dbc83\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T13:50:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T13:50:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6c2214871f6eecfacd86a956b2bb962815374eb365e6dc4ff4c053d716ac33b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6c2214871f6eecfacd86a956b2bb962815374eb365e6dc4ff4c053d716ac33b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T13:50:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T13:50:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7be9fb4fa25ebe19bbc13a915cb004341bb51547fe20a014c59da04431e25484\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7be9fb4fa25ebe19bbc13a915cb004341bb51547fe20a014c59da04431e25484\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T13:50:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T13:50:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T13:50:53Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.917931 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r4jdd\" (UniqueName: \"kubernetes.io/projected/51bd475b-561d-4140-bea0-6c808e7820e6-kube-api-access-r4jdd\") pod \"node-ca-4d229\" (UID: \"51bd475b-561d-4140-bea0-6c808e7820e6\") " pod="openshift-image-registry/node-ca-4d229" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.925927 4902 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.936563 4902 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:13Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.950184 4902 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.977350 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-2cmrx"] Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.977682 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-2cmrx" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.979312 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.979475 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.980917 4902 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.981137 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.988539 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-4d229" Oct 09 13:51:13 crc kubenswrapper[4902]: I1009 13:51:13.998997 4902 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.000496 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.000643 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.000679 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 13:51:14 crc kubenswrapper[4902]: E1009 13:51:14.000798 4902 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 09 13:51:14 crc kubenswrapper[4902]: E1009 13:51:14.000852 4902 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-09 13:51:15.000831419 +0000 UTC m=+22.198690493 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 09 13:51:14 crc kubenswrapper[4902]: E1009 13:51:14.000921 4902 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 13:51:15.000913292 +0000 UTC m=+22.198772356 (durationBeforeRetry 1s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 13:51:14 crc kubenswrapper[4902]: E1009 13:51:14.000968 4902 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 09 13:51:14 crc kubenswrapper[4902]: E1009 13:51:14.000996 4902 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-09 13:51:15.000985334 +0000 UTC m=+22.198844388 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.010804 4902 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:13Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.018863 4902 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4d229" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"51bd475b-561d-4140-bea0-6c808e7820e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:13Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:13Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4jdd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T13:51:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4d229\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": 
dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.031924 4902 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4cea21e6-d0ed-4fae-b4ac-38001260cc01\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T13:50:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T13:50:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T13:50:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3eade4433c62920ab3fd348633aee95ebe54a5f547b05898dbfeb09da6216ff9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T13:50:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62c6e61d996fc57028ccd9f85284b9cee1b860614daa4255076c11de63811e07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T13:50:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d34afa6885744750f7be2aca8cac77dd920a434dfd577bbbcd961a4c45595a1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T13:50:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\
"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d565a380c28c84e49e3d59625b61ff4c551c11ee4b137b1447005a1127b527cf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T13:50:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T13:50:53Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.049193 4902 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4254914b-a289-4953-a5c4-4a33bf1eda40\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T13:50:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T13:50:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T13:50:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98dc7fe9418570351647255f78a18bc6f72da3a8edb045968b3adcd91de76eef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T13:50:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]
},{\\\"containerID\\\":\\\"cri-o://c4affb65427006cf0c0f0f13bd35be9ae8fd913a866fef32f96db408559b1709\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T13:50:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a02c67b01c587d22709968939f075c03cade5a06e955b7930826811cd10452dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T13:50:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fea213c0d33a5605cbec42814ccf493a61a06690756ff43e87adf27556456b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T13:50:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55f958eb865e1564197d77e4d3114018876a8dc0a94b1a18ef9c223b08372b20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T13:50:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7cf41ee52bad91ad430c8c1b6c7
a32a0484acc01e0d4f5f33dc8abdd36dbc83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7cf41ee52bad91ad430c8c1b6c7a32a0484acc01e0d4f5f33dc8abdd36dbc83\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T13:50:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T13:50:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6c2214871f6eecfacd86a956b2bb962815374eb365e6dc4ff4c053d716ac33b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6c2214871f6eecfacd86a956b2bb962815374eb365e6dc4ff4c053d716ac33b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T13:50:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T13:50:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7be9fb4fa25ebe19bbc13a915cb004341bb51547fe20a014c59da04431e25484\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7be9fb4fa25ebe19bbc13a915cb004341bb51547fe20a014c59da04431e25484\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T13:50:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T13:50:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T13:50:53Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.066041 4902 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.076898 4902 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:13Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.085706 4902 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:13Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.100034 4902 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4cea21e6-d0ed-4fae-b4ac-38001260cc01\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T13:50:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T13:50:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T13:50:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3eade4433c62920ab3fd348633aee95ebe54a5f547b05898dbfeb09da6216ff9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T13:50:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62c6e61d996fc57028ccd9f85284b9cee1b860614daa4255076c11de63811e07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T13:50:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d34afa6885744750f7be2aca8cac77dd920a434dfd577bbbcd961a4c45595a1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T13:50:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d565a380c28c84e49e3d59625b61ff4c551c11ee4b137b1447005a1127b527cf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T13:50:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T13:50:53Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.101697 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/b8abd493-cf67-48c1-9bdb-b8e4f16e0b1f-hosts-file\") pod \"node-resolver-2cmrx\" (UID: \"b8abd493-cf67-48c1-9bdb-b8e4f16e0b1f\") " pod="openshift-dns/node-resolver-2cmrx" Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.101757 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.101789 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2b276\" (UniqueName: \"kubernetes.io/projected/b8abd493-cf67-48c1-9bdb-b8e4f16e0b1f-kube-api-access-2b276\") pod \"node-resolver-2cmrx\" (UID: \"b8abd493-cf67-48c1-9bdb-b8e4f16e0b1f\") " pod="openshift-dns/node-resolver-2cmrx" Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.101824 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 13:51:14 crc kubenswrapper[4902]: E1009 13:51:14.101870 4902 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 09 13:51:14 crc kubenswrapper[4902]: E1009 13:51:14.101886 4902 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 09 13:51:14 crc kubenswrapper[4902]: E1009 13:51:14.101985 4902 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod 
openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 09 13:51:14 crc kubenswrapper[4902]: E1009 13:51:14.102026 4902 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-09 13:51:15.102013154 +0000 UTC m=+22.299872218 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 09 13:51:14 crc kubenswrapper[4902]: E1009 13:51:14.101924 4902 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 09 13:51:14 crc kubenswrapper[4902]: E1009 13:51:14.102057 4902 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 09 13:51:14 crc kubenswrapper[4902]: E1009 13:51:14.102064 4902 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 09 13:51:14 crc kubenswrapper[4902]: E1009 13:51:14.102096 4902 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-09 13:51:15.102089826 +0000 UTC m=+22.299948890 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.120044 4902 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4254914b-a289-4953-a5c4-4a33bf1eda40\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T13:50:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T13:50:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T13:50:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98dc7fe9418570351647255f78a18bc6f72da3a8edb045968b3adcd91de76eef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T13:50:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4affb65427006cf0c0f0f13bd35be9ae8fd913a866fef32f96db408559b1709\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T13:50:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a02c67b01c587d22709968939f075c03cade5a06e955b7930826811cd10452dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\
\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T13:50:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fea213c0d33a5605cbec42814ccf493a61a06690756ff43e87adf27556456b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T13:50:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55f958eb865e1564197d77e4d3114018876a8dc0a94b1a18ef9c223b08372b20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T13:50:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7cf41ee52bad91ad430c8c1b6c7a32a0484acc01e0d4f5f33dc8abdd36dbc83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7cf41ee52bad91ad430c8c1b6c7a32a0484acc01e0d4f5f33dc8abdd36dbc83\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T13:50:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T13:50:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6c2214871f6eecfacd86a956b2bb962815374eb365e6dc4ff4c053d716ac33b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6c2214871f6eecfacd86a956b2bb962815374eb365e6dc4ff4c053d716ac33b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T13:50:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T13:50:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7be9fb4fa25ebe19bbc13a915cb004341bb51547fe20a014c59da04431e25484\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7be9fb4fa25ebe19bbc13a915cb004341bb51547fe20a014c59da04431e25484\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T13:50:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T13:50:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T13:50:53Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.130568 4902 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.138051 4902 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4d229" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"51bd475b-561d-4140-bea0-6c808e7820e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:13Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:13Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4jdd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T13:51:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4d229\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: 
connection refused" Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.147438 4902 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2cmrx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b8abd493-cf67-48c1-9bdb-b8e4f16e0b1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:13Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:13Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2b276\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T13:51:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2cmrx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.170001 4902 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.183752 4902 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.202304 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/b8abd493-cf67-48c1-9bdb-b8e4f16e0b1f-hosts-file\") pod \"node-resolver-2cmrx\" (UID: \"b8abd493-cf67-48c1-9bdb-b8e4f16e0b1f\") " pod="openshift-dns/node-resolver-2cmrx" Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.202348 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2b276\" (UniqueName: \"kubernetes.io/projected/b8abd493-cf67-48c1-9bdb-b8e4f16e0b1f-kube-api-access-2b276\") pod \"node-resolver-2cmrx\" (UID: \"b8abd493-cf67-48c1-9bdb-b8e4f16e0b1f\") " pod="openshift-dns/node-resolver-2cmrx" Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.202499 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/b8abd493-cf67-48c1-9bdb-b8e4f16e0b1f-hosts-file\") pod \"node-resolver-2cmrx\" (UID: \"b8abd493-cf67-48c1-9bdb-b8e4f16e0b1f\") " pod="openshift-dns/node-resolver-2cmrx" Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.207271 4902 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.219600 4902 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:13Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.313887 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2b276\" (UniqueName: \"kubernetes.io/projected/b8abd493-cf67-48c1-9bdb-b8e4f16e0b1f-kube-api-access-2b276\") pod \"node-resolver-2cmrx\" (UID: \"b8abd493-cf67-48c1-9bdb-b8e4f16e0b1f\") " pod="openshift-dns/node-resolver-2cmrx" Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.339471 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-fcz75"] Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.339827 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-gbt7s"] Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.339976 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-fcz75" Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.340044 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-4wnpl"] Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.340460 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.340690 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-4wnpl" Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.341559 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.341786 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.341819 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.342092 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.342193 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.346249 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.346371 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.346429 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.346577 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.346673 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.348030 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.349602 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.376331 4902 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.389673 4902 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.409109 4902 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.419374 4902 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fcz75" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"593745e8-10e2-486a-8a32-9e2dc766bc55\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-srhfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T13:51:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fcz75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.431125 4902 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:13Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.445808 4902 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4cea21e6-d0ed-4fae-b4ac-38001260cc01\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T13:50:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T13:50:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T13:50:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3eade4433c62920ab3fd348633aee95ebe54a5f547b05898dbfeb09da6216ff9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T13:50:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62c6e61d996fc57028ccd9f85284b9cee1b860614daa4255076c11de63811e07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T13:50:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d34afa6885744750f7be2aca8cac77dd920a434dfd577bbbcd961a4c45595a1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T13:50:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d565a380c28c84e49e3d59625b61ff4c551c11ee4b137b1447005a1127b527cf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T13:50:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T13:50:53Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.453985 4902 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4d229" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"51bd475b-561d-4140-bea0-6c808e7820e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:13Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4jdd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T13:51:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4d229\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.464354 4902 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2cmrx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b8abd493-cf67-48c1-9bdb-b8e4f16e0b1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:13Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2b276\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T13:51:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2cmrx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.477533 4902 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:13Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.495245 4902 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4254914b-a289-4953-a5c4-4a33bf1eda40\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T13:50:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T13:50:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T13:50:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98dc7fe9418570351647255f78a18bc6f72da3a8edb045968b3adcd91de76eef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T13:50:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4affb65427006cf0c0f0f13bd35be9ae8fd913a866fef32f96db408559b1709\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2025-10-09T13:50:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a02c67b01c587d22709968939f075c03cade5a06e955b7930826811cd10452dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T13:50:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fea213c0d33a5605cbec42814ccf493a61a06690756ff43e87adf27556456b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T13:50:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55f958eb865e1564197d77e4d3114018876a8dc0a94b1a18ef9c223b08372b20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T13:50:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7cf41ee52bad91ad430c8c1b6c7a32a0484acc01e0d4f5f33dc8abdd36dbc83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7cf41ee52bad91ad430c8c1b
6c7a32a0484acc01e0d4f5f33dc8abdd36dbc83\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T13:50:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T13:50:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6c2214871f6eecfacd86a956b2bb962815374eb365e6dc4ff4c053d716ac33b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6c2214871f6eecfacd86a956b2bb962815374eb365e6dc4ff4c053d716ac33b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T13:50:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T13:50:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7be9fb4fa25ebe19bbc13a915cb004341bb51547fe20a014c59da04431e25484\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7be9fb4fa25ebe19bbc13a915cb004341bb51547fe20a014c59da04431e25484\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T13:50:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T13:50:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T13:50:53Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.504246 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/593745e8-10e2-486a-8a32-9e2dc766bc55-hostroot\") pod \"multus-fcz75\" (UID: \"593745e8-10e2-486a-8a32-9e2dc766bc55\") " pod="openshift-multus/multus-fcz75" Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.504296 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/593745e8-10e2-486a-8a32-9e2dc766bc55-host-var-lib-cni-bin\") pod \"multus-fcz75\" (UID: \"593745e8-10e2-486a-8a32-9e2dc766bc55\") " pod="openshift-multus/multus-fcz75" Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.504365 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/6cfbac91-e798-4e5e-9f3c-f454ea6f457e-mcd-auth-proxy-config\") pod \"machine-config-daemon-gbt7s\" (UID: \"6cfbac91-e798-4e5e-9f3c-f454ea6f457e\") " pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.504389 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6csxv\" (UniqueName: \"kubernetes.io/projected/6cfbac91-e798-4e5e-9f3c-f454ea6f457e-kube-api-access-6csxv\") pod \"machine-config-daemon-gbt7s\" (UID: \"6cfbac91-e798-4e5e-9f3c-f454ea6f457e\") " pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.504437 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/795e40f3-9c75-4b61-9d94-ca0818875b9f-cni-binary-copy\") pod \"multus-additional-cni-plugins-4wnpl\" (UID: \"795e40f3-9c75-4b61-9d94-ca0818875b9f\") " pod="openshift-multus/multus-additional-cni-plugins-4wnpl" Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.504465 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/593745e8-10e2-486a-8a32-9e2dc766bc55-host-var-lib-cni-multus\") pod \"multus-fcz75\" (UID: \"593745e8-10e2-486a-8a32-9e2dc766bc55\") " pod="openshift-multus/multus-fcz75" Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.504522 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/593745e8-10e2-486a-8a32-9e2dc766bc55-host-run-netns\") pod \"multus-fcz75\" (UID: \"593745e8-10e2-486a-8a32-9e2dc766bc55\") " pod="openshift-multus/multus-fcz75" Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.504553 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6cfbac91-e798-4e5e-9f3c-f454ea6f457e-proxy-tls\") pod \"machine-config-daemon-gbt7s\" (UID: \"6cfbac91-e798-4e5e-9f3c-f454ea6f457e\") " pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.504839 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/593745e8-10e2-486a-8a32-9e2dc766bc55-cnibin\") pod \"multus-fcz75\" (UID: \"593745e8-10e2-486a-8a32-9e2dc766bc55\") " pod="openshift-multus/multus-fcz75" Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.504907 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-srhfd\" (UniqueName: \"kubernetes.io/projected/593745e8-10e2-486a-8a32-9e2dc766bc55-kube-api-access-srhfd\") pod \"multus-fcz75\" (UID: \"593745e8-10e2-486a-8a32-9e2dc766bc55\") " pod="openshift-multus/multus-fcz75" Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.504937 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/795e40f3-9c75-4b61-9d94-ca0818875b9f-os-release\") pod \"multus-additional-cni-plugins-4wnpl\" (UID: \"795e40f3-9c75-4b61-9d94-ca0818875b9f\") " pod="openshift-multus/multus-additional-cni-plugins-4wnpl" Oct 09 
13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.504966 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/593745e8-10e2-486a-8a32-9e2dc766bc55-os-release\") pod \"multus-fcz75\" (UID: \"593745e8-10e2-486a-8a32-9e2dc766bc55\") " pod="openshift-multus/multus-fcz75" Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.504994 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/593745e8-10e2-486a-8a32-9e2dc766bc55-multus-cni-dir\") pod \"multus-fcz75\" (UID: \"593745e8-10e2-486a-8a32-9e2dc766bc55\") " pod="openshift-multus/multus-fcz75" Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.505015 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/593745e8-10e2-486a-8a32-9e2dc766bc55-cni-binary-copy\") pod \"multus-fcz75\" (UID: \"593745e8-10e2-486a-8a32-9e2dc766bc55\") " pod="openshift-multus/multus-fcz75" Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.505036 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/593745e8-10e2-486a-8a32-9e2dc766bc55-etc-kubernetes\") pod \"multus-fcz75\" (UID: \"593745e8-10e2-486a-8a32-9e2dc766bc55\") " pod="openshift-multus/multus-fcz75" Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.505055 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/795e40f3-9c75-4b61-9d94-ca0818875b9f-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-4wnpl\" (UID: \"795e40f3-9c75-4b61-9d94-ca0818875b9f\") " pod="openshift-multus/multus-additional-cni-plugins-4wnpl" Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.505076 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/593745e8-10e2-486a-8a32-9e2dc766bc55-host-var-lib-kubelet\") pod \"multus-fcz75\" (UID: \"593745e8-10e2-486a-8a32-9e2dc766bc55\") " pod="openshift-multus/multus-fcz75" Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.505092 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/795e40f3-9c75-4b61-9d94-ca0818875b9f-cnibin\") pod \"multus-additional-cni-plugins-4wnpl\" (UID: \"795e40f3-9c75-4b61-9d94-ca0818875b9f\") " pod="openshift-multus/multus-additional-cni-plugins-4wnpl" Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.505205 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/795e40f3-9c75-4b61-9d94-ca0818875b9f-tuning-conf-dir\") pod \"multus-additional-cni-plugins-4wnpl\" (UID: \"795e40f3-9c75-4b61-9d94-ca0818875b9f\") " pod="openshift-multus/multus-additional-cni-plugins-4wnpl" Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.505297 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/593745e8-10e2-486a-8a32-9e2dc766bc55-system-cni-dir\") pod \"multus-fcz75\" (UID: \"593745e8-10e2-486a-8a32-9e2dc766bc55\") " 
pod="openshift-multus/multus-fcz75" Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.505334 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/593745e8-10e2-486a-8a32-9e2dc766bc55-multus-socket-dir-parent\") pod \"multus-fcz75\" (UID: \"593745e8-10e2-486a-8a32-9e2dc766bc55\") " pod="openshift-multus/multus-fcz75" Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.505356 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/593745e8-10e2-486a-8a32-9e2dc766bc55-host-run-multus-certs\") pod \"multus-fcz75\" (UID: \"593745e8-10e2-486a-8a32-9e2dc766bc55\") " pod="openshift-multus/multus-fcz75" Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.505371 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/795e40f3-9c75-4b61-9d94-ca0818875b9f-system-cni-dir\") pod \"multus-additional-cni-plugins-4wnpl\" (UID: \"795e40f3-9c75-4b61-9d94-ca0818875b9f\") " pod="openshift-multus/multus-additional-cni-plugins-4wnpl" Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.505388 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/593745e8-10e2-486a-8a32-9e2dc766bc55-multus-daemon-config\") pod \"multus-fcz75\" (UID: \"593745e8-10e2-486a-8a32-9e2dc766bc55\") " pod="openshift-multus/multus-fcz75" Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.505404 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-crq9n\" (UniqueName: \"kubernetes.io/projected/795e40f3-9c75-4b61-9d94-ca0818875b9f-kube-api-access-crq9n\") pod \"multus-additional-cni-plugins-4wnpl\" (UID: \"795e40f3-9c75-4b61-9d94-ca0818875b9f\") " pod="openshift-multus/multus-additional-cni-plugins-4wnpl" Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.505452 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/593745e8-10e2-486a-8a32-9e2dc766bc55-host-run-k8s-cni-cncf-io\") pod \"multus-fcz75\" (UID: \"593745e8-10e2-486a-8a32-9e2dc766bc55\") " pod="openshift-multus/multus-fcz75" Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.505468 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/593745e8-10e2-486a-8a32-9e2dc766bc55-multus-conf-dir\") pod \"multus-fcz75\" (UID: \"593745e8-10e2-486a-8a32-9e2dc766bc55\") " pod="openshift-multus/multus-fcz75" Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.505492 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/6cfbac91-e798-4e5e-9f3c-f454ea6f457e-rootfs\") pod \"machine-config-daemon-gbt7s\" (UID: \"6cfbac91-e798-4e5e-9f3c-f454ea6f457e\") " pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.514729 4902 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T13:51:14Z is after 2025-08-24T17:21:41Z" Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.540540 4902 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4254914b-a289-4953-a5c4-4a33bf1eda40\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T13:50:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T13:50:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T13:50:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98dc7fe9418570351647255f78a18bc6f72da3a8edb045968b3adcd91de76eef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T13:50:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4affb65427006cf0c0f0f13bd35be9ae8fd913a866fef32f96db408559b1709\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T13:50:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a02c67b01c587d22709968939f075c03cade5a06e955b7930826811cd10452dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T13:50:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fea213c0d33a5605cbec42814ccf493a61a066
90756ff43e87adf27556456b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T13:50:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55f958eb865e1564197d77e4d3114018876a8dc0a94b1a18ef9c223b08372b20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T13:50:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7cf41ee52bad91ad430c8c1b6c7a32a0484acc01e0d4f5f33dc8abdd36dbc83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7cf41ee52bad91ad430c8c1b6c7a32a0484acc01e0d4f5f33dc8abdd36dbc83\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T13:50:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T13:50:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6c2214871f6eecfacd86a956b2bb962815374eb365e6dc4ff4c053d716ac33b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6c2214871f6eecfacd86a956b2bb962815374eb365e6dc4ff4c053d716ac33b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T13:50:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T13:50:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7be9fb4fa25ebe19bbc13a915cb004341bb51547fe20a014c59da04431e25484\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7be9fb4fa25ebe19bbc13a915cb004341bb51547fe20a014c59da04431e25484\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T13:50:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T13:50:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T13:50:53Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T13:51:14Z is after 2025-08-24T17:21:41Z" Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.554848 4902 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T13:51:14Z is after 2025-08-24T17:21:41Z" Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.570529 4902 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T13:51:14Z is after 2025-08-24T17:21:41Z" Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.581670 4902 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T13:51:14Z is after 2025-08-24T17:21:41Z" Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.593216 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-2cmrx" Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.595899 4902 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T13:51:14Z is after 2025-08-24T17:21:41Z" Oct 09 13:51:14 crc kubenswrapper[4902]: W1009 13:51:14.605104 4902 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb8abd493_cf67_48c1_9bdb_b8e4f16e0b1f.slice/crio-fe12c38b92e9adb1531eb499a544791d6355278c38a25697fa30cea090517902 WatchSource:0}: Error finding container fe12c38b92e9adb1531eb499a544791d6355278c38a25697fa30cea090517902: Status 404 returned error can't find the container with id fe12c38b92e9adb1531eb499a544791d6355278c38a25697fa30cea090517902 Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.606382 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/795e40f3-9c75-4b61-9d94-ca0818875b9f-tuning-conf-dir\") pod \"multus-additional-cni-plugins-4wnpl\" (UID: \"795e40f3-9c75-4b61-9d94-ca0818875b9f\") " pod="openshift-multus/multus-additional-cni-plugins-4wnpl" Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.606442 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/593745e8-10e2-486a-8a32-9e2dc766bc55-system-cni-dir\") pod \"multus-fcz75\" (UID: \"593745e8-10e2-486a-8a32-9e2dc766bc55\") " pod="openshift-multus/multus-fcz75" Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.606467 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/593745e8-10e2-486a-8a32-9e2dc766bc55-multus-socket-dir-parent\") pod \"multus-fcz75\" (UID: \"593745e8-10e2-486a-8a32-9e2dc766bc55\") " pod="openshift-multus/multus-fcz75" Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.606488 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/593745e8-10e2-486a-8a32-9e2dc766bc55-host-run-multus-certs\") pod \"multus-fcz75\" (UID: \"593745e8-10e2-486a-8a32-9e2dc766bc55\") " pod="openshift-multus/multus-fcz75" Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.606509 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/795e40f3-9c75-4b61-9d94-ca0818875b9f-system-cni-dir\") pod \"multus-additional-cni-plugins-4wnpl\" (UID: \"795e40f3-9c75-4b61-9d94-ca0818875b9f\") " pod="openshift-multus/multus-additional-cni-plugins-4wnpl" Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.606530 4902 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/593745e8-10e2-486a-8a32-9e2dc766bc55-multus-conf-dir\") pod \"multus-fcz75\" (UID: \"593745e8-10e2-486a-8a32-9e2dc766bc55\") " pod="openshift-multus/multus-fcz75" Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.606552 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/593745e8-10e2-486a-8a32-9e2dc766bc55-multus-daemon-config\") pod \"multus-fcz75\" (UID: \"593745e8-10e2-486a-8a32-9e2dc766bc55\") " pod="openshift-multus/multus-fcz75" Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.606573 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-crq9n\" (UniqueName: \"kubernetes.io/projected/795e40f3-9c75-4b61-9d94-ca0818875b9f-kube-api-access-crq9n\") pod \"multus-additional-cni-plugins-4wnpl\" (UID: \"795e40f3-9c75-4b61-9d94-ca0818875b9f\") " pod="openshift-multus/multus-additional-cni-plugins-4wnpl" Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.606598 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/593745e8-10e2-486a-8a32-9e2dc766bc55-host-run-k8s-cni-cncf-io\") pod \"multus-fcz75\" (UID: \"593745e8-10e2-486a-8a32-9e2dc766bc55\") " pod="openshift-multus/multus-fcz75" Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.606629 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/6cfbac91-e798-4e5e-9f3c-f454ea6f457e-rootfs\") pod \"machine-config-daemon-gbt7s\" (UID: \"6cfbac91-e798-4e5e-9f3c-f454ea6f457e\") " pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.606650 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/593745e8-10e2-486a-8a32-9e2dc766bc55-hostroot\") pod \"multus-fcz75\" (UID: \"593745e8-10e2-486a-8a32-9e2dc766bc55\") " pod="openshift-multus/multus-fcz75" Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.606675 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/593745e8-10e2-486a-8a32-9e2dc766bc55-host-var-lib-cni-bin\") pod \"multus-fcz75\" (UID: \"593745e8-10e2-486a-8a32-9e2dc766bc55\") " pod="openshift-multus/multus-fcz75" Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.606698 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/6cfbac91-e798-4e5e-9f3c-f454ea6f457e-mcd-auth-proxy-config\") pod \"machine-config-daemon-gbt7s\" (UID: \"6cfbac91-e798-4e5e-9f3c-f454ea6f457e\") " pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.606720 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6csxv\" (UniqueName: \"kubernetes.io/projected/6cfbac91-e798-4e5e-9f3c-f454ea6f457e-kube-api-access-6csxv\") pod \"machine-config-daemon-gbt7s\" (UID: \"6cfbac91-e798-4e5e-9f3c-f454ea6f457e\") " pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.606742 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/795e40f3-9c75-4b61-9d94-ca0818875b9f-cni-binary-copy\") pod \"multus-additional-cni-plugins-4wnpl\" (UID: \"795e40f3-9c75-4b61-9d94-ca0818875b9f\") " pod="openshift-multus/multus-additional-cni-plugins-4wnpl" Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.606769 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/593745e8-10e2-486a-8a32-9e2dc766bc55-host-var-lib-cni-multus\") pod \"multus-fcz75\" (UID: \"593745e8-10e2-486a-8a32-9e2dc766bc55\") " pod="openshift-multus/multus-fcz75" Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.606791 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/593745e8-10e2-486a-8a32-9e2dc766bc55-host-run-netns\") pod \"multus-fcz75\" (UID: \"593745e8-10e2-486a-8a32-9e2dc766bc55\") " pod="openshift-multus/multus-fcz75" Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.606812 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6cfbac91-e798-4e5e-9f3c-f454ea6f457e-proxy-tls\") pod \"machine-config-daemon-gbt7s\" (UID: \"6cfbac91-e798-4e5e-9f3c-f454ea6f457e\") " pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.606832 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/593745e8-10e2-486a-8a32-9e2dc766bc55-cnibin\") pod \"multus-fcz75\" (UID: \"593745e8-10e2-486a-8a32-9e2dc766bc55\") " pod="openshift-multus/multus-fcz75" Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.606863 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-srhfd\" (UniqueName: \"kubernetes.io/projected/593745e8-10e2-486a-8a32-9e2dc766bc55-kube-api-access-srhfd\") pod \"multus-fcz75\" (UID: \"593745e8-10e2-486a-8a32-9e2dc766bc55\") " pod="openshift-multus/multus-fcz75" Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.606884 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/795e40f3-9c75-4b61-9d94-ca0818875b9f-os-release\") pod \"multus-additional-cni-plugins-4wnpl\" (UID: \"795e40f3-9c75-4b61-9d94-ca0818875b9f\") " pod="openshift-multus/multus-additional-cni-plugins-4wnpl" Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.606904 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/593745e8-10e2-486a-8a32-9e2dc766bc55-os-release\") pod \"multus-fcz75\" (UID: \"593745e8-10e2-486a-8a32-9e2dc766bc55\") " pod="openshift-multus/multus-fcz75" Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.606922 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/593745e8-10e2-486a-8a32-9e2dc766bc55-multus-cni-dir\") pod \"multus-fcz75\" (UID: \"593745e8-10e2-486a-8a32-9e2dc766bc55\") " pod="openshift-multus/multus-fcz75" Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.606941 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/593745e8-10e2-486a-8a32-9e2dc766bc55-cni-binary-copy\") pod \"multus-fcz75\" (UID: 
\"593745e8-10e2-486a-8a32-9e2dc766bc55\") " pod="openshift-multus/multus-fcz75" Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.606960 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/593745e8-10e2-486a-8a32-9e2dc766bc55-etc-kubernetes\") pod \"multus-fcz75\" (UID: \"593745e8-10e2-486a-8a32-9e2dc766bc55\") " pod="openshift-multus/multus-fcz75" Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.606995 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/795e40f3-9c75-4b61-9d94-ca0818875b9f-cnibin\") pod \"multus-additional-cni-plugins-4wnpl\" (UID: \"795e40f3-9c75-4b61-9d94-ca0818875b9f\") " pod="openshift-multus/multus-additional-cni-plugins-4wnpl" Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.607015 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/795e40f3-9c75-4b61-9d94-ca0818875b9f-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-4wnpl\" (UID: \"795e40f3-9c75-4b61-9d94-ca0818875b9f\") " pod="openshift-multus/multus-additional-cni-plugins-4wnpl" Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.607040 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/593745e8-10e2-486a-8a32-9e2dc766bc55-host-var-lib-kubelet\") pod \"multus-fcz75\" (UID: \"593745e8-10e2-486a-8a32-9e2dc766bc55\") " pod="openshift-multus/multus-fcz75" Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.607113 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/593745e8-10e2-486a-8a32-9e2dc766bc55-host-var-lib-kubelet\") pod \"multus-fcz75\" (UID: \"593745e8-10e2-486a-8a32-9e2dc766bc55\") " pod="openshift-multus/multus-fcz75" Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.607168 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/593745e8-10e2-486a-8a32-9e2dc766bc55-system-cni-dir\") pod \"multus-fcz75\" (UID: \"593745e8-10e2-486a-8a32-9e2dc766bc55\") " pod="openshift-multus/multus-fcz75" Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.607208 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/593745e8-10e2-486a-8a32-9e2dc766bc55-multus-socket-dir-parent\") pod \"multus-fcz75\" (UID: \"593745e8-10e2-486a-8a32-9e2dc766bc55\") " pod="openshift-multus/multus-fcz75" Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.607239 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/593745e8-10e2-486a-8a32-9e2dc766bc55-host-run-multus-certs\") pod \"multus-fcz75\" (UID: \"593745e8-10e2-486a-8a32-9e2dc766bc55\") " pod="openshift-multus/multus-fcz75" Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.607266 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/795e40f3-9c75-4b61-9d94-ca0818875b9f-system-cni-dir\") pod \"multus-additional-cni-plugins-4wnpl\" (UID: \"795e40f3-9c75-4b61-9d94-ca0818875b9f\") " pod="openshift-multus/multus-additional-cni-plugins-4wnpl" Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 
13:51:14.607294 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/593745e8-10e2-486a-8a32-9e2dc766bc55-multus-conf-dir\") pod \"multus-fcz75\" (UID: \"593745e8-10e2-486a-8a32-9e2dc766bc55\") " pod="openshift-multus/multus-fcz75" Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.607884 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/593745e8-10e2-486a-8a32-9e2dc766bc55-host-run-netns\") pod \"multus-fcz75\" (UID: \"593745e8-10e2-486a-8a32-9e2dc766bc55\") " pod="openshift-multus/multus-fcz75" Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.607940 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/593745e8-10e2-486a-8a32-9e2dc766bc55-multus-daemon-config\") pod \"multus-fcz75\" (UID: \"593745e8-10e2-486a-8a32-9e2dc766bc55\") " pod="openshift-multus/multus-fcz75" Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.608213 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/593745e8-10e2-486a-8a32-9e2dc766bc55-host-run-k8s-cni-cncf-io\") pod \"multus-fcz75\" (UID: \"593745e8-10e2-486a-8a32-9e2dc766bc55\") " pod="openshift-multus/multus-fcz75" Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.608254 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/6cfbac91-e798-4e5e-9f3c-f454ea6f457e-rootfs\") pod \"machine-config-daemon-gbt7s\" (UID: \"6cfbac91-e798-4e5e-9f3c-f454ea6f457e\") " pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.608280 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/593745e8-10e2-486a-8a32-9e2dc766bc55-hostroot\") pod \"multus-fcz75\" (UID: \"593745e8-10e2-486a-8a32-9e2dc766bc55\") " pod="openshift-multus/multus-fcz75" Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.608283 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/795e40f3-9c75-4b61-9d94-ca0818875b9f-tuning-conf-dir\") pod \"multus-additional-cni-plugins-4wnpl\" (UID: \"795e40f3-9c75-4b61-9d94-ca0818875b9f\") " pod="openshift-multus/multus-additional-cni-plugins-4wnpl" Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.608303 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/593745e8-10e2-486a-8a32-9e2dc766bc55-host-var-lib-cni-bin\") pod \"multus-fcz75\" (UID: \"593745e8-10e2-486a-8a32-9e2dc766bc55\") " pod="openshift-multus/multus-fcz75" Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.608458 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/593745e8-10e2-486a-8a32-9e2dc766bc55-multus-cni-dir\") pod \"multus-fcz75\" (UID: \"593745e8-10e2-486a-8a32-9e2dc766bc55\") " pod="openshift-multus/multus-fcz75" Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.608559 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/593745e8-10e2-486a-8a32-9e2dc766bc55-cnibin\") pod \"multus-fcz75\" (UID: 
\"593745e8-10e2-486a-8a32-9e2dc766bc55\") " pod="openshift-multus/multus-fcz75" Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.608628 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/795e40f3-9c75-4b61-9d94-ca0818875b9f-os-release\") pod \"multus-additional-cni-plugins-4wnpl\" (UID: \"795e40f3-9c75-4b61-9d94-ca0818875b9f\") " pod="openshift-multus/multus-additional-cni-plugins-4wnpl" Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.608675 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/593745e8-10e2-486a-8a32-9e2dc766bc55-os-release\") pod \"multus-fcz75\" (UID: \"593745e8-10e2-486a-8a32-9e2dc766bc55\") " pod="openshift-multus/multus-fcz75" Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.608686 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/593745e8-10e2-486a-8a32-9e2dc766bc55-etc-kubernetes\") pod \"multus-fcz75\" (UID: \"593745e8-10e2-486a-8a32-9e2dc766bc55\") " pod="openshift-multus/multus-fcz75" Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.608710 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/795e40f3-9c75-4b61-9d94-ca0818875b9f-cnibin\") pod \"multus-additional-cni-plugins-4wnpl\" (UID: \"795e40f3-9c75-4b61-9d94-ca0818875b9f\") " pod="openshift-multus/multus-additional-cni-plugins-4wnpl" Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.608822 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/593745e8-10e2-486a-8a32-9e2dc766bc55-host-var-lib-cni-multus\") pod \"multus-fcz75\" (UID: \"593745e8-10e2-486a-8a32-9e2dc766bc55\") " pod="openshift-multus/multus-fcz75" Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.609134 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/6cfbac91-e798-4e5e-9f3c-f454ea6f457e-mcd-auth-proxy-config\") pod \"machine-config-daemon-gbt7s\" (UID: \"6cfbac91-e798-4e5e-9f3c-f454ea6f457e\") " pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.609315 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/795e40f3-9c75-4b61-9d94-ca0818875b9f-cni-binary-copy\") pod \"multus-additional-cni-plugins-4wnpl\" (UID: \"795e40f3-9c75-4b61-9d94-ca0818875b9f\") " pod="openshift-multus/multus-additional-cni-plugins-4wnpl" Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.609449 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/795e40f3-9c75-4b61-9d94-ca0818875b9f-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-4wnpl\" (UID: \"795e40f3-9c75-4b61-9d94-ca0818875b9f\") " pod="openshift-multus/multus-additional-cni-plugins-4wnpl" Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.609621 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/593745e8-10e2-486a-8a32-9e2dc766bc55-cni-binary-copy\") pod \"multus-fcz75\" (UID: \"593745e8-10e2-486a-8a32-9e2dc766bc55\") " pod="openshift-multus/multus-fcz75" Oct 09 13:51:14 crc 
kubenswrapper[4902]: I1009 13:51:14.612154 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6cfbac91-e798-4e5e-9f3c-f454ea6f457e-proxy-tls\") pod \"machine-config-daemon-gbt7s\" (UID: \"6cfbac91-e798-4e5e-9f3c-f454ea6f457e\") " pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.615846 4902 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fcz75" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"593745e8-10e2-486a-8a32-9e2dc766bc55\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-srhfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T13:51:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fcz75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T13:51:14Z is after 2025-08-24T17:21:41Z" Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.625318 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6csxv\" (UniqueName: \"kubernetes.io/projected/6cfbac91-e798-4e5e-9f3c-f454ea6f457e-kube-api-access-6csxv\") pod \"machine-config-daemon-gbt7s\" (UID: \"6cfbac91-e798-4e5e-9f3c-f454ea6f457e\") " pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.632694 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-srhfd\" (UniqueName: \"kubernetes.io/projected/593745e8-10e2-486a-8a32-9e2dc766bc55-kube-api-access-srhfd\") pod \"multus-fcz75\" (UID: \"593745e8-10e2-486a-8a32-9e2dc766bc55\") " pod="openshift-multus/multus-fcz75" Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.635358 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-crq9n\" (UniqueName: \"kubernetes.io/projected/795e40f3-9c75-4b61-9d94-ca0818875b9f-kube-api-access-crq9n\") pod \"multus-additional-cni-plugins-4wnpl\" (UID: \"795e40f3-9c75-4b61-9d94-ca0818875b9f\") " pod="openshift-multus/multus-additional-cni-plugins-4wnpl" Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.652714 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-fcz75" Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.658616 4902 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6cfbac91-e798-4e5e-9f3c-f454ea6f457e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:14Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:14Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6csxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6csxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T13:51:14Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-gbt7s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T13:51:14Z is after 2025-08-24T17:21:41Z" Oct 09 13:51:14 crc 
kubenswrapper[4902]: I1009 13:51:14.661450 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.672789 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-4wnpl" Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.682102 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.682662 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Oct 09 13:51:14 crc kubenswrapper[4902]: W1009 13:51:14.688033 4902 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6cfbac91_e798_4e5e_9f3c_f454ea6f457e.slice/crio-880aa57325a47b64261316b4ec2404af29e6d5df85cc17914824926cc9dd4619 WatchSource:0}: Error finding container 880aa57325a47b64261316b4ec2404af29e6d5df85cc17914824926cc9dd4619: Status 404 returned error can't find the container with id 880aa57325a47b64261316b4ec2404af29e6d5df85cc17914824926cc9dd4619 Oct 09 13:51:14 crc kubenswrapper[4902]: W1009 13:51:14.689029 4902 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod795e40f3_9c75_4b61_9d94_ca0818875b9f.slice/crio-41f94dc6ffe330df9a59c42e1965c4af58eeb22e2f88368b7822650ee3edf726 WatchSource:0}: Error finding container 41f94dc6ffe330df9a59c42e1965c4af58eeb22e2f88368b7822650ee3edf726: Status 404 returned error can't find the container with id 41f94dc6ffe330df9a59c42e1965c4af58eeb22e2f88368b7822650ee3edf726 Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.689378 4902 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="801afd8fea3db4a9b2864ac031b128c77ffb050da8e948d01378e85a22f075a0" exitCode=255 Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.689437 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"801afd8fea3db4a9b2864ac031b128c77ffb050da8e948d01378e85a22f075a0"} Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.689992 4902 scope.go:117] "RemoveContainer" containerID="7fc4f1f35b8ea736948d193ba4cd302f544ee0a946080a37b1ac68a8256c71f9" Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.692762 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"ad4b6546ae6e4b07e0875a2f33456213069bdfff009d4cf7b241c9d968395dbf"} Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.692789 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"2662da95296fd06458cc257e8d0a2c11c6d87cc0e666a77b672eddc7eab92422"} Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.692803 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"ff97be7f363824b3c7f408cfd4dd0103b22b90d9f384584ab33683924395531f"} Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.705134 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"7b9945f3f4248f5ac0afe6c143a10af8e2ba531f2cf82aff0fb6d4eaadc578f2"} Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.705197 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"c3c4d1a88fcda6aa3617c9b33fe5d94d839ce41581677077fb46155da988aaa9"} Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.707629 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-2cmrx" event={"ID":"b8abd493-cf67-48c1-9bdb-b8e4f16e0b1f","Type":"ContainerStarted","Data":"fe12c38b92e9adb1531eb499a544791d6355278c38a25697fa30cea090517902"} Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.711865 4902 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4cea21e6-d0ed-4fae-b4ac-38001260cc01\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T13:50:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T13:50:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T13:50:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3eade4433c62920ab3fd348633aee95ebe54a5f547b05898dbfeb09da6216ff9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T13:50:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62c6e61d996fc57028ccd9f85284b9cee1b860614daa4255076c11de63811e07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"st
arted\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T13:50:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d34afa6885744750f7be2aca8cac77dd920a434dfd577bbbcd961a4c45595a1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T13:50:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d565a380c28c84e49e3d59625b61ff4c551c11ee4b137b1447005a1127b527cf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T13:50:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T13:50:53Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T13:51:14Z is after 2025-08-24T17:21:41Z" Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.717895 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-4d229" event={"ID":"51bd475b-561d-4140-bea0-6c808e7820e6","Type":"ContainerStarted","Data":"fde43a008c9bd3170272867268dcffa015fe7365cd583fa3e9e4e4571f83d258"} Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.717963 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-4d229" event={"ID":"51bd475b-561d-4140-bea0-6c808e7820e6","Type":"ContainerStarted","Data":"4b0c777b58910f2a568c5e0e7c65c18fe9f7378df42db9b894b0dcbd63600053"} Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.720193 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" 
event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"8bd67cdbc08212891654fad9ee090a35dbf2db4837e6440d67ef7fe2cea27d03"} Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.740580 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.740626 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-jh6wc"] Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.739486 4902 scope.go:117] "RemoveContainer" containerID="801afd8fea3db4a9b2864ac031b128c77ffb050da8e948d01378e85a22f075a0" Oct 09 13:51:14 crc kubenswrapper[4902]: E1009 13:51:14.744350 4902 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.746004 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-jh6wc" Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.764747 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.768037 4902 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:13Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T13:51:14Z is after 2025-08-24T17:21:41Z" Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.782935 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.806603 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.822679 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.840362 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.860320 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.882584 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.909616 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4904c756-7ed4-4719-860f-c6f6458d002c-systemd-units\") pod \"ovnkube-node-jh6wc\" (UID: \"4904c756-7ed4-4719-860f-c6f6458d002c\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-jh6wc" Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.909656 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4904c756-7ed4-4719-860f-c6f6458d002c-run-systemd\") pod \"ovnkube-node-jh6wc\" (UID: \"4904c756-7ed4-4719-860f-c6f6458d002c\") " pod="openshift-ovn-kubernetes/ovnkube-node-jh6wc" Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.909678 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4904c756-7ed4-4719-860f-c6f6458d002c-log-socket\") pod \"ovnkube-node-jh6wc\" (UID: \"4904c756-7ed4-4719-860f-c6f6458d002c\") " pod="openshift-ovn-kubernetes/ovnkube-node-jh6wc" Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.909695 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ghml4\" (UniqueName: \"kubernetes.io/projected/4904c756-7ed4-4719-860f-c6f6458d002c-kube-api-access-ghml4\") pod \"ovnkube-node-jh6wc\" (UID: \"4904c756-7ed4-4719-860f-c6f6458d002c\") " pod="openshift-ovn-kubernetes/ovnkube-node-jh6wc" Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.909716 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4904c756-7ed4-4719-860f-c6f6458d002c-host-cni-netd\") pod \"ovnkube-node-jh6wc\" (UID: \"4904c756-7ed4-4719-860f-c6f6458d002c\") " pod="openshift-ovn-kubernetes/ovnkube-node-jh6wc" Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.909730 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4904c756-7ed4-4719-860f-c6f6458d002c-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-jh6wc\" (UID: \"4904c756-7ed4-4719-860f-c6f6458d002c\") " pod="openshift-ovn-kubernetes/ovnkube-node-jh6wc" Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.909775 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4904c756-7ed4-4719-860f-c6f6458d002c-run-openvswitch\") pod \"ovnkube-node-jh6wc\" (UID: \"4904c756-7ed4-4719-860f-c6f6458d002c\") " pod="openshift-ovn-kubernetes/ovnkube-node-jh6wc" Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.909798 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4904c756-7ed4-4719-860f-c6f6458d002c-ovn-node-metrics-cert\") pod \"ovnkube-node-jh6wc\" (UID: \"4904c756-7ed4-4719-860f-c6f6458d002c\") " pod="openshift-ovn-kubernetes/ovnkube-node-jh6wc" Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.909814 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4904c756-7ed4-4719-860f-c6f6458d002c-host-kubelet\") pod \"ovnkube-node-jh6wc\" (UID: \"4904c756-7ed4-4719-860f-c6f6458d002c\") " pod="openshift-ovn-kubernetes/ovnkube-node-jh6wc" Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.909847 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/4904c756-7ed4-4719-860f-c6f6458d002c-run-ovn\") pod \"ovnkube-node-jh6wc\" (UID: \"4904c756-7ed4-4719-860f-c6f6458d002c\") " pod="openshift-ovn-kubernetes/ovnkube-node-jh6wc" Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.909869 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4904c756-7ed4-4719-860f-c6f6458d002c-host-run-ovn-kubernetes\") pod \"ovnkube-node-jh6wc\" (UID: \"4904c756-7ed4-4719-860f-c6f6458d002c\") " pod="openshift-ovn-kubernetes/ovnkube-node-jh6wc" Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.909891 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4904c756-7ed4-4719-860f-c6f6458d002c-host-cni-bin\") pod \"ovnkube-node-jh6wc\" (UID: \"4904c756-7ed4-4719-860f-c6f6458d002c\") " pod="openshift-ovn-kubernetes/ovnkube-node-jh6wc" Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.909912 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4904c756-7ed4-4719-860f-c6f6458d002c-etc-openvswitch\") pod \"ovnkube-node-jh6wc\" (UID: \"4904c756-7ed4-4719-860f-c6f6458d002c\") " pod="openshift-ovn-kubernetes/ovnkube-node-jh6wc" Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.909932 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4904c756-7ed4-4719-860f-c6f6458d002c-node-log\") pod \"ovnkube-node-jh6wc\" (UID: \"4904c756-7ed4-4719-860f-c6f6458d002c\") " pod="openshift-ovn-kubernetes/ovnkube-node-jh6wc" Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.909947 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4904c756-7ed4-4719-860f-c6f6458d002c-env-overrides\") pod \"ovnkube-node-jh6wc\" (UID: \"4904c756-7ed4-4719-860f-c6f6458d002c\") " pod="openshift-ovn-kubernetes/ovnkube-node-jh6wc" Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.909962 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4904c756-7ed4-4719-860f-c6f6458d002c-var-lib-openvswitch\") pod \"ovnkube-node-jh6wc\" (UID: \"4904c756-7ed4-4719-860f-c6f6458d002c\") " pod="openshift-ovn-kubernetes/ovnkube-node-jh6wc" Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.909977 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4904c756-7ed4-4719-860f-c6f6458d002c-ovnkube-config\") pod \"ovnkube-node-jh6wc\" (UID: \"4904c756-7ed4-4719-860f-c6f6458d002c\") " pod="openshift-ovn-kubernetes/ovnkube-node-jh6wc" Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.910016 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4904c756-7ed4-4719-860f-c6f6458d002c-host-slash\") pod \"ovnkube-node-jh6wc\" (UID: \"4904c756-7ed4-4719-860f-c6f6458d002c\") " pod="openshift-ovn-kubernetes/ovnkube-node-jh6wc" Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.910035 4902 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4904c756-7ed4-4719-860f-c6f6458d002c-ovnkube-script-lib\") pod \"ovnkube-node-jh6wc\" (UID: \"4904c756-7ed4-4719-860f-c6f6458d002c\") " pod="openshift-ovn-kubernetes/ovnkube-node-jh6wc" Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.910052 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4904c756-7ed4-4719-860f-c6f6458d002c-host-run-netns\") pod \"ovnkube-node-jh6wc\" (UID: \"4904c756-7ed4-4719-860f-c6f6458d002c\") " pod="openshift-ovn-kubernetes/ovnkube-node-jh6wc" Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.932459 4902 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4wnpl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"795e40f3-9c75-4b61-9d94-ca0818875b9f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:14Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-crq9n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-crq9n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-crq9n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-crq9n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-crq9n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-crq9n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-crq9n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T13:51:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4wnpl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T13:51:14Z is after 2025-08-24T17:21:41Z" Oct 09 13:51:14 crc kubenswrapper[4902]: I1009 13:51:14.967827 4902 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:13Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T13:51:14Z is after 2025-08-24T17:21:41Z" Oct 09 13:51:15 crc kubenswrapper[4902]: I1009 13:51:15.010166 4902 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4d229" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"51bd475b-561d-4140-bea0-6c808e7820e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:13Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4jdd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T13:51:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4d229\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T13:51:15Z is after 2025-08-24T17:21:41Z" Oct 09 13:51:15 crc kubenswrapper[4902]: I1009 13:51:15.010451 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 13:51:15 crc kubenswrapper[4902]: I1009 13:51:15.010518 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4904c756-7ed4-4719-860f-c6f6458d002c-log-socket\") pod \"ovnkube-node-jh6wc\" (UID: \"4904c756-7ed4-4719-860f-c6f6458d002c\") " pod="openshift-ovn-kubernetes/ovnkube-node-jh6wc" Oct 09 13:51:15 crc kubenswrapper[4902]: I1009 13:51:15.010537 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ghml4\" (UniqueName: \"kubernetes.io/projected/4904c756-7ed4-4719-860f-c6f6458d002c-kube-api-access-ghml4\") pod \"ovnkube-node-jh6wc\" (UID: \"4904c756-7ed4-4719-860f-c6f6458d002c\") " pod="openshift-ovn-kubernetes/ovnkube-node-jh6wc" Oct 09 13:51:15 crc kubenswrapper[4902]: I1009 13:51:15.010566 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4904c756-7ed4-4719-860f-c6f6458d002c-systemd-units\") pod \"ovnkube-node-jh6wc\" (UID: \"4904c756-7ed4-4719-860f-c6f6458d002c\") " pod="openshift-ovn-kubernetes/ovnkube-node-jh6wc" Oct 09 13:51:15 crc kubenswrapper[4902]: I1009 13:51:15.010585 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4904c756-7ed4-4719-860f-c6f6458d002c-run-systemd\") pod \"ovnkube-node-jh6wc\" (UID: \"4904c756-7ed4-4719-860f-c6f6458d002c\") " pod="openshift-ovn-kubernetes/ovnkube-node-jh6wc" Oct 09 13:51:15 crc kubenswrapper[4902]: I1009 13:51:15.010610 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4904c756-7ed4-4719-860f-c6f6458d002c-run-openvswitch\") pod \"ovnkube-node-jh6wc\" (UID: \"4904c756-7ed4-4719-860f-c6f6458d002c\") " pod="openshift-ovn-kubernetes/ovnkube-node-jh6wc" Oct 09 13:51:15 crc kubenswrapper[4902]: I1009 13:51:15.010624 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4904c756-7ed4-4719-860f-c6f6458d002c-host-cni-netd\") pod \"ovnkube-node-jh6wc\" (UID: \"4904c756-7ed4-4719-860f-c6f6458d002c\") " pod="openshift-ovn-kubernetes/ovnkube-node-jh6wc" Oct 09 13:51:15 crc kubenswrapper[4902]: I1009 13:51:15.010646 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4904c756-7ed4-4719-860f-c6f6458d002c-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-jh6wc\" (UID: \"4904c756-7ed4-4719-860f-c6f6458d002c\") " pod="openshift-ovn-kubernetes/ovnkube-node-jh6wc" Oct 09 13:51:15 crc kubenswrapper[4902]: I1009 13:51:15.010667 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4904c756-7ed4-4719-860f-c6f6458d002c-ovn-node-metrics-cert\") pod \"ovnkube-node-jh6wc\" (UID: \"4904c756-7ed4-4719-860f-c6f6458d002c\") " pod="openshift-ovn-kubernetes/ovnkube-node-jh6wc" Oct 09 13:51:15 crc kubenswrapper[4902]: I1009 13:51:15.010690 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4904c756-7ed4-4719-860f-c6f6458d002c-host-kubelet\") pod \"ovnkube-node-jh6wc\" (UID: \"4904c756-7ed4-4719-860f-c6f6458d002c\") " pod="openshift-ovn-kubernetes/ovnkube-node-jh6wc" Oct 09 13:51:15 crc kubenswrapper[4902]: I1009 13:51:15.010712 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4904c756-7ed4-4719-860f-c6f6458d002c-run-ovn\") pod \"ovnkube-node-jh6wc\" (UID: \"4904c756-7ed4-4719-860f-c6f6458d002c\") " pod="openshift-ovn-kubernetes/ovnkube-node-jh6wc" Oct 09 13:51:15 crc kubenswrapper[4902]: I1009 13:51:15.010731 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4904c756-7ed4-4719-860f-c6f6458d002c-host-run-ovn-kubernetes\") pod \"ovnkube-node-jh6wc\" (UID: \"4904c756-7ed4-4719-860f-c6f6458d002c\") " pod="openshift-ovn-kubernetes/ovnkube-node-jh6wc" Oct 09 13:51:15 crc kubenswrapper[4902]: I1009 13:51:15.010749 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4904c756-7ed4-4719-860f-c6f6458d002c-host-cni-bin\") pod \"ovnkube-node-jh6wc\" (UID: \"4904c756-7ed4-4719-860f-c6f6458d002c\") " pod="openshift-ovn-kubernetes/ovnkube-node-jh6wc" Oct 09 13:51:15 crc kubenswrapper[4902]: I1009 13:51:15.010764 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4904c756-7ed4-4719-860f-c6f6458d002c-etc-openvswitch\") pod \"ovnkube-node-jh6wc\" (UID: \"4904c756-7ed4-4719-860f-c6f6458d002c\") " pod="openshift-ovn-kubernetes/ovnkube-node-jh6wc" Oct 09 13:51:15 crc kubenswrapper[4902]: I1009 13:51:15.010779 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: 
\"kubernetes.io/host-path/4904c756-7ed4-4719-860f-c6f6458d002c-node-log\") pod \"ovnkube-node-jh6wc\" (UID: \"4904c756-7ed4-4719-860f-c6f6458d002c\") " pod="openshift-ovn-kubernetes/ovnkube-node-jh6wc" Oct 09 13:51:15 crc kubenswrapper[4902]: I1009 13:51:15.010793 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4904c756-7ed4-4719-860f-c6f6458d002c-env-overrides\") pod \"ovnkube-node-jh6wc\" (UID: \"4904c756-7ed4-4719-860f-c6f6458d002c\") " pod="openshift-ovn-kubernetes/ovnkube-node-jh6wc" Oct 09 13:51:15 crc kubenswrapper[4902]: I1009 13:51:15.010820 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4904c756-7ed4-4719-860f-c6f6458d002c-var-lib-openvswitch\") pod \"ovnkube-node-jh6wc\" (UID: \"4904c756-7ed4-4719-860f-c6f6458d002c\") " pod="openshift-ovn-kubernetes/ovnkube-node-jh6wc" Oct 09 13:51:15 crc kubenswrapper[4902]: I1009 13:51:15.010835 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4904c756-7ed4-4719-860f-c6f6458d002c-ovnkube-config\") pod \"ovnkube-node-jh6wc\" (UID: \"4904c756-7ed4-4719-860f-c6f6458d002c\") " pod="openshift-ovn-kubernetes/ovnkube-node-jh6wc" Oct 09 13:51:15 crc kubenswrapper[4902]: I1009 13:51:15.010851 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 13:51:15 crc kubenswrapper[4902]: I1009 13:51:15.010865 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4904c756-7ed4-4719-860f-c6f6458d002c-host-slash\") pod \"ovnkube-node-jh6wc\" (UID: \"4904c756-7ed4-4719-860f-c6f6458d002c\") " pod="openshift-ovn-kubernetes/ovnkube-node-jh6wc" Oct 09 13:51:15 crc kubenswrapper[4902]: I1009 13:51:15.010881 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 13:51:15 crc kubenswrapper[4902]: I1009 13:51:15.010901 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4904c756-7ed4-4719-860f-c6f6458d002c-ovnkube-script-lib\") pod \"ovnkube-node-jh6wc\" (UID: \"4904c756-7ed4-4719-860f-c6f6458d002c\") " pod="openshift-ovn-kubernetes/ovnkube-node-jh6wc" Oct 09 13:51:15 crc kubenswrapper[4902]: I1009 13:51:15.010922 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4904c756-7ed4-4719-860f-c6f6458d002c-host-run-netns\") pod \"ovnkube-node-jh6wc\" (UID: \"4904c756-7ed4-4719-860f-c6f6458d002c\") " pod="openshift-ovn-kubernetes/ovnkube-node-jh6wc" Oct 09 13:51:15 crc kubenswrapper[4902]: I1009 13:51:15.010982 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4904c756-7ed4-4719-860f-c6f6458d002c-host-run-netns\") pod \"ovnkube-node-jh6wc\" (UID: \"4904c756-7ed4-4719-860f-c6f6458d002c\") " pod="openshift-ovn-kubernetes/ovnkube-node-jh6wc" Oct 09 13:51:15 crc kubenswrapper[4902]: E1009 13:51:15.011053 4902 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 13:51:17.011039489 +0000 UTC m=+24.208898553 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 13:51:15 crc kubenswrapper[4902]: I1009 13:51:15.011075 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4904c756-7ed4-4719-860f-c6f6458d002c-log-socket\") pod \"ovnkube-node-jh6wc\" (UID: \"4904c756-7ed4-4719-860f-c6f6458d002c\") " pod="openshift-ovn-kubernetes/ovnkube-node-jh6wc" Oct 09 13:51:15 crc kubenswrapper[4902]: I1009 13:51:15.011182 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4904c756-7ed4-4719-860f-c6f6458d002c-systemd-units\") pod \"ovnkube-node-jh6wc\" (UID: \"4904c756-7ed4-4719-860f-c6f6458d002c\") " pod="openshift-ovn-kubernetes/ovnkube-node-jh6wc" Oct 09 13:51:15 crc kubenswrapper[4902]: I1009 13:51:15.011206 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4904c756-7ed4-4719-860f-c6f6458d002c-run-systemd\") pod \"ovnkube-node-jh6wc\" (UID: \"4904c756-7ed4-4719-860f-c6f6458d002c\") " pod="openshift-ovn-kubernetes/ovnkube-node-jh6wc" Oct 09 13:51:15 crc kubenswrapper[4902]: I1009 13:51:15.011227 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4904c756-7ed4-4719-860f-c6f6458d002c-run-openvswitch\") pod \"ovnkube-node-jh6wc\" (UID: \"4904c756-7ed4-4719-860f-c6f6458d002c\") " pod="openshift-ovn-kubernetes/ovnkube-node-jh6wc" Oct 09 13:51:15 crc kubenswrapper[4902]: I1009 13:51:15.011247 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4904c756-7ed4-4719-860f-c6f6458d002c-host-cni-netd\") pod \"ovnkube-node-jh6wc\" (UID: \"4904c756-7ed4-4719-860f-c6f6458d002c\") " pod="openshift-ovn-kubernetes/ovnkube-node-jh6wc" Oct 09 13:51:15 crc kubenswrapper[4902]: I1009 13:51:15.011268 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4904c756-7ed4-4719-860f-c6f6458d002c-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-jh6wc\" (UID: \"4904c756-7ed4-4719-860f-c6f6458d002c\") " pod="openshift-ovn-kubernetes/ovnkube-node-jh6wc" Oct 09 13:51:15 crc kubenswrapper[4902]: I1009 13:51:15.012078 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/4904c756-7ed4-4719-860f-c6f6458d002c-host-run-ovn-kubernetes\") pod \"ovnkube-node-jh6wc\" (UID: \"4904c756-7ed4-4719-860f-c6f6458d002c\") " pod="openshift-ovn-kubernetes/ovnkube-node-jh6wc" Oct 09 13:51:15 crc kubenswrapper[4902]: I1009 13:51:15.012117 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4904c756-7ed4-4719-860f-c6f6458d002c-env-overrides\") pod \"ovnkube-node-jh6wc\" (UID: \"4904c756-7ed4-4719-860f-c6f6458d002c\") " pod="openshift-ovn-kubernetes/ovnkube-node-jh6wc" Oct 09 13:51:15 crc kubenswrapper[4902]: I1009 13:51:15.012126 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4904c756-7ed4-4719-860f-c6f6458d002c-host-cni-bin\") pod \"ovnkube-node-jh6wc\" (UID: \"4904c756-7ed4-4719-860f-c6f6458d002c\") " pod="openshift-ovn-kubernetes/ovnkube-node-jh6wc" Oct 09 13:51:15 crc kubenswrapper[4902]: I1009 13:51:15.012165 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4904c756-7ed4-4719-860f-c6f6458d002c-run-ovn\") pod \"ovnkube-node-jh6wc\" (UID: \"4904c756-7ed4-4719-860f-c6f6458d002c\") " pod="openshift-ovn-kubernetes/ovnkube-node-jh6wc" Oct 09 13:51:15 crc kubenswrapper[4902]: I1009 13:51:15.012078 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4904c756-7ed4-4719-860f-c6f6458d002c-etc-openvswitch\") pod \"ovnkube-node-jh6wc\" (UID: \"4904c756-7ed4-4719-860f-c6f6458d002c\") " pod="openshift-ovn-kubernetes/ovnkube-node-jh6wc" Oct 09 13:51:15 crc kubenswrapper[4902]: E1009 13:51:15.012090 4902 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 09 13:51:15 crc kubenswrapper[4902]: I1009 13:51:15.012161 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4904c756-7ed4-4719-860f-c6f6458d002c-host-kubelet\") pod \"ovnkube-node-jh6wc\" (UID: \"4904c756-7ed4-4719-860f-c6f6458d002c\") " pod="openshift-ovn-kubernetes/ovnkube-node-jh6wc" Oct 09 13:51:15 crc kubenswrapper[4902]: I1009 13:51:15.012126 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4904c756-7ed4-4719-860f-c6f6458d002c-host-slash\") pod \"ovnkube-node-jh6wc\" (UID: \"4904c756-7ed4-4719-860f-c6f6458d002c\") " pod="openshift-ovn-kubernetes/ovnkube-node-jh6wc" Oct 09 13:51:15 crc kubenswrapper[4902]: I1009 13:51:15.012191 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4904c756-7ed4-4719-860f-c6f6458d002c-node-log\") pod \"ovnkube-node-jh6wc\" (UID: \"4904c756-7ed4-4719-860f-c6f6458d002c\") " pod="openshift-ovn-kubernetes/ovnkube-node-jh6wc" Oct 09 13:51:15 crc kubenswrapper[4902]: E1009 13:51:15.012263 4902 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-09 13:51:17.012240454 +0000 UTC m=+24.210099588 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 09 13:51:15 crc kubenswrapper[4902]: E1009 13:51:15.012099 4902 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 09 13:51:15 crc kubenswrapper[4902]: E1009 13:51:15.012318 4902 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-09 13:51:17.012304436 +0000 UTC m=+24.210163580 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 09 13:51:15 crc kubenswrapper[4902]: I1009 13:51:15.012428 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4904c756-7ed4-4719-860f-c6f6458d002c-var-lib-openvswitch\") pod \"ovnkube-node-jh6wc\" (UID: \"4904c756-7ed4-4719-860f-c6f6458d002c\") " pod="openshift-ovn-kubernetes/ovnkube-node-jh6wc" Oct 09 13:51:15 crc kubenswrapper[4902]: I1009 13:51:15.012663 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4904c756-7ed4-4719-860f-c6f6458d002c-ovnkube-script-lib\") pod \"ovnkube-node-jh6wc\" (UID: \"4904c756-7ed4-4719-860f-c6f6458d002c\") " pod="openshift-ovn-kubernetes/ovnkube-node-jh6wc" Oct 09 13:51:15 crc kubenswrapper[4902]: I1009 13:51:15.012883 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4904c756-7ed4-4719-860f-c6f6458d002c-ovnkube-config\") pod \"ovnkube-node-jh6wc\" (UID: \"4904c756-7ed4-4719-860f-c6f6458d002c\") " pod="openshift-ovn-kubernetes/ovnkube-node-jh6wc" Oct 09 13:51:15 crc kubenswrapper[4902]: I1009 13:51:15.067433 4902 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2cmrx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b8abd493-cf67-48c1-9bdb-b8e4f16e0b1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:13Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2b276\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T13:51:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2cmrx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T13:51:15Z is after 2025-08-24T17:21:41Z" Oct 09 13:51:15 crc kubenswrapper[4902]: I1009 13:51:15.110015 4902 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T13:51:15Z is after 2025-08-24T17:21:41Z" Oct 09 13:51:15 crc kubenswrapper[4902]: I1009 13:51:15.112351 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 13:51:15 crc kubenswrapper[4902]: I1009 13:51:15.112673 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 13:51:15 crc kubenswrapper[4902]: E1009 13:51:15.112614 4902 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 09 13:51:15 crc kubenswrapper[4902]: E1009 13:51:15.113032 4902 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 09 13:51:15 crc kubenswrapper[4902]: E1009 13:51:15.113148 4902 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 09 13:51:15 crc kubenswrapper[4902]: E1009 13:51:15.113284 4902 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-09 13:51:17.113267504 +0000 UTC m=+24.311126568 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 09 13:51:15 crc kubenswrapper[4902]: E1009 13:51:15.112852 4902 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 09 13:51:15 crc kubenswrapper[4902]: E1009 13:51:15.114028 4902 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 09 13:51:15 crc kubenswrapper[4902]: E1009 13:51:15.114091 4902 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 09 13:51:15 crc kubenswrapper[4902]: E1009 13:51:15.114190 4902 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-09 13:51:17.114178521 +0000 UTC m=+24.312037585 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 09 13:51:15 crc kubenswrapper[4902]: I1009 13:51:15.113957 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4904c756-7ed4-4719-860f-c6f6458d002c-ovn-node-metrics-cert\") pod \"ovnkube-node-jh6wc\" (UID: \"4904c756-7ed4-4719-860f-c6f6458d002c\") " pod="openshift-ovn-kubernetes/ovnkube-node-jh6wc" Oct 09 13:51:15 crc kubenswrapper[4902]: I1009 13:51:15.114337 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ghml4\" (UniqueName: \"kubernetes.io/projected/4904c756-7ed4-4719-860f-c6f6458d002c-kube-api-access-ghml4\") pod \"ovnkube-node-jh6wc\" (UID: \"4904c756-7ed4-4719-860f-c6f6458d002c\") " pod="openshift-ovn-kubernetes/ovnkube-node-jh6wc" Oct 09 13:51:15 crc kubenswrapper[4902]: I1009 13:51:15.151103 4902 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"de4e01a6-0264-4986-93d7-994fb745add8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T13:50:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T13:50:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T13:50:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T13:50:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T13:50:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee1d2a158f73255d3f748df9054d82134fac0cf488214dd4f6356a2f1735fa98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T13:50:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ef6f7109f7fbc85c4ca76ab10ab6bd6d1e468603d4a9e574860cf87c9131516\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T13:50:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33afc13cb49c16b4bb2da343a51e2f2a2bea347937b5406853bc668f5e7bebf4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T13:50:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://801afd8fea3db4a9b2864ac031b128c77ffb050da8e948d01378e85a22f075a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fc4f1f35b8ea736948d193ba4cd302f544ee0a946080a37b1ac68a8256c71f9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-09T13:51:07Z\\\",\\\"message\\\":\\\"W1009 13:50:56.657522 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1009 13:50:56.657826 1 crypto.go:601] Generating new CA for check-endpoints-signer@1760017856 cert, and key in /tmp/serving-cert-2799461128/serving-signer.crt, /tmp/serving-cert-2799461128/serving-signer.key\\\\nI1009 13:50:57.003887 1 observer_polling.go:159] Starting file observer\\\\nW1009 13:50:57.007531 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1009 13:50:57.007765 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1009 13:50:57.010581 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2799461128/tls.crt::/tmp/serving-cert-2799461128/tls.key\\\\\\\"\\\\nF1009 13:51:07.393604 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T13:50:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://801afd8fea3db4a9b2864ac031b128c77ffb050da8e948d01378e85a22f075a0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-09T13:51:13Z\\\",\\\"message\\\":\\\"file observer\\\\nW1009 13:51:13.363435 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1009 13:51:13.363597 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1009 13:51:13.364243 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-402438471/tls.crt::/tmp/serving-cert-402438471/tls.key\\\\\\\"\\\\nI1009 13:51:13.637034 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1009 13:51:13.651876 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1009 13:51:13.651902 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1009 13:51:13.651926 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1009 13:51:13.651932 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating 
requests\\\\\\\" limit=200\\\\nI1009 13:51:13.667163 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1009 13:51:13.667191 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 13:51:13.667195 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 13:51:13.667199 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1009 13:51:13.667203 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1009 13:51:13.667206 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1009 13:51:13.667208 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1009 13:51:13.667371 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1009 13:51:13.683663 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T13:51:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://068eb68236db7e1816c9050c84c73b443303d5b33aeb3903d535713e206ea81b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T13:50:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92329c4c1ce95b34b952697f3a48a0da94ceff26f0bf610b923b00b4e6c632cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92329c4c1ce95b34b952697f3a48a0da94ceff26f0bf610b923b00b4e6c632cf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T13:50:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T13:50:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T13:50:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T13:51:15Z is after 2025-08-24T17:21:41Z" Oct 09 
13:51:15 crc kubenswrapper[4902]: I1009 13:51:15.199099 4902 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4254914b-a289-4953-a5c4-4a33bf1eda40\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T13:50:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T13:50:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T13:50:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98dc7fe9418570351647255f78a18bc6f72da3a8edb045968b3adcd91de76eef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T13:50:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4affb65427006cf0c0f0f13bd35be9ae8fd913a866fef32f96db408559b1709\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T13:50:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a02c67b01c587d22709968939f075c03cade5a06e955b7930826811cd10452dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T13:50:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"lo
g-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fea213c0d33a5605cbec42814ccf493a61a06690756ff43e87adf27556456b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T13:50:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55f958eb865e1564197d77e4d3114018876a8dc0a94b1a18ef9c223b08372b20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T13:50:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7cf41ee52bad91ad430c8c1b6c7a32a0484acc01e0d4f5f33dc8abdd36dbc83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7cf41ee52bad91ad430c8c1b6c7a32a0484acc01e0d4f5f33dc8abdd36dbc83\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T13:50:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T13:50:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6c2214871f6eecfacd86a956b2bb962815374eb365e6dc4ff4c053d716ac33b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6c2214871f6eecfacd86a956b2bb962815374eb365e6dc4ff4c053d716ac33b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T13:50:55Z\\\",\\\"reas
on\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T13:50:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7be9fb4fa25ebe19bbc13a915cb004341bb51547fe20a014c59da04431e25484\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7be9fb4fa25ebe19bbc13a915cb004341bb51547fe20a014c59da04431e25484\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T13:50:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T13:50:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T13:50:53Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T13:51:15Z is after 2025-08-24T17:21:41Z" Oct 09 13:51:15 crc kubenswrapper[4902]: I1009 13:51:15.229811 4902 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T13:51:15Z is after 2025-08-24T17:21:41Z" Oct 09 13:51:15 crc kubenswrapper[4902]: I1009 13:51:15.275544 4902 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fcz75" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"593745e8-10e2-486a-8a32-9e2dc766bc55\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-srhfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T13:51:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fcz75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T13:51:15Z is after 2025-08-24T17:21:41Z" Oct 09 13:51:15 crc kubenswrapper[4902]: I1009 13:51:15.308785 4902 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6cfbac91-e798-4e5e-9f3c-f454ea6f457e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:14Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:14Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6csxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6csxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T13:51:14Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-gbt7s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T13:51:15Z is after 2025-08-24T17:21:41Z" Oct 09 13:51:15 crc kubenswrapper[4902]: I1009 13:51:15.353223 4902 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b9945f3f4248f5ac0afe6c143a10af8e2ba531f2cf82aff0fb6d4eaadc578f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T13:51:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T13:51:15Z is after 2025-08-24T17:21:41Z" Oct 09 13:51:15 crc kubenswrapper[4902]: I1009 13:51:15.388945 4902 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T13:51:15Z is after 2025-08-24T17:21:41Z" Oct 09 13:51:15 crc kubenswrapper[4902]: I1009 13:51:15.388972 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-jh6wc" Oct 09 13:51:15 crc kubenswrapper[4902]: W1009 13:51:15.400795 4902 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4904c756_7ed4_4719_860f_c6f6458d002c.slice/crio-22c5291e39f4205f234f684ec8f36d4c77c7b493206767496f3d055781bc956c WatchSource:0}: Error finding container 22c5291e39f4205f234f684ec8f36d4c77c7b493206767496f3d055781bc956c: Status 404 returned error can't find the container with id 22c5291e39f4205f234f684ec8f36d4c77c7b493206767496f3d055781bc956c Oct 09 13:51:15 crc kubenswrapper[4902]: I1009 13:51:15.433895 4902 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jh6wc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4904c756-7ed4-4719-860f-c6f6458d002c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:14Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ghml4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ghml4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ghml4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-ghml4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ghml4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ghml4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ghml4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ghml4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ghml4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T13:51:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jh6wc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T13:51:15Z is after 2025-08-24T17:21:41Z" Oct 09 13:51:15 crc kubenswrapper[4902]: I1009 13:51:15.472891 4902 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4cea21e6-d0ed-4fae-b4ac-38001260cc01\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T13:50:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T13:50:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T13:50:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3eade4433c62920ab3fd348633aee95ebe54a5f547b05898dbfeb09da6216ff9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T13:50:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62c6e61d996fc57028ccd9f85284b9cee1b860614daa4255076c11de63811e07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T13:50:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d34afa6885744750f7be2aca8cac77dd920a434dfd577bbbcd961a4c45595a1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T13:50:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d565a380c28c84e49e3d59625b61ff4c551c11ee4b137b1447005a1127b527cf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T13:50:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T13:50:53Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T13:51:15Z is after 2025-08-24T17:21:41Z" Oct 09 13:51:15 crc kubenswrapper[4902]: I1009 13:51:15.508452 4902 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad4b6546ae6e4b07e0875a2f33456213069bdfff009d4cf7b241c9d968395dbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T13:51:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2662da95296fd06458cc257e8d0a2c11c6d87cc0e666a77b672eddc7eab92422\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482
919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T13:51:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T13:51:15Z is after 2025-08-24T17:21:41Z" Oct 09 13:51:15 crc kubenswrapper[4902]: I1009 13:51:15.514695 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 13:51:15 crc kubenswrapper[4902]: E1009 13:51:15.514878 4902 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 13:51:15 crc kubenswrapper[4902]: I1009 13:51:15.514889 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 13:51:15 crc kubenswrapper[4902]: E1009 13:51:15.514988 4902 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 13:51:15 crc kubenswrapper[4902]: I1009 13:51:15.515108 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 13:51:15 crc kubenswrapper[4902]: E1009 13:51:15.515288 4902 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 13:51:15 crc kubenswrapper[4902]: I1009 13:51:15.552313 4902 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4wnpl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"795e40f3-9c75-4b61-9d94-ca0818875b9f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:14Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-crq9n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-crq9n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{
},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-crq9n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-crq9n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-crq9n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-crq9n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3b
dbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-crq9n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T13:51:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4wnpl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T13:51:15Z is after 2025-08-24T17:21:41Z" Oct 09 13:51:15 crc kubenswrapper[4902]: I1009 13:51:15.591700 4902 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:13Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T13:51:15Z is after 2025-08-24T17:21:41Z" Oct 09 13:51:15 crc kubenswrapper[4902]: I1009 13:51:15.632011 4902 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4d229" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"51bd475b-561d-4140-bea0-6c808e7820e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fde43a008c9bd3170272867268dcffa015fe7365cd583fa3e9e4e4571f83d258\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T13:51:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4jdd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T13:51:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4d229\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T13:51:15Z is after 2025-08-24T17:21:41Z" Oct 09 13:51:15 crc kubenswrapper[4902]: I1009 13:51:15.667755 4902 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2cmrx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b8abd493-cf67-48c1-9bdb-b8e4f16e0b1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:13Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:13Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2b276\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T13:51:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2cmrx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T13:51:15Z is after 2025-08-24T17:21:41Z" Oct 09 13:51:15 crc kubenswrapper[4902]: I1009 13:51:15.723549 4902 generic.go:334] "Generic (PLEG): container finished" podID="795e40f3-9c75-4b61-9d94-ca0818875b9f" containerID="cd31b8a925e85981ceea3660f9f8df069d1cf4d6a89fc223ea18e8be218f9e52" exitCode=0 Oct 09 13:51:15 crc kubenswrapper[4902]: I1009 13:51:15.723636 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4wnpl" event={"ID":"795e40f3-9c75-4b61-9d94-ca0818875b9f","Type":"ContainerDied","Data":"cd31b8a925e85981ceea3660f9f8df069d1cf4d6a89fc223ea18e8be218f9e52"} Oct 09 13:51:15 crc kubenswrapper[4902]: I1009 
13:51:15.723703 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4wnpl" event={"ID":"795e40f3-9c75-4b61-9d94-ca0818875b9f","Type":"ContainerStarted","Data":"41f94dc6ffe330df9a59c42e1965c4af58eeb22e2f88368b7822650ee3edf726"} Oct 09 13:51:15 crc kubenswrapper[4902]: I1009 13:51:15.725021 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" event={"ID":"6cfbac91-e798-4e5e-9f3c-f454ea6f457e","Type":"ContainerStarted","Data":"027bb1ab92bd8ff9e4b24044584aed74aab9130a1f137512fdc91c00ae920c74"} Oct 09 13:51:15 crc kubenswrapper[4902]: I1009 13:51:15.725044 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" event={"ID":"6cfbac91-e798-4e5e-9f3c-f454ea6f457e","Type":"ContainerStarted","Data":"25cb721737318b049ef3fbc91c7fb9450b978d343bb8bb36c3e0257909b5b962"} Oct 09 13:51:15 crc kubenswrapper[4902]: I1009 13:51:15.725053 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" event={"ID":"6cfbac91-e798-4e5e-9f3c-f454ea6f457e","Type":"ContainerStarted","Data":"880aa57325a47b64261316b4ec2404af29e6d5df85cc17914824926cc9dd4619"} Oct 09 13:51:15 crc kubenswrapper[4902]: I1009 13:51:15.728095 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Oct 09 13:51:15 crc kubenswrapper[4902]: I1009 13:51:15.730219 4902 scope.go:117] "RemoveContainer" containerID="801afd8fea3db4a9b2864ac031b128c77ffb050da8e948d01378e85a22f075a0" Oct 09 13:51:15 crc kubenswrapper[4902]: E1009 13:51:15.730433 4902 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Oct 09 13:51:15 crc kubenswrapper[4902]: I1009 13:51:15.730672 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-fcz75" event={"ID":"593745e8-10e2-486a-8a32-9e2dc766bc55","Type":"ContainerStarted","Data":"cf52d4820c410545c40099536788fd8a2655cb2aa1a2fbbb0d38c9afc19eb7b8"} Oct 09 13:51:15 crc kubenswrapper[4902]: I1009 13:51:15.730705 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-fcz75" event={"ID":"593745e8-10e2-486a-8a32-9e2dc766bc55","Type":"ContainerStarted","Data":"8c4c645e0bb1d3ccbee1a8ba66b4f39c7f09d50a7153d4275daba9ba8579b9b6"} Oct 09 13:51:15 crc kubenswrapper[4902]: I1009 13:51:15.732022 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-2cmrx" event={"ID":"b8abd493-cf67-48c1-9bdb-b8e4f16e0b1f","Type":"ContainerStarted","Data":"11b7ada627a87ee278a9c08f845dc146ef44f2c9c2c69eea950b1c68e2afe219"} Oct 09 13:51:15 crc kubenswrapper[4902]: I1009 13:51:15.733838 4902 generic.go:334] "Generic (PLEG): container finished" podID="4904c756-7ed4-4719-860f-c6f6458d002c" containerID="db95b45e4b1291b7db458562623863cad83f96f0dcdc3d2fc71ef5c2a79b1de9" exitCode=0 Oct 09 13:51:15 crc kubenswrapper[4902]: I1009 13:51:15.733910 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jh6wc" 
event={"ID":"4904c756-7ed4-4719-860f-c6f6458d002c","Type":"ContainerDied","Data":"db95b45e4b1291b7db458562623863cad83f96f0dcdc3d2fc71ef5c2a79b1de9"} Oct 09 13:51:15 crc kubenswrapper[4902]: I1009 13:51:15.733937 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jh6wc" event={"ID":"4904c756-7ed4-4719-860f-c6f6458d002c","Type":"ContainerStarted","Data":"22c5291e39f4205f234f684ec8f36d4c77c7b493206767496f3d055781bc956c"} Oct 09 13:51:15 crc kubenswrapper[4902]: I1009 13:51:15.740290 4902 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4cea21e6-d0ed-4fae-b4ac-38001260cc01\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T13:50:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T13:50:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T13:50:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3eade4433c62920ab3fd348633aee95ebe54a5f547b05898dbfeb09da6216ff9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T13:50:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62c6e61d996fc57028ccd9f85284b9cee1b860614daa4255076c11de63811e07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T13:50:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d34afa6885744750f7be2aca8cac77dd920a434dfd577bbbcd961a4c45595a1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@s
ha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T13:50:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d565a380c28c84e49e3d59625b61ff4c551c11ee4b137b1447005a1127b527cf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T13:50:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T13:50:53Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T13:51:15Z is after 2025-08-24T17:21:41Z" Oct 09 13:51:15 crc kubenswrapper[4902]: I1009 13:51:15.754432 4902 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:14Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:14Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad4b6546ae6e4b07e0875a2f33456213069bdfff009d4cf7b241c9d968395dbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T13:51:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2662da95296fd06458cc257e8d0a2c11c6d87cc0e666a77b672eddc7eab92422\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T13:51:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T13:51:15Z is after 2025-08-24T17:21:41Z" Oct 09 13:51:15 crc kubenswrapper[4902]: I1009 13:51:15.797928 4902 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4wnpl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"795e40f3-9c75-4b61-9d94-ca0818875b9f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:14Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T13:51:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-crq9n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd31b8a925e85981ceea3660f9f8df069d1cf4d6a89fc223ea18e8be218f9e52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd31b8a925e85981ceea3660f9f8df069d1cf4d6a89fc223ea18e8be218f9e52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T13:51:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T13:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-crq9n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c85
7df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-crq9n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-crq9n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-crq9n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-crq9n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-crq9n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T13:51:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4wnpl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T13:51:15Z is after 2025-08-24T17:21:41Z" Oct 09 13:51:15 crc kubenswrapper[4902]: I1009 13:51:15.965604 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-4d229" podStartSLOduration=2.9655810320000002 podStartE2EDuration="2.965581032s" podCreationTimestamp="2025-10-09 13:51:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 13:51:15.963783559 +0000 UTC m=+23.161642623" watchObservedRunningTime="2025-10-09 13:51:15.965581032 +0000 UTC m=+23.163440096" Oct 09 13:51:16 crc kubenswrapper[4902]: I1009 13:51:16.103100 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=4.103075694 podStartE2EDuration="4.103075694s" podCreationTimestamp="2025-10-09 13:51:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 13:51:16.095744899 +0000 UTC m=+23.293603973" watchObservedRunningTime="2025-10-09 13:51:16.103075694 +0000 UTC m=+23.300934758" Oct 09 13:51:16 crc kubenswrapper[4902]: I1009 13:51:16.313908 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-2cmrx" podStartSLOduration=3.3138859419999998 podStartE2EDuration="3.313885942s" podCreationTimestamp="2025-10-09 13:51:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 13:51:16.312857522 +0000 UTC m=+23.510716586" watchObservedRunningTime="2025-10-09 13:51:16.313885942 +0000 UTC m=+23.511745026" Oct 09 13:51:16 crc kubenswrapper[4902]: I1009 13:51:16.353274 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-msbfp"] Oct 09 13:51:16 crc kubenswrapper[4902]: I1009 13:51:16.353800 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-msbfp" Oct 09 13:51:16 crc kubenswrapper[4902]: I1009 13:51:16.360147 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Oct 09 13:51:16 crc kubenswrapper[4902]: I1009 13:51:16.366240 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b84e60b8-b61b-42fb-a7cc-60e335659c2c-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-msbfp\" (UID: \"b84e60b8-b61b-42fb-a7cc-60e335659c2c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-msbfp" Oct 09 13:51:16 crc kubenswrapper[4902]: I1009 13:51:16.366293 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b84e60b8-b61b-42fb-a7cc-60e335659c2c-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-msbfp\" (UID: \"b84e60b8-b61b-42fb-a7cc-60e335659c2c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-msbfp" Oct 09 13:51:16 crc kubenswrapper[4902]: I1009 13:51:16.366314 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dn9n4\" (UniqueName: \"kubernetes.io/projected/b84e60b8-b61b-42fb-a7cc-60e335659c2c-kube-api-access-dn9n4\") pod \"ovnkube-control-plane-749d76644c-msbfp\" (UID: \"b84e60b8-b61b-42fb-a7cc-60e335659c2c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-msbfp" Oct 09 13:51:16 crc kubenswrapper[4902]: I1009 13:51:16.366353 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b84e60b8-b61b-42fb-a7cc-60e335659c2c-env-overrides\") pod \"ovnkube-control-plane-749d76644c-msbfp\" (UID: \"b84e60b8-b61b-42fb-a7cc-60e335659c2c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-msbfp" Oct 09 13:51:16 crc kubenswrapper[4902]: I1009 13:51:16.379916 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Oct 09 13:51:16 crc kubenswrapper[4902]: I1009 13:51:16.388066 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-5tnbn"] Oct 09 13:51:16 crc kubenswrapper[4902]: I1009 13:51:16.388481 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5tnbn" Oct 09 13:51:16 crc kubenswrapper[4902]: E1009 13:51:16.388536 4902 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-5tnbn" podUID="76bff3cb-cf9e-42cc-8b73-846ad6b38202" Oct 09 13:51:16 crc kubenswrapper[4902]: I1009 13:51:16.429190 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-fcz75" podStartSLOduration=3.42916681 podStartE2EDuration="3.42916681s" podCreationTimestamp="2025-10-09 13:51:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 13:51:16.428632994 +0000 UTC m=+23.626492058" watchObservedRunningTime="2025-10-09 13:51:16.42916681 +0000 UTC m=+23.627025884" Oct 09 13:51:16 crc kubenswrapper[4902]: I1009 13:51:16.467493 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b84e60b8-b61b-42fb-a7cc-60e335659c2c-env-overrides\") pod \"ovnkube-control-plane-749d76644c-msbfp\" (UID: \"b84e60b8-b61b-42fb-a7cc-60e335659c2c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-msbfp" Oct 09 13:51:16 crc kubenswrapper[4902]: I1009 13:51:16.467545 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lj7x5\" (UniqueName: \"kubernetes.io/projected/76bff3cb-cf9e-42cc-8b73-846ad6b38202-kube-api-access-lj7x5\") pod \"network-metrics-daemon-5tnbn\" (UID: \"76bff3cb-cf9e-42cc-8b73-846ad6b38202\") " pod="openshift-multus/network-metrics-daemon-5tnbn" Oct 09 13:51:16 crc kubenswrapper[4902]: I1009 13:51:16.467599 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/76bff3cb-cf9e-42cc-8b73-846ad6b38202-metrics-certs\") pod \"network-metrics-daemon-5tnbn\" (UID: \"76bff3cb-cf9e-42cc-8b73-846ad6b38202\") " pod="openshift-multus/network-metrics-daemon-5tnbn" Oct 09 13:51:16 crc kubenswrapper[4902]: I1009 13:51:16.467655 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b84e60b8-b61b-42fb-a7cc-60e335659c2c-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-msbfp\" (UID: \"b84e60b8-b61b-42fb-a7cc-60e335659c2c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-msbfp" Oct 09 13:51:16 crc kubenswrapper[4902]: I1009 13:51:16.467680 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b84e60b8-b61b-42fb-a7cc-60e335659c2c-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-msbfp\" (UID: \"b84e60b8-b61b-42fb-a7cc-60e335659c2c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-msbfp" Oct 09 13:51:16 crc kubenswrapper[4902]: I1009 13:51:16.467704 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dn9n4\" (UniqueName: \"kubernetes.io/projected/b84e60b8-b61b-42fb-a7cc-60e335659c2c-kube-api-access-dn9n4\") pod \"ovnkube-control-plane-749d76644c-msbfp\" (UID: \"b84e60b8-b61b-42fb-a7cc-60e335659c2c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-msbfp" Oct 09 13:51:16 crc kubenswrapper[4902]: I1009 13:51:16.468720 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b84e60b8-b61b-42fb-a7cc-60e335659c2c-env-overrides\") pod \"ovnkube-control-plane-749d76644c-msbfp\" (UID: 
\"b84e60b8-b61b-42fb-a7cc-60e335659c2c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-msbfp" Oct 09 13:51:16 crc kubenswrapper[4902]: I1009 13:51:16.468961 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b84e60b8-b61b-42fb-a7cc-60e335659c2c-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-msbfp\" (UID: \"b84e60b8-b61b-42fb-a7cc-60e335659c2c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-msbfp" Oct 09 13:51:16 crc kubenswrapper[4902]: I1009 13:51:16.471294 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" podStartSLOduration=3.471284274 podStartE2EDuration="3.471284274s" podCreationTimestamp="2025-10-09 13:51:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 13:51:16.470644305 +0000 UTC m=+23.668503389" watchObservedRunningTime="2025-10-09 13:51:16.471284274 +0000 UTC m=+23.669143328" Oct 09 13:51:16 crc kubenswrapper[4902]: I1009 13:51:16.475485 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b84e60b8-b61b-42fb-a7cc-60e335659c2c-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-msbfp\" (UID: \"b84e60b8-b61b-42fb-a7cc-60e335659c2c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-msbfp" Oct 09 13:51:16 crc kubenswrapper[4902]: I1009 13:51:16.497075 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dn9n4\" (UniqueName: \"kubernetes.io/projected/b84e60b8-b61b-42fb-a7cc-60e335659c2c-kube-api-access-dn9n4\") pod \"ovnkube-control-plane-749d76644c-msbfp\" (UID: \"b84e60b8-b61b-42fb-a7cc-60e335659c2c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-msbfp" Oct 09 13:51:16 crc kubenswrapper[4902]: I1009 13:51:16.517820 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-msbfp" Oct 09 13:51:16 crc kubenswrapper[4902]: W1009 13:51:16.529027 4902 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb84e60b8_b61b_42fb_a7cc_60e335659c2c.slice/crio-766be2dbb135fc707a5d579bd1319c0f5e871451d2e950260456da509c5b155e WatchSource:0}: Error finding container 766be2dbb135fc707a5d579bd1319c0f5e871451d2e950260456da509c5b155e: Status 404 returned error can't find the container with id 766be2dbb135fc707a5d579bd1319c0f5e871451d2e950260456da509c5b155e Oct 09 13:51:16 crc kubenswrapper[4902]: I1009 13:51:16.568233 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lj7x5\" (UniqueName: \"kubernetes.io/projected/76bff3cb-cf9e-42cc-8b73-846ad6b38202-kube-api-access-lj7x5\") pod \"network-metrics-daemon-5tnbn\" (UID: \"76bff3cb-cf9e-42cc-8b73-846ad6b38202\") " pod="openshift-multus/network-metrics-daemon-5tnbn" Oct 09 13:51:16 crc kubenswrapper[4902]: I1009 13:51:16.568297 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/76bff3cb-cf9e-42cc-8b73-846ad6b38202-metrics-certs\") pod \"network-metrics-daemon-5tnbn\" (UID: \"76bff3cb-cf9e-42cc-8b73-846ad6b38202\") " pod="openshift-multus/network-metrics-daemon-5tnbn" Oct 09 13:51:16 crc kubenswrapper[4902]: E1009 13:51:16.568571 4902 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 09 13:51:16 crc kubenswrapper[4902]: E1009 13:51:16.568643 4902 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/76bff3cb-cf9e-42cc-8b73-846ad6b38202-metrics-certs podName:76bff3cb-cf9e-42cc-8b73-846ad6b38202 nodeName:}" failed. No retries permitted until 2025-10-09 13:51:17.068622756 +0000 UTC m=+24.266481830 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/76bff3cb-cf9e-42cc-8b73-846ad6b38202-metrics-certs") pod "network-metrics-daemon-5tnbn" (UID: "76bff3cb-cf9e-42cc-8b73-846ad6b38202") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 09 13:51:16 crc kubenswrapper[4902]: I1009 13:51:16.597199 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lj7x5\" (UniqueName: \"kubernetes.io/projected/76bff3cb-cf9e-42cc-8b73-846ad6b38202-kube-api-access-lj7x5\") pod \"network-metrics-daemon-5tnbn\" (UID: \"76bff3cb-cf9e-42cc-8b73-846ad6b38202\") " pod="openshift-multus/network-metrics-daemon-5tnbn" Oct 09 13:51:16 crc kubenswrapper[4902]: I1009 13:51:16.672702 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=4.672684065 podStartE2EDuration="4.672684065s" podCreationTimestamp="2025-10-09 13:51:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 13:51:16.670265964 +0000 UTC m=+23.868125038" watchObservedRunningTime="2025-10-09 13:51:16.672684065 +0000 UTC m=+23.870543139" Oct 09 13:51:16 crc kubenswrapper[4902]: I1009 13:51:16.744039 4902 generic.go:334] "Generic (PLEG): container finished" podID="795e40f3-9c75-4b61-9d94-ca0818875b9f" containerID="e647ed8426a359cc3bb5522d1fef315161beb48022d2deddcf78808d06cf4464" exitCode=0 Oct 09 13:51:16 crc kubenswrapper[4902]: I1009 13:51:16.744108 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4wnpl" event={"ID":"795e40f3-9c75-4b61-9d94-ca0818875b9f","Type":"ContainerDied","Data":"e647ed8426a359cc3bb5522d1fef315161beb48022d2deddcf78808d06cf4464"} Oct 09 13:51:16 crc kubenswrapper[4902]: I1009 13:51:16.750934 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"1dbbab68a8a40264a5a6e6db68bd7fc5ed373797507b5bd29cfef860cf912491"} Oct 09 13:51:16 crc kubenswrapper[4902]: I1009 13:51:16.755845 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jh6wc" event={"ID":"4904c756-7ed4-4719-860f-c6f6458d002c","Type":"ContainerStarted","Data":"37a6498689268b5f29b030534ff3a670c9c396c8eaf55d5cb578c88de813cad0"} Oct 09 13:51:16 crc kubenswrapper[4902]: I1009 13:51:16.755908 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jh6wc" event={"ID":"4904c756-7ed4-4719-860f-c6f6458d002c","Type":"ContainerStarted","Data":"496d93d979f54ca0f058a4b462e45adda124206a2f9fd0a72a86953ac090e8e4"} Oct 09 13:51:16 crc kubenswrapper[4902]: I1009 13:51:16.755918 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jh6wc" event={"ID":"4904c756-7ed4-4719-860f-c6f6458d002c","Type":"ContainerStarted","Data":"c84c0afc9b8d3feb6cc4f449be2123c3de22a1c354c27ada7f8d6989ddff448c"} Oct 09 13:51:16 crc kubenswrapper[4902]: I1009 13:51:16.755927 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jh6wc" event={"ID":"4904c756-7ed4-4719-860f-c6f6458d002c","Type":"ContainerStarted","Data":"e7dceac177fd0f1f23655a025c88a72848f277f404a1ac7ffce5c8dbdc922de1"} Oct 09 13:51:16 crc kubenswrapper[4902]: I1009 13:51:16.756921 4902 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-msbfp" event={"ID":"b84e60b8-b61b-42fb-a7cc-60e335659c2c","Type":"ContainerStarted","Data":"3ba0200d171891c79ebea686f5ceef45dfe1af73db5616f9c1530b699b499865"} Oct 09 13:51:16 crc kubenswrapper[4902]: I1009 13:51:16.756953 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-msbfp" event={"ID":"b84e60b8-b61b-42fb-a7cc-60e335659c2c","Type":"ContainerStarted","Data":"766be2dbb135fc707a5d579bd1319c0f5e871451d2e950260456da509c5b155e"} Oct 09 13:51:17 crc kubenswrapper[4902]: I1009 13:51:17.077015 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 13:51:17 crc kubenswrapper[4902]: E1009 13:51:17.077232 4902 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 13:51:21.077200687 +0000 UTC m=+28.275059751 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 13:51:17 crc kubenswrapper[4902]: I1009 13:51:17.077479 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/76bff3cb-cf9e-42cc-8b73-846ad6b38202-metrics-certs\") pod \"network-metrics-daemon-5tnbn\" (UID: \"76bff3cb-cf9e-42cc-8b73-846ad6b38202\") " pod="openshift-multus/network-metrics-daemon-5tnbn" Oct 09 13:51:17 crc kubenswrapper[4902]: I1009 13:51:17.077522 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 13:51:17 crc kubenswrapper[4902]: I1009 13:51:17.077551 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 13:51:17 crc kubenswrapper[4902]: E1009 13:51:17.077647 4902 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 09 13:51:17 crc kubenswrapper[4902]: E1009 13:51:17.077696 4902 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-09 13:51:21.077688202 +0000 UTC m=+28.275547256 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 09 13:51:17 crc kubenswrapper[4902]: E1009 13:51:17.077690 4902 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 09 13:51:17 crc kubenswrapper[4902]: E1009 13:51:17.077712 4902 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 09 13:51:17 crc kubenswrapper[4902]: E1009 13:51:17.077802 4902 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/76bff3cb-cf9e-42cc-8b73-846ad6b38202-metrics-certs podName:76bff3cb-cf9e-42cc-8b73-846ad6b38202 nodeName:}" failed. No retries permitted until 2025-10-09 13:51:18.077783935 +0000 UTC m=+25.275642999 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/76bff3cb-cf9e-42cc-8b73-846ad6b38202-metrics-certs") pod "network-metrics-daemon-5tnbn" (UID: "76bff3cb-cf9e-42cc-8b73-846ad6b38202") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 09 13:51:17 crc kubenswrapper[4902]: E1009 13:51:17.077846 4902 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-09 13:51:21.077822736 +0000 UTC m=+28.275681830 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 09 13:51:17 crc kubenswrapper[4902]: I1009 13:51:17.178985 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 13:51:17 crc kubenswrapper[4902]: I1009 13:51:17.179116 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 13:51:17 crc kubenswrapper[4902]: E1009 13:51:17.179193 4902 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 09 13:51:17 crc kubenswrapper[4902]: E1009 13:51:17.179234 4902 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 09 13:51:17 crc kubenswrapper[4902]: E1009 13:51:17.179248 4902 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 09 13:51:17 crc kubenswrapper[4902]: E1009 13:51:17.179282 4902 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 09 13:51:17 crc kubenswrapper[4902]: E1009 13:51:17.179309 4902 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-09 13:51:21.179289889 +0000 UTC m=+28.377149013 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 09 13:51:17 crc kubenswrapper[4902]: E1009 13:51:17.179317 4902 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 09 13:51:17 crc kubenswrapper[4902]: E1009 13:51:17.179347 4902 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 09 13:51:17 crc kubenswrapper[4902]: E1009 13:51:17.179464 4902 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-09 13:51:21.179402282 +0000 UTC m=+28.377261386 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 09 13:51:17 crc kubenswrapper[4902]: I1009 13:51:17.514655 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 13:51:17 crc kubenswrapper[4902]: I1009 13:51:17.514752 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 13:51:17 crc kubenswrapper[4902]: E1009 13:51:17.514796 4902 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 13:51:17 crc kubenswrapper[4902]: E1009 13:51:17.514876 4902 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 13:51:17 crc kubenswrapper[4902]: I1009 13:51:17.514936 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 13:51:17 crc kubenswrapper[4902]: E1009 13:51:17.514975 4902 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 13:51:17 crc kubenswrapper[4902]: I1009 13:51:17.763403 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jh6wc" event={"ID":"4904c756-7ed4-4719-860f-c6f6458d002c","Type":"ContainerStarted","Data":"05742a0a14e857f7e8a36d15f52011fc9a003130fec36d8aa5fff760e195baef"} Oct 09 13:51:17 crc kubenswrapper[4902]: I1009 13:51:17.763470 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jh6wc" event={"ID":"4904c756-7ed4-4719-860f-c6f6458d002c","Type":"ContainerStarted","Data":"331340a1821401cb77b1cb646d09fbe6e4a533e124150119def24d4b49b43548"} Oct 09 13:51:17 crc kubenswrapper[4902]: I1009 13:51:17.764781 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-msbfp" event={"ID":"b84e60b8-b61b-42fb-a7cc-60e335659c2c","Type":"ContainerStarted","Data":"28e782fc960f5cdcb88fcde184da8c057068015e713121fc0b0e968cb15338fc"} Oct 09 13:51:17 crc kubenswrapper[4902]: I1009 13:51:17.767825 4902 generic.go:334] "Generic (PLEG): container finished" podID="795e40f3-9c75-4b61-9d94-ca0818875b9f" containerID="c53345644f10f9d1b2609d1b6efd0a4c59cbe5d9464198b2515c6024077d35cb" exitCode=0 Oct 09 13:51:17 crc kubenswrapper[4902]: I1009 13:51:17.767910 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4wnpl" event={"ID":"795e40f3-9c75-4b61-9d94-ca0818875b9f","Type":"ContainerDied","Data":"c53345644f10f9d1b2609d1b6efd0a4c59cbe5d9464198b2515c6024077d35cb"} Oct 09 13:51:17 crc kubenswrapper[4902]: I1009 13:51:17.806641 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-msbfp" podStartSLOduration=3.80661366 podStartE2EDuration="3.80661366s" podCreationTimestamp="2025-10-09 13:51:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 13:51:17.782684688 +0000 UTC m=+24.980543772" watchObservedRunningTime="2025-10-09 13:51:17.80661366 +0000 UTC m=+25.004472734" Oct 09 13:51:18 crc kubenswrapper[4902]: I1009 13:51:18.089387 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/76bff3cb-cf9e-42cc-8b73-846ad6b38202-metrics-certs\") pod \"network-metrics-daemon-5tnbn\" (UID: \"76bff3cb-cf9e-42cc-8b73-846ad6b38202\") " pod="openshift-multus/network-metrics-daemon-5tnbn" Oct 09 13:51:18 crc kubenswrapper[4902]: E1009 13:51:18.089580 4902 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 09 13:51:18 crc kubenswrapper[4902]: E1009 13:51:18.089888 4902 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/76bff3cb-cf9e-42cc-8b73-846ad6b38202-metrics-certs 
podName:76bff3cb-cf9e-42cc-8b73-846ad6b38202 nodeName:}" failed. No retries permitted until 2025-10-09 13:51:20.089870549 +0000 UTC m=+27.287729623 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/76bff3cb-cf9e-42cc-8b73-846ad6b38202-metrics-certs") pod "network-metrics-daemon-5tnbn" (UID: "76bff3cb-cf9e-42cc-8b73-846ad6b38202") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 09 13:51:18 crc kubenswrapper[4902]: I1009 13:51:18.512723 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5tnbn" Oct 09 13:51:18 crc kubenswrapper[4902]: E1009 13:51:18.512859 4902 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5tnbn" podUID="76bff3cb-cf9e-42cc-8b73-846ad6b38202" Oct 09 13:51:18 crc kubenswrapper[4902]: I1009 13:51:18.774133 4902 generic.go:334] "Generic (PLEG): container finished" podID="795e40f3-9c75-4b61-9d94-ca0818875b9f" containerID="7f6b10fc2929dd7bf8a1202c992607025901b7a352b63d9fda8b776260bf9538" exitCode=0 Oct 09 13:51:18 crc kubenswrapper[4902]: I1009 13:51:18.774223 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4wnpl" event={"ID":"795e40f3-9c75-4b61-9d94-ca0818875b9f","Type":"ContainerDied","Data":"7f6b10fc2929dd7bf8a1202c992607025901b7a352b63d9fda8b776260bf9538"} Oct 09 13:51:19 crc kubenswrapper[4902]: I1009 13:51:19.065009 4902 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 13:51:19 crc kubenswrapper[4902]: I1009 13:51:19.067074 4902 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 13:51:19 crc kubenswrapper[4902]: I1009 13:51:19.067117 4902 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 13:51:19 crc kubenswrapper[4902]: I1009 13:51:19.067128 4902 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 13:51:19 crc kubenswrapper[4902]: I1009 13:51:19.067237 4902 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 09 13:51:19 crc kubenswrapper[4902]: I1009 13:51:19.074598 4902 kubelet_node_status.go:115] "Node was previously registered" node="crc" Oct 09 13:51:19 crc kubenswrapper[4902]: I1009 13:51:19.074973 4902 kubelet_node_status.go:79] "Successfully registered node" node="crc" Oct 09 13:51:19 crc kubenswrapper[4902]: I1009 13:51:19.076447 4902 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 13:51:19 crc kubenswrapper[4902]: I1009 13:51:19.076485 4902 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 13:51:19 crc kubenswrapper[4902]: I1009 13:51:19.076495 4902 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 13:51:19 crc kubenswrapper[4902]: I1009 13:51:19.076510 4902 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 13:51:19 crc kubenswrapper[4902]: I1009 13:51:19.076520 4902 
setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T13:51:19Z","lastTransitionTime":"2025-10-09T13:51:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 13:51:19 crc kubenswrapper[4902]: I1009 13:51:19.122454 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-xgmt5"] Oct 09 13:51:19 crc kubenswrapper[4902]: I1009 13:51:19.122931 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xgmt5" Oct 09 13:51:19 crc kubenswrapper[4902]: I1009 13:51:19.124400 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Oct 09 13:51:19 crc kubenswrapper[4902]: I1009 13:51:19.124675 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Oct 09 13:51:19 crc kubenswrapper[4902]: I1009 13:51:19.124727 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Oct 09 13:51:19 crc kubenswrapper[4902]: I1009 13:51:19.125741 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Oct 09 13:51:19 crc kubenswrapper[4902]: I1009 13:51:19.205946 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/44395e4d-0b50-4bda-afc8-7de78ecddcd8-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-xgmt5\" (UID: \"44395e4d-0b50-4bda-afc8-7de78ecddcd8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xgmt5" Oct 09 13:51:19 crc kubenswrapper[4902]: I1009 13:51:19.206016 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/44395e4d-0b50-4bda-afc8-7de78ecddcd8-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-xgmt5\" (UID: \"44395e4d-0b50-4bda-afc8-7de78ecddcd8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xgmt5" Oct 09 13:51:19 crc kubenswrapper[4902]: I1009 13:51:19.206035 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/44395e4d-0b50-4bda-afc8-7de78ecddcd8-service-ca\") pod \"cluster-version-operator-5c965bbfc6-xgmt5\" (UID: \"44395e4d-0b50-4bda-afc8-7de78ecddcd8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xgmt5" Oct 09 13:51:19 crc kubenswrapper[4902]: I1009 13:51:19.206050 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/44395e4d-0b50-4bda-afc8-7de78ecddcd8-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-xgmt5\" (UID: \"44395e4d-0b50-4bda-afc8-7de78ecddcd8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xgmt5" Oct 09 13:51:19 crc kubenswrapper[4902]: I1009 13:51:19.206081 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/44395e4d-0b50-4bda-afc8-7de78ecddcd8-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-xgmt5\" (UID: \"44395e4d-0b50-4bda-afc8-7de78ecddcd8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xgmt5" Oct 09 13:51:19 crc kubenswrapper[4902]: I1009 13:51:19.307390 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/44395e4d-0b50-4bda-afc8-7de78ecddcd8-service-ca\") pod \"cluster-version-operator-5c965bbfc6-xgmt5\" (UID: \"44395e4d-0b50-4bda-afc8-7de78ecddcd8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xgmt5" Oct 09 13:51:19 crc kubenswrapper[4902]: I1009 13:51:19.307455 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/44395e4d-0b50-4bda-afc8-7de78ecddcd8-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-xgmt5\" (UID: \"44395e4d-0b50-4bda-afc8-7de78ecddcd8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xgmt5" Oct 09 13:51:19 crc kubenswrapper[4902]: I1009 13:51:19.307490 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/44395e4d-0b50-4bda-afc8-7de78ecddcd8-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-xgmt5\" (UID: \"44395e4d-0b50-4bda-afc8-7de78ecddcd8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xgmt5" Oct 09 13:51:19 crc kubenswrapper[4902]: I1009 13:51:19.307524 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/44395e4d-0b50-4bda-afc8-7de78ecddcd8-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-xgmt5\" (UID: \"44395e4d-0b50-4bda-afc8-7de78ecddcd8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xgmt5" Oct 09 13:51:19 crc kubenswrapper[4902]: I1009 13:51:19.307590 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/44395e4d-0b50-4bda-afc8-7de78ecddcd8-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-xgmt5\" (UID: \"44395e4d-0b50-4bda-afc8-7de78ecddcd8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xgmt5" Oct 09 13:51:19 crc kubenswrapper[4902]: I1009 13:51:19.307618 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/44395e4d-0b50-4bda-afc8-7de78ecddcd8-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-xgmt5\" (UID: \"44395e4d-0b50-4bda-afc8-7de78ecddcd8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xgmt5" Oct 09 13:51:19 crc kubenswrapper[4902]: I1009 13:51:19.307630 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/44395e4d-0b50-4bda-afc8-7de78ecddcd8-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-xgmt5\" (UID: \"44395e4d-0b50-4bda-afc8-7de78ecddcd8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xgmt5" Oct 09 13:51:19 crc kubenswrapper[4902]: I1009 13:51:19.308300 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/44395e4d-0b50-4bda-afc8-7de78ecddcd8-service-ca\") pod \"cluster-version-operator-5c965bbfc6-xgmt5\" (UID: \"44395e4d-0b50-4bda-afc8-7de78ecddcd8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xgmt5" Oct 09 13:51:19 crc kubenswrapper[4902]: I1009 13:51:19.319495 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/44395e4d-0b50-4bda-afc8-7de78ecddcd8-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-xgmt5\" (UID: \"44395e4d-0b50-4bda-afc8-7de78ecddcd8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xgmt5" Oct 09 13:51:19 crc kubenswrapper[4902]: I1009 13:51:19.324212 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/44395e4d-0b50-4bda-afc8-7de78ecddcd8-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-xgmt5\" (UID: \"44395e4d-0b50-4bda-afc8-7de78ecddcd8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xgmt5" Oct 09 13:51:19 crc kubenswrapper[4902]: I1009 13:51:19.435756 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xgmt5" Oct 09 13:51:19 crc kubenswrapper[4902]: W1009 13:51:19.448301 4902 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod44395e4d_0b50_4bda_afc8_7de78ecddcd8.slice/crio-d9521eff41446c8dff41a5625110aa6a5e78b75c54d4c9e39847f7075f53aeae WatchSource:0}: Error finding container d9521eff41446c8dff41a5625110aa6a5e78b75c54d4c9e39847f7075f53aeae: Status 404 returned error can't find the container with id d9521eff41446c8dff41a5625110aa6a5e78b75c54d4c9e39847f7075f53aeae Oct 09 13:51:19 crc kubenswrapper[4902]: I1009 13:51:19.512152 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 13:51:19 crc kubenswrapper[4902]: I1009 13:51:19.512190 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 13:51:19 crc kubenswrapper[4902]: I1009 13:51:19.512231 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 13:51:19 crc kubenswrapper[4902]: E1009 13:51:19.512931 4902 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 13:51:19 crc kubenswrapper[4902]: E1009 13:51:19.513019 4902 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 13:51:19 crc kubenswrapper[4902]: E1009 13:51:19.513131 4902 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 13:51:19 crc kubenswrapper[4902]: I1009 13:51:19.780450 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4wnpl" event={"ID":"795e40f3-9c75-4b61-9d94-ca0818875b9f","Type":"ContainerDied","Data":"3aad02ff1803406262beeca79b69690f31edb0b935eef21ad18a0a606e819630"} Oct 09 13:51:19 crc kubenswrapper[4902]: I1009 13:51:19.780451 4902 generic.go:334] "Generic (PLEG): container finished" podID="795e40f3-9c75-4b61-9d94-ca0818875b9f" containerID="3aad02ff1803406262beeca79b69690f31edb0b935eef21ad18a0a606e819630" exitCode=0 Oct 09 13:51:19 crc kubenswrapper[4902]: I1009 13:51:19.784835 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xgmt5" event={"ID":"44395e4d-0b50-4bda-afc8-7de78ecddcd8","Type":"ContainerStarted","Data":"80b04e58d954233149e08abe1271dc4b10f4d6ad8a44fe2d0705a1bcb432e79d"} Oct 09 13:51:19 crc kubenswrapper[4902]: I1009 13:51:19.785147 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xgmt5" event={"ID":"44395e4d-0b50-4bda-afc8-7de78ecddcd8","Type":"ContainerStarted","Data":"d9521eff41446c8dff41a5625110aa6a5e78b75c54d4c9e39847f7075f53aeae"} Oct 09 13:51:19 crc kubenswrapper[4902]: I1009 13:51:19.795490 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jh6wc" event={"ID":"4904c756-7ed4-4719-860f-c6f6458d002c","Type":"ContainerStarted","Data":"d3246e7d90602adff59c9ed6d6697c6dcc044b4d6f83f510f3b7271aa3ef4f18"} Oct 09 13:51:19 crc kubenswrapper[4902]: I1009 13:51:19.832740 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xgmt5" podStartSLOduration=6.832720254 podStartE2EDuration="6.832720254s" podCreationTimestamp="2025-10-09 13:51:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 13:51:19.832480537 +0000 UTC m=+27.030339621" watchObservedRunningTime="2025-10-09 13:51:19.832720254 +0000 UTC m=+27.030579318" Oct 09 13:51:20 crc kubenswrapper[4902]: I1009 13:51:20.117318 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/76bff3cb-cf9e-42cc-8b73-846ad6b38202-metrics-certs\") pod \"network-metrics-daemon-5tnbn\" (UID: \"76bff3cb-cf9e-42cc-8b73-846ad6b38202\") " pod="openshift-multus/network-metrics-daemon-5tnbn" Oct 09 13:51:20 crc kubenswrapper[4902]: E1009 13:51:20.117571 4902 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 09 13:51:20 crc kubenswrapper[4902]: E1009 13:51:20.117654 4902 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/76bff3cb-cf9e-42cc-8b73-846ad6b38202-metrics-certs podName:76bff3cb-cf9e-42cc-8b73-846ad6b38202 nodeName:}" failed. No retries permitted until 2025-10-09 13:51:24.117625082 +0000 UTC m=+31.315484146 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/76bff3cb-cf9e-42cc-8b73-846ad6b38202-metrics-certs") pod "network-metrics-daemon-5tnbn" (UID: "76bff3cb-cf9e-42cc-8b73-846ad6b38202") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 09 13:51:20 crc kubenswrapper[4902]: I1009 13:51:20.512639 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5tnbn" Oct 09 13:51:20 crc kubenswrapper[4902]: E1009 13:51:20.512854 4902 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5tnbn" podUID="76bff3cb-cf9e-42cc-8b73-846ad6b38202" Oct 09 13:51:20 crc kubenswrapper[4902]: I1009 13:51:20.736888 4902 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 09 13:51:20 crc kubenswrapper[4902]: I1009 13:51:20.737736 4902 scope.go:117] "RemoveContainer" containerID="801afd8fea3db4a9b2864ac031b128c77ffb050da8e948d01378e85a22f075a0" Oct 09 13:51:20 crc kubenswrapper[4902]: E1009 13:51:20.737939 4902 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Oct 09 13:51:20 crc kubenswrapper[4902]: I1009 13:51:20.804017 4902 generic.go:334] "Generic (PLEG): container finished" podID="795e40f3-9c75-4b61-9d94-ca0818875b9f" containerID="9050969e4d5c199c3b4fd9c1e7307a355dba6d9dda93604c3bed90b018179098" exitCode=0 Oct 09 13:51:20 crc kubenswrapper[4902]: I1009 13:51:20.804096 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4wnpl" event={"ID":"795e40f3-9c75-4b61-9d94-ca0818875b9f","Type":"ContainerDied","Data":"9050969e4d5c199c3b4fd9c1e7307a355dba6d9dda93604c3bed90b018179098"} Oct 09 13:51:21 crc kubenswrapper[4902]: I1009 13:51:21.128949 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 13:51:21 crc kubenswrapper[4902]: E1009 13:51:21.129245 4902 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 13:51:29.129200732 +0000 UTC m=+36.327059796 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 13:51:21 crc kubenswrapper[4902]: I1009 13:51:21.129807 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 13:51:21 crc kubenswrapper[4902]: I1009 13:51:21.129849 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 13:51:21 crc kubenswrapper[4902]: E1009 13:51:21.129972 4902 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 09 13:51:21 crc kubenswrapper[4902]: E1009 13:51:21.130008 4902 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 09 13:51:21 crc kubenswrapper[4902]: E1009 13:51:21.130065 4902 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-09 13:51:29.130041556 +0000 UTC m=+36.327900620 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 09 13:51:21 crc kubenswrapper[4902]: E1009 13:51:21.130093 4902 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-09 13:51:29.130081477 +0000 UTC m=+36.327940921 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 09 13:51:21 crc kubenswrapper[4902]: I1009 13:51:21.230696 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 13:51:21 crc kubenswrapper[4902]: I1009 13:51:21.230753 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 13:51:21 crc kubenswrapper[4902]: E1009 13:51:21.230893 4902 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 09 13:51:21 crc kubenswrapper[4902]: E1009 13:51:21.230909 4902 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 09 13:51:21 crc kubenswrapper[4902]: E1009 13:51:21.230920 4902 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 09 13:51:21 crc kubenswrapper[4902]: E1009 13:51:21.230964 4902 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-09 13:51:29.230951053 +0000 UTC m=+36.428810117 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 09 13:51:21 crc kubenswrapper[4902]: E1009 13:51:21.231282 4902 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 09 13:51:21 crc kubenswrapper[4902]: E1009 13:51:21.231300 4902 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 09 13:51:21 crc kubenswrapper[4902]: E1009 13:51:21.231308 4902 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 09 13:51:21 crc kubenswrapper[4902]: E1009 13:51:21.231334 4902 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-09 13:51:29.231325064 +0000 UTC m=+36.429184128 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 09 13:51:21 crc kubenswrapper[4902]: I1009 13:51:21.512823 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 13:51:21 crc kubenswrapper[4902]: I1009 13:51:21.512873 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 13:51:21 crc kubenswrapper[4902]: E1009 13:51:21.512979 4902 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 13:51:21 crc kubenswrapper[4902]: I1009 13:51:21.513099 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 13:51:21 crc kubenswrapper[4902]: E1009 13:51:21.513292 4902 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 13:51:21 crc kubenswrapper[4902]: E1009 13:51:21.513380 4902 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 13:51:21 crc kubenswrapper[4902]: I1009 13:51:21.821140 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jh6wc" event={"ID":"4904c756-7ed4-4719-860f-c6f6458d002c","Type":"ContainerStarted","Data":"ce98c51532e074824b12143ed31d35add4f88838720d8f3c29c10fb3ab78623e"} Oct 09 13:51:21 crc kubenswrapper[4902]: I1009 13:51:21.821310 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-jh6wc" Oct 09 13:51:21 crc kubenswrapper[4902]: I1009 13:51:21.824528 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4wnpl" event={"ID":"795e40f3-9c75-4b61-9d94-ca0818875b9f","Type":"ContainerStarted","Data":"841baaf09bd99e6162ad7b23439df0b1b0e4fe899c89b80a9d00362757dd4878"} Oct 09 13:51:21 crc kubenswrapper[4902]: I1009 13:51:21.847260 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-jh6wc" Oct 09 13:51:21 crc kubenswrapper[4902]: I1009 13:51:21.857796 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-jh6wc" podStartSLOduration=8.857775729 podStartE2EDuration="8.857775729s" podCreationTimestamp="2025-10-09 13:51:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 13:51:21.856940655 +0000 UTC m=+29.054799759" watchObservedRunningTime="2025-10-09 13:51:21.857775729 +0000 UTC m=+29.055634813" Oct 09 13:51:21 crc kubenswrapper[4902]: I1009 13:51:21.878652 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-4wnpl" podStartSLOduration=8.87863234 podStartE2EDuration="8.87863234s" podCreationTimestamp="2025-10-09 13:51:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 13:51:21.87829524 +0000 UTC m=+29.076154314" watchObservedRunningTime="2025-10-09 13:51:21.87863234 +0000 UTC m=+29.076491414" Oct 09 13:51:22 crc kubenswrapper[4902]: I1009 13:51:22.512986 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5tnbn" Oct 09 13:51:22 crc kubenswrapper[4902]: E1009 13:51:22.513234 4902 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-5tnbn" podUID="76bff3cb-cf9e-42cc-8b73-846ad6b38202" Oct 09 13:51:22 crc kubenswrapper[4902]: I1009 13:51:22.827695 4902 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 09 13:51:22 crc kubenswrapper[4902]: I1009 13:51:22.828319 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-jh6wc" Oct 09 13:51:22 crc kubenswrapper[4902]: I1009 13:51:22.858477 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-jh6wc" Oct 09 13:51:23 crc kubenswrapper[4902]: I1009 13:51:23.512265 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 13:51:23 crc kubenswrapper[4902]: I1009 13:51:23.512293 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 13:51:23 crc kubenswrapper[4902]: I1009 13:51:23.512320 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 13:51:23 crc kubenswrapper[4902]: E1009 13:51:23.513478 4902 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 13:51:23 crc kubenswrapper[4902]: E1009 13:51:23.513974 4902 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 13:51:23 crc kubenswrapper[4902]: E1009 13:51:23.514045 4902 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 13:51:23 crc kubenswrapper[4902]: I1009 13:51:23.833254 4902 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 09 13:51:23 crc kubenswrapper[4902]: I1009 13:51:23.952542 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-5tnbn"] Oct 09 13:51:23 crc kubenswrapper[4902]: I1009 13:51:23.952676 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5tnbn" Oct 09 13:51:23 crc kubenswrapper[4902]: E1009 13:51:23.952762 4902 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-5tnbn" podUID="76bff3cb-cf9e-42cc-8b73-846ad6b38202" Oct 09 13:51:24 crc kubenswrapper[4902]: I1009 13:51:24.167940 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/76bff3cb-cf9e-42cc-8b73-846ad6b38202-metrics-certs\") pod \"network-metrics-daemon-5tnbn\" (UID: \"76bff3cb-cf9e-42cc-8b73-846ad6b38202\") " pod="openshift-multus/network-metrics-daemon-5tnbn" Oct 09 13:51:24 crc kubenswrapper[4902]: E1009 13:51:24.168154 4902 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 09 13:51:24 crc kubenswrapper[4902]: E1009 13:51:24.168243 4902 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/76bff3cb-cf9e-42cc-8b73-846ad6b38202-metrics-certs podName:76bff3cb-cf9e-42cc-8b73-846ad6b38202 nodeName:}" failed. No retries permitted until 2025-10-09 13:51:32.168223795 +0000 UTC m=+39.366082859 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/76bff3cb-cf9e-42cc-8b73-846ad6b38202-metrics-certs") pod "network-metrics-daemon-5tnbn" (UID: "76bff3cb-cf9e-42cc-8b73-846ad6b38202") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 09 13:51:24 crc kubenswrapper[4902]: I1009 13:51:24.836491 4902 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 09 13:51:25 crc kubenswrapper[4902]: I1009 13:51:25.512860 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 13:51:25 crc kubenswrapper[4902]: I1009 13:51:25.512948 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5tnbn" Oct 09 13:51:25 crc kubenswrapper[4902]: I1009 13:51:25.512948 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 13:51:25 crc kubenswrapper[4902]: E1009 13:51:25.513067 4902 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 13:51:25 crc kubenswrapper[4902]: I1009 13:51:25.513215 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 13:51:25 crc kubenswrapper[4902]: E1009 13:51:25.513202 4902 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 13:51:25 crc kubenswrapper[4902]: E1009 13:51:25.513314 4902 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5tnbn" podUID="76bff3cb-cf9e-42cc-8b73-846ad6b38202" Oct 09 13:51:25 crc kubenswrapper[4902]: E1009 13:51:25.513478 4902 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.281435 4902 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.281665 4902 kubelet_node_status.go:538] "Fast updating node status as it just became ready" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.334172 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-rq4g9"] Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.334948 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8l7cj"] Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.335274 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8l7cj" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.335782 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rq4g9" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.335926 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-jw7l8"] Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.336527 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-jw7l8" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.339564 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-ttk5q"] Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.340220 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ttk5q" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.346170 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-lm5vg"] Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.346937 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-lm5vg" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.347924 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-ltv8d"] Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.348259 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-ltv8d" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.349366 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-bszj2"] Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.350339 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-bszj2" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.350852 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-vdsh5"] Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.351300 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-vdsh5" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.352110 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-4cxxv"] Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.352623 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4cxxv" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.353149 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4ffxb"] Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.353856 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-bhzpg"] Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.354269 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-bhzpg" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.354751 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4ffxb" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.356263 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-d5zks"] Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.356829 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-mzkkz"] Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.357353 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-nhhpn"] Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.357938 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-d5zks" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.358382 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-nhhpn" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.358685 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-mzkkz" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.359468 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-6rzkc"] Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.367002 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-6rzkc" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.367852 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.369698 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.372800 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.373190 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.393708 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.394006 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.394554 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.394693 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.394783 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.395150 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.395304 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.395372 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.395483 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.395481 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.395604 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.395654 4902 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.395691 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.395766 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.395791 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.395833 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.395859 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.395935 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.396060 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.396086 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.396108 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.396229 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.396266 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.396285 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.396386 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.396393 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.396464 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.396499 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.396514 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.395536 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.396560 4902 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.396472 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.396606 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.396650 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.396687 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.396700 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.396777 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.396788 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.396505 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.396813 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.396905 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.396926 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.396981 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.397033 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.397066 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.397081 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.397360 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.397368 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.397399 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.397434 4902 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.397370 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.397499 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.397580 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.397583 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.397691 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.397701 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.397711 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.397658 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.397827 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.397850 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.397871 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.397870 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.397931 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.397995 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.398322 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.398410 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.398462 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.398574 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.398605 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" 
Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.398611 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.400619 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.400804 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.400935 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.401084 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.401208 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.402715 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.402895 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-q76zz"] Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.408913 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-q76zz" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.409769 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.410007 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.410217 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.410258 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.410032 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.410268 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.410537 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.410628 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.412125 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.412274 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Oct 09 13:51:27 crc 
kubenswrapper[4902]: I1009 13:51:27.413298 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.413403 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-n2nl8"] Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.437035 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.437570 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.441614 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/58b6a424-7606-420a-802d-1886adaa3e3d-audit-policies\") pod \"apiserver-7bbb656c7d-4cxxv\" (UID: \"58b6a424-7606-420a-802d-1886adaa3e3d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4cxxv" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.441666 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b43c1099-b997-4be7-8390-a379e0dc5541-client-ca\") pod \"controller-manager-879f6c89f-6rzkc\" (UID: \"b43c1099-b997-4be7-8390-a379e0dc5541\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6rzkc" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.441697 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/e714667b-061d-4127-8dd3-47e403ebe079-node-pullsecrets\") pod \"apiserver-76f77b778f-mzkkz\" (UID: \"e714667b-061d-4127-8dd3-47e403ebe079\") " pod="openshift-apiserver/apiserver-76f77b778f-mzkkz" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.441721 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b43c1099-b997-4be7-8390-a379e0dc5541-serving-cert\") pod \"controller-manager-879f6c89f-6rzkc\" (UID: \"b43c1099-b997-4be7-8390-a379e0dc5541\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6rzkc" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.441744 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/58b6a424-7606-420a-802d-1886adaa3e3d-audit-dir\") pod \"apiserver-7bbb656c7d-4cxxv\" (UID: \"58b6a424-7606-420a-802d-1886adaa3e3d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4cxxv" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.441766 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/e714667b-061d-4127-8dd3-47e403ebe079-image-import-ca\") pod \"apiserver-76f77b778f-mzkkz\" (UID: \"e714667b-061d-4127-8dd3-47e403ebe079\") " pod="openshift-apiserver/apiserver-76f77b778f-mzkkz" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.441792 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0387a8b9-8804-48b5-8503-1734f2a15b45-trusted-ca\") pod \"console-operator-58897d9998-ltv8d\" (UID: 
\"0387a8b9-8804-48b5-8503-1734f2a15b45\") " pod="openshift-console-operator/console-operator-58897d9998-ltv8d" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.441828 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/2425c7bf-9eeb-4255-bfd0-1d8ef07d835b-available-featuregates\") pod \"openshift-config-operator-7777fb866f-nhhpn\" (UID: \"2425c7bf-9eeb-4255-bfd0-1d8ef07d835b\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-nhhpn" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.441854 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/28116576-5069-4dd6-90f1-31582eda88df-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-jw7l8\" (UID: \"28116576-5069-4dd6-90f1-31582eda88df\") " pod="openshift-authentication/oauth-openshift-558db77b4-jw7l8" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.441874 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zt6m8\" (UniqueName: \"kubernetes.io/projected/dfee5c28-dc5e-4f85-87d2-29925eeff49d-kube-api-access-zt6m8\") pod \"machine-approver-56656f9798-rq4g9\" (UID: \"dfee5c28-dc5e-4f85-87d2-29925eeff49d\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rq4g9" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.441906 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fzvpl\" (UniqueName: \"kubernetes.io/projected/ff2ea387-8171-4259-ac56-f864a65f105f-kube-api-access-fzvpl\") pod \"openshift-apiserver-operator-796bbdcf4f-8l7cj\" (UID: \"ff2ea387-8171-4259-ac56-f864a65f105f\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8l7cj" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.441953 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/51ad1076-0ca9-4765-bd88-98f4cba434b6-trusted-ca-bundle\") pod \"console-f9d7485db-d5zks\" (UID: \"51ad1076-0ca9-4765-bd88-98f4cba434b6\") " pod="openshift-console/console-f9d7485db-d5zks" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.441972 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7x8pn\" (UniqueName: \"kubernetes.io/projected/0387a8b9-8804-48b5-8503-1734f2a15b45-kube-api-access-7x8pn\") pod \"console-operator-58897d9998-ltv8d\" (UID: \"0387a8b9-8804-48b5-8503-1734f2a15b45\") " pod="openshift-console-operator/console-operator-58897d9998-ltv8d" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.441994 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2425c7bf-9eeb-4255-bfd0-1d8ef07d835b-serving-cert\") pod \"openshift-config-operator-7777fb866f-nhhpn\" (UID: \"2425c7bf-9eeb-4255-bfd0-1d8ef07d835b\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-nhhpn" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.442015 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/51ad1076-0ca9-4765-bd88-98f4cba434b6-console-oauth-config\") pod \"console-f9d7485db-d5zks\" (UID: \"51ad1076-0ca9-4765-bd88-98f4cba434b6\") " pod="openshift-console/console-f9d7485db-d5zks" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.442038 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mh25q\" (UniqueName: \"kubernetes.io/projected/327b6d28-9130-4476-b8f2-edaf08da45ae-kube-api-access-mh25q\") pod \"downloads-7954f5f757-bhzpg\" (UID: \"327b6d28-9130-4476-b8f2-edaf08da45ae\") " pod="openshift-console/downloads-7954f5f757-bhzpg" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.442060 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f99a811-543c-4b99-a394-9d941401efff-config\") pod \"machine-api-operator-5694c8668f-bszj2\" (UID: \"6f99a811-543c-4b99-a394-9d941401efff\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-bszj2" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.442131 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d5srr\" (UniqueName: \"kubernetes.io/projected/58b6a424-7606-420a-802d-1886adaa3e3d-kube-api-access-d5srr\") pod \"apiserver-7bbb656c7d-4cxxv\" (UID: \"58b6a424-7606-420a-802d-1886adaa3e3d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4cxxv" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.442187 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/85e39402-3faf-4b34-a252-e4db0ac90909-config\") pod \"authentication-operator-69f744f599-lm5vg\" (UID: \"85e39402-3faf-4b34-a252-e4db0ac90909\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-lm5vg" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.442208 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-47w44\" (UniqueName: \"kubernetes.io/projected/b7308dad-19a5-4675-9874-ee0a814d8aed-kube-api-access-47w44\") pod \"cluster-samples-operator-665b6dd947-4ffxb\" (UID: \"b7308dad-19a5-4675-9874-ee0a814d8aed\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4ffxb" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.442243 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fd1d9312-7008-48ff-9437-af995ef9b88d-config\") pod \"route-controller-manager-6576b87f9c-ttk5q\" (UID: \"fd1d9312-7008-48ff-9437-af995ef9b88d\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ttk5q" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.442366 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/28116576-5069-4dd6-90f1-31582eda88df-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-jw7l8\" (UID: \"28116576-5069-4dd6-90f1-31582eda88df\") " pod="openshift-authentication/oauth-openshift-558db77b4-jw7l8" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.454297 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/0387a8b9-8804-48b5-8503-1734f2a15b45-config\") pod \"console-operator-58897d9998-ltv8d\" (UID: \"0387a8b9-8804-48b5-8503-1734f2a15b45\") " pod="openshift-console-operator/console-operator-58897d9998-ltv8d" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.456069 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/e714667b-061d-4127-8dd3-47e403ebe079-etcd-serving-ca\") pod \"apiserver-76f77b778f-mzkkz\" (UID: \"e714667b-061d-4127-8dd3-47e403ebe079\") " pod="openshift-apiserver/apiserver-76f77b778f-mzkkz" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.456163 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b43c1099-b997-4be7-8390-a379e0dc5541-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-6rzkc\" (UID: \"b43c1099-b997-4be7-8390-a379e0dc5541\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6rzkc" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.456269 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ff2ea387-8171-4259-ac56-f864a65f105f-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-8l7cj\" (UID: \"ff2ea387-8171-4259-ac56-f864a65f105f\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8l7cj" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.456287 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-xd6rl"] Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.456300 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8nwq5\" (UniqueName: \"kubernetes.io/projected/28116576-5069-4dd6-90f1-31582eda88df-kube-api-access-8nwq5\") pod \"oauth-openshift-558db77b4-jw7l8\" (UID: \"28116576-5069-4dd6-90f1-31582eda88df\") " pod="openshift-authentication/oauth-openshift-558db77b4-jw7l8" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.456337 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/51ad1076-0ca9-4765-bd88-98f4cba434b6-oauth-serving-cert\") pod \"console-f9d7485db-d5zks\" (UID: \"51ad1076-0ca9-4765-bd88-98f4cba434b6\") " pod="openshift-console/console-f9d7485db-d5zks" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.456384 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/28116576-5069-4dd6-90f1-31582eda88df-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-jw7l8\" (UID: \"28116576-5069-4dd6-90f1-31582eda88df\") " pod="openshift-authentication/oauth-openshift-558db77b4-jw7l8" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.456413 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/28116576-5069-4dd6-90f1-31582eda88df-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-jw7l8\" (UID: \"28116576-5069-4dd6-90f1-31582eda88df\") " pod="openshift-authentication/oauth-openshift-558db77b4-jw7l8" Oct 09 13:51:27 crc 
kubenswrapper[4902]: I1009 13:51:27.456467 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e714667b-061d-4127-8dd3-47e403ebe079-trusted-ca-bundle\") pod \"apiserver-76f77b778f-mzkkz\" (UID: \"e714667b-061d-4127-8dd3-47e403ebe079\") " pod="openshift-apiserver/apiserver-76f77b778f-mzkkz" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.456499 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/58b6a424-7606-420a-802d-1886adaa3e3d-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-4cxxv\" (UID: \"58b6a424-7606-420a-802d-1886adaa3e3d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4cxxv" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.456544 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/51ad1076-0ca9-4765-bd88-98f4cba434b6-service-ca\") pod \"console-f9d7485db-d5zks\" (UID: \"51ad1076-0ca9-4765-bd88-98f4cba434b6\") " pod="openshift-console/console-f9d7485db-d5zks" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.456601 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/b7308dad-19a5-4675-9874-ee0a814d8aed-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-4ffxb\" (UID: \"b7308dad-19a5-4675-9874-ee0a814d8aed\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4ffxb" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.456625 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/28116576-5069-4dd6-90f1-31582eda88df-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-jw7l8\" (UID: \"28116576-5069-4dd6-90f1-31582eda88df\") " pod="openshift-authentication/oauth-openshift-558db77b4-jw7l8" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.456676 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/28116576-5069-4dd6-90f1-31582eda88df-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-jw7l8\" (UID: \"28116576-5069-4dd6-90f1-31582eda88df\") " pod="openshift-authentication/oauth-openshift-558db77b4-jw7l8" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.456713 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/58b6a424-7606-420a-802d-1886adaa3e3d-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-4cxxv\" (UID: \"58b6a424-7606-420a-802d-1886adaa3e3d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4cxxv" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.456737 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/51ad1076-0ca9-4765-bd88-98f4cba434b6-console-config\") pod \"console-f9d7485db-d5zks\" (UID: \"51ad1076-0ca9-4765-bd88-98f4cba434b6\") " pod="openshift-console/console-f9d7485db-d5zks" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.456778 4902 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/58b6a424-7606-420a-802d-1886adaa3e3d-serving-cert\") pod \"apiserver-7bbb656c7d-4cxxv\" (UID: \"58b6a424-7606-420a-802d-1886adaa3e3d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4cxxv" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.456803 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/0964a12e-7b75-401b-9547-49e5a924ef0b-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-vdsh5\" (UID: \"0964a12e-7b75-401b-9547-49e5a924ef0b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-vdsh5" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.456862 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0964a12e-7b75-401b-9547-49e5a924ef0b-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-vdsh5\" (UID: \"0964a12e-7b75-401b-9547-49e5a924ef0b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-vdsh5" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.456890 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dfee5c28-dc5e-4f85-87d2-29925eeff49d-config\") pod \"machine-approver-56656f9798-rq4g9\" (UID: \"dfee5c28-dc5e-4f85-87d2-29925eeff49d\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rq4g9" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.456919 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/28116576-5069-4dd6-90f1-31582eda88df-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-jw7l8\" (UID: \"28116576-5069-4dd6-90f1-31582eda88df\") " pod="openshift-authentication/oauth-openshift-558db77b4-jw7l8" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.456953 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6f99a811-543c-4b99-a394-9d941401efff-images\") pod \"machine-api-operator-5694c8668f-bszj2\" (UID: \"6f99a811-543c-4b99-a394-9d941401efff\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-bszj2" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.456964 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-n2nl8" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.456927 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-xd6rl" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.457026 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/51ad1076-0ca9-4765-bd88-98f4cba434b6-console-serving-cert\") pod \"console-f9d7485db-d5zks\" (UID: \"51ad1076-0ca9-4765-bd88-98f4cba434b6\") " pod="openshift-console/console-f9d7485db-d5zks" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.457046 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2pmxj\" (UniqueName: \"kubernetes.io/projected/51ad1076-0ca9-4765-bd88-98f4cba434b6-kube-api-access-2pmxj\") pod \"console-f9d7485db-d5zks\" (UID: \"51ad1076-0ca9-4765-bd88-98f4cba434b6\") " pod="openshift-console/console-f9d7485db-d5zks" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.457070 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b43c1099-b997-4be7-8390-a379e0dc5541-config\") pod \"controller-manager-879f6c89f-6rzkc\" (UID: \"b43c1099-b997-4be7-8390-a379e0dc5541\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6rzkc" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.457093 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/28116576-5069-4dd6-90f1-31582eda88df-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-jw7l8\" (UID: \"28116576-5069-4dd6-90f1-31582eda88df\") " pod="openshift-authentication/oauth-openshift-558db77b4-jw7l8" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.457117 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fd1d9312-7008-48ff-9437-af995ef9b88d-serving-cert\") pod \"route-controller-manager-6576b87f9c-ttk5q\" (UID: \"fd1d9312-7008-48ff-9437-af995ef9b88d\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ttk5q" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.457141 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/28116576-5069-4dd6-90f1-31582eda88df-audit-dir\") pod \"oauth-openshift-558db77b4-jw7l8\" (UID: \"28116576-5069-4dd6-90f1-31582eda88df\") " pod="openshift-authentication/oauth-openshift-558db77b4-jw7l8" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.457163 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6f99a811-543c-4b99-a394-9d941401efff-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-bszj2\" (UID: \"6f99a811-543c-4b99-a394-9d941401efff\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-bszj2" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.457187 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dglzr\" (UniqueName: \"kubernetes.io/projected/fd1d9312-7008-48ff-9437-af995ef9b88d-kube-api-access-dglzr\") pod \"route-controller-manager-6576b87f9c-ttk5q\" (UID: 
\"fd1d9312-7008-48ff-9437-af995ef9b88d\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ttk5q" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.457209 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/e714667b-061d-4127-8dd3-47e403ebe079-encryption-config\") pod \"apiserver-76f77b778f-mzkkz\" (UID: \"e714667b-061d-4127-8dd3-47e403ebe079\") " pod="openshift-apiserver/apiserver-76f77b778f-mzkkz" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.457236 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff2ea387-8171-4259-ac56-f864a65f105f-config\") pod \"openshift-apiserver-operator-796bbdcf4f-8l7cj\" (UID: \"ff2ea387-8171-4259-ac56-f864a65f105f\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8l7cj" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.457261 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x7445\" (UniqueName: \"kubernetes.io/projected/b43c1099-b997-4be7-8390-a379e0dc5541-kube-api-access-x7445\") pod \"controller-manager-879f6c89f-6rzkc\" (UID: \"b43c1099-b997-4be7-8390-a379e0dc5541\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6rzkc" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.457285 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n6vvh\" (UniqueName: \"kubernetes.io/projected/e714667b-061d-4127-8dd3-47e403ebe079-kube-api-access-n6vvh\") pod \"apiserver-76f77b778f-mzkkz\" (UID: \"e714667b-061d-4127-8dd3-47e403ebe079\") " pod="openshift-apiserver/apiserver-76f77b778f-mzkkz" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.457308 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hnfxk\" (UniqueName: \"kubernetes.io/projected/2425c7bf-9eeb-4255-bfd0-1d8ef07d835b-kube-api-access-hnfxk\") pod \"openshift-config-operator-7777fb866f-nhhpn\" (UID: \"2425c7bf-9eeb-4255-bfd0-1d8ef07d835b\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-nhhpn" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.457534 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/85e39402-3faf-4b34-a252-e4db0ac90909-serving-cert\") pod \"authentication-operator-69f744f599-lm5vg\" (UID: \"85e39402-3faf-4b34-a252-e4db0ac90909\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-lm5vg" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.457574 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0964a12e-7b75-401b-9547-49e5a924ef0b-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-vdsh5\" (UID: \"0964a12e-7b75-401b-9547-49e5a924ef0b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-vdsh5" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.457602 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/85e39402-3faf-4b34-a252-e4db0ac90909-service-ca-bundle\") pod 
\"authentication-operator-69f744f599-lm5vg\" (UID: \"85e39402-3faf-4b34-a252-e4db0ac90909\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-lm5vg" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.457633 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/dfee5c28-dc5e-4f85-87d2-29925eeff49d-machine-approver-tls\") pod \"machine-approver-56656f9798-rq4g9\" (UID: \"dfee5c28-dc5e-4f85-87d2-29925eeff49d\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rq4g9" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.457659 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/e714667b-061d-4127-8dd3-47e403ebe079-etcd-client\") pod \"apiserver-76f77b778f-mzkkz\" (UID: \"e714667b-061d-4127-8dd3-47e403ebe079\") " pod="openshift-apiserver/apiserver-76f77b778f-mzkkz" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.457681 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e714667b-061d-4127-8dd3-47e403ebe079-serving-cert\") pod \"apiserver-76f77b778f-mzkkz\" (UID: \"e714667b-061d-4127-8dd3-47e403ebe079\") " pod="openshift-apiserver/apiserver-76f77b778f-mzkkz" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.457759 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kfl88\" (UniqueName: \"kubernetes.io/projected/85e39402-3faf-4b34-a252-e4db0ac90909-kube-api-access-kfl88\") pod \"authentication-operator-69f744f599-lm5vg\" (UID: \"85e39402-3faf-4b34-a252-e4db0ac90909\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-lm5vg" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.457790 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/dfee5c28-dc5e-4f85-87d2-29925eeff49d-auth-proxy-config\") pod \"machine-approver-56656f9798-rq4g9\" (UID: \"dfee5c28-dc5e-4f85-87d2-29925eeff49d\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rq4g9" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.457813 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e714667b-061d-4127-8dd3-47e403ebe079-audit-dir\") pod \"apiserver-76f77b778f-mzkkz\" (UID: \"e714667b-061d-4127-8dd3-47e403ebe079\") " pod="openshift-apiserver/apiserver-76f77b778f-mzkkz" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.457847 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l2m9v\" (UniqueName: \"kubernetes.io/projected/0964a12e-7b75-401b-9547-49e5a924ef0b-kube-api-access-l2m9v\") pod \"cluster-image-registry-operator-dc59b4c8b-vdsh5\" (UID: \"0964a12e-7b75-401b-9547-49e5a924ef0b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-vdsh5" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.457875 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e714667b-061d-4127-8dd3-47e403ebe079-config\") pod \"apiserver-76f77b778f-mzkkz\" (UID: 
\"e714667b-061d-4127-8dd3-47e403ebe079\") " pod="openshift-apiserver/apiserver-76f77b778f-mzkkz" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.457899 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/28116576-5069-4dd6-90f1-31582eda88df-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-jw7l8\" (UID: \"28116576-5069-4dd6-90f1-31582eda88df\") " pod="openshift-authentication/oauth-openshift-558db77b4-jw7l8" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.457928 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7bnhf\" (UniqueName: \"kubernetes.io/projected/6f99a811-543c-4b99-a394-9d941401efff-kube-api-access-7bnhf\") pod \"machine-api-operator-5694c8668f-bszj2\" (UID: \"6f99a811-543c-4b99-a394-9d941401efff\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-bszj2" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.457952 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0387a8b9-8804-48b5-8503-1734f2a15b45-serving-cert\") pod \"console-operator-58897d9998-ltv8d\" (UID: \"0387a8b9-8804-48b5-8503-1734f2a15b45\") " pod="openshift-console-operator/console-operator-58897d9998-ltv8d" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.458004 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/28116576-5069-4dd6-90f1-31582eda88df-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-jw7l8\" (UID: \"28116576-5069-4dd6-90f1-31582eda88df\") " pod="openshift-authentication/oauth-openshift-558db77b4-jw7l8" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.458368 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/85e39402-3faf-4b34-a252-e4db0ac90909-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-lm5vg\" (UID: \"85e39402-3faf-4b34-a252-e4db0ac90909\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-lm5vg" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.459900 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/e714667b-061d-4127-8dd3-47e403ebe079-audit\") pod \"apiserver-76f77b778f-mzkkz\" (UID: \"e714667b-061d-4127-8dd3-47e403ebe079\") " pod="openshift-apiserver/apiserver-76f77b778f-mzkkz" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.459939 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/58b6a424-7606-420a-802d-1886adaa3e3d-etcd-client\") pod \"apiserver-7bbb656c7d-4cxxv\" (UID: \"58b6a424-7606-420a-802d-1886adaa3e3d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4cxxv" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.459963 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/28116576-5069-4dd6-90f1-31582eda88df-audit-policies\") pod \"oauth-openshift-558db77b4-jw7l8\" (UID: \"28116576-5069-4dd6-90f1-31582eda88df\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-jw7l8" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.459983 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/58b6a424-7606-420a-802d-1886adaa3e3d-encryption-config\") pod \"apiserver-7bbb656c7d-4cxxv\" (UID: \"58b6a424-7606-420a-802d-1886adaa3e3d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4cxxv" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.460007 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/28116576-5069-4dd6-90f1-31582eda88df-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-jw7l8\" (UID: \"28116576-5069-4dd6-90f1-31582eda88df\") " pod="openshift-authentication/oauth-openshift-558db77b4-jw7l8" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.460031 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fd1d9312-7008-48ff-9437-af995ef9b88d-client-ca\") pod \"route-controller-manager-6576b87f9c-ttk5q\" (UID: \"fd1d9312-7008-48ff-9437-af995ef9b88d\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ttk5q" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.460219 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-qhd5t"] Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.460930 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-qhd5t" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.464264 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-kkvj6"] Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.465117 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-kkvj6" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.465783 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.468367 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.469259 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.470485 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.473209 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-45ck4"] Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.474070 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-45ck4" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.474640 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8nknx"] Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.475118 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8nknx" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.475770 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-fsvms"] Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.476712 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-fsvms" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.479181 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.479525 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.481047 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.481166 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.481412 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.481523 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.481624 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.484701 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-b2mv7"] Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.485697 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-b2mv7" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.485988 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-qs2nx"] Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.486903 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-8db2k"] Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.487175 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qs2nx" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.487443 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-8db2k" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.490347 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.490640 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-jgbgc"] Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.497895 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-kxnmp"] Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.498355 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7wjtm"] Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.499034 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-qb2lj"] Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.509491 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.512939 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7wjtm" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.514193 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-jgbgc" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.516636 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-kxnmp" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.522064 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-ttk5q"] Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.522330 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.523006 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-qb2lj" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.526578 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.527940 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.528483 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5tnbn" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.528847 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.535952 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-th8gh"] Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.536681 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-jw7l8"] Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.536708 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2224d"] Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.537010 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-th8gh" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.537650 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.538033 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-52czq"] Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.538422 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29333625-8hwfc"] Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.538799 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2224d" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.540478 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-pqbtl"] Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.540839 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4rdw2"] Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.540945 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29333625-8hwfc" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.540973 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-pqbtl" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.541941 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-ltv8d"] Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.541978 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-bhzpg"] Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.541996 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-f47nt"] Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.542028 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-52czq" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.542493 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-997wv"] Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.542597 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-f47nt" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.542949 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4rdw2" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.554525 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-fthsz"] Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.554869 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-997wv" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.558896 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-qvwhz"] Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.559264 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-vdsh5"] Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.559290 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8l7cj"] Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.559302 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-kkvj6"] Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.559314 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-d6wkv"] Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.559430 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-qvwhz" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.559472 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-fthsz" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.560238 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-45ck4"] Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.560282 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-q76zz"] Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.560294 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.560371 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-d6wkv" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.560296 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7wjtm"] Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.560464 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-d5zks"] Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.560482 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8nknx"] Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.560493 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-nhhpn"] Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.560506 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-jcxjn"] Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.561062 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-mzkkz"] Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.561086 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-jgbgc"] Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.561098 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-snptm"] Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.561806 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fzvpl\" (UniqueName: \"kubernetes.io/projected/ff2ea387-8171-4259-ac56-f864a65f105f-kube-api-access-fzvpl\") pod \"openshift-apiserver-operator-796bbdcf4f-8l7cj\" (UID: \"ff2ea387-8171-4259-ac56-f864a65f105f\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8l7cj" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.561896 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/51ad1076-0ca9-4765-bd88-98f4cba434b6-trusted-ca-bundle\") pod \"console-f9d7485db-d5zks\" (UID: \"51ad1076-0ca9-4765-bd88-98f4cba434b6\") " pod="openshift-console/console-f9d7485db-d5zks" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.561911 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4ffxb"] Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.561998 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7x8pn\" (UniqueName: \"kubernetes.io/projected/0387a8b9-8804-48b5-8503-1734f2a15b45-kube-api-access-7x8pn\") pod \"console-operator-58897d9998-ltv8d\" (UID: \"0387a8b9-8804-48b5-8503-1734f2a15b45\") " pod="openshift-console-operator/console-operator-58897d9998-ltv8d" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.562081 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2425c7bf-9eeb-4255-bfd0-1d8ef07d835b-serving-cert\") pod \"openshift-config-operator-7777fb866f-nhhpn\" (UID: \"2425c7bf-9eeb-4255-bfd0-1d8ef07d835b\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-nhhpn" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.562109 4902 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/51ad1076-0ca9-4765-bd88-98f4cba434b6-console-oauth-config\") pod \"console-f9d7485db-d5zks\" (UID: \"51ad1076-0ca9-4765-bd88-98f4cba434b6\") " pod="openshift-console/console-f9d7485db-d5zks" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.562133 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f99a811-543c-4b99-a394-9d941401efff-config\") pod \"machine-api-operator-5694c8668f-bszj2\" (UID: \"6f99a811-543c-4b99-a394-9d941401efff\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-bszj2" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.562159 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mh25q\" (UniqueName: \"kubernetes.io/projected/327b6d28-9130-4476-b8f2-edaf08da45ae-kube-api-access-mh25q\") pod \"downloads-7954f5f757-bhzpg\" (UID: \"327b6d28-9130-4476-b8f2-edaf08da45ae\") " pod="openshift-console/downloads-7954f5f757-bhzpg" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.562201 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d5srr\" (UniqueName: \"kubernetes.io/projected/58b6a424-7606-420a-802d-1886adaa3e3d-kube-api-access-d5srr\") pod \"apiserver-7bbb656c7d-4cxxv\" (UID: \"58b6a424-7606-420a-802d-1886adaa3e3d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4cxxv" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.562221 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/85e39402-3faf-4b34-a252-e4db0ac90909-config\") pod \"authentication-operator-69f744f599-lm5vg\" (UID: \"85e39402-3faf-4b34-a252-e4db0ac90909\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-lm5vg" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.562241 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-47w44\" (UniqueName: \"kubernetes.io/projected/b7308dad-19a5-4675-9874-ee0a814d8aed-kube-api-access-47w44\") pod \"cluster-samples-operator-665b6dd947-4ffxb\" (UID: \"b7308dad-19a5-4675-9874-ee0a814d8aed\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4ffxb" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.562266 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fd1d9312-7008-48ff-9437-af995ef9b88d-config\") pod \"route-controller-manager-6576b87f9c-ttk5q\" (UID: \"fd1d9312-7008-48ff-9437-af995ef9b88d\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ttk5q" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.562308 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/28116576-5069-4dd6-90f1-31582eda88df-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-jw7l8\" (UID: \"28116576-5069-4dd6-90f1-31582eda88df\") " pod="openshift-authentication/oauth-openshift-558db77b4-jw7l8" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.562337 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0387a8b9-8804-48b5-8503-1734f2a15b45-config\") pod 
\"console-operator-58897d9998-ltv8d\" (UID: \"0387a8b9-8804-48b5-8503-1734f2a15b45\") " pod="openshift-console-operator/console-operator-58897d9998-ltv8d" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.562359 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/e714667b-061d-4127-8dd3-47e403ebe079-etcd-serving-ca\") pod \"apiserver-76f77b778f-mzkkz\" (UID: \"e714667b-061d-4127-8dd3-47e403ebe079\") " pod="openshift-apiserver/apiserver-76f77b778f-mzkkz" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.562383 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b43c1099-b997-4be7-8390-a379e0dc5541-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-6rzkc\" (UID: \"b43c1099-b997-4be7-8390-a379e0dc5541\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6rzkc" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.562462 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ff2ea387-8171-4259-ac56-f864a65f105f-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-8l7cj\" (UID: \"ff2ea387-8171-4259-ac56-f864a65f105f\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8l7cj" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.562485 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8nwq5\" (UniqueName: \"kubernetes.io/projected/28116576-5069-4dd6-90f1-31582eda88df-kube-api-access-8nwq5\") pod \"oauth-openshift-558db77b4-jw7l8\" (UID: \"28116576-5069-4dd6-90f1-31582eda88df\") " pod="openshift-authentication/oauth-openshift-558db77b4-jw7l8" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.562506 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/51ad1076-0ca9-4765-bd88-98f4cba434b6-oauth-serving-cert\") pod \"console-f9d7485db-d5zks\" (UID: \"51ad1076-0ca9-4765-bd88-98f4cba434b6\") " pod="openshift-console/console-f9d7485db-d5zks" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.562526 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/28116576-5069-4dd6-90f1-31582eda88df-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-jw7l8\" (UID: \"28116576-5069-4dd6-90f1-31582eda88df\") " pod="openshift-authentication/oauth-openshift-558db77b4-jw7l8" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.562544 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/58b6a424-7606-420a-802d-1886adaa3e3d-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-4cxxv\" (UID: \"58b6a424-7606-420a-802d-1886adaa3e3d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4cxxv" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.562563 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/51ad1076-0ca9-4765-bd88-98f4cba434b6-service-ca\") pod \"console-f9d7485db-d5zks\" (UID: \"51ad1076-0ca9-4765-bd88-98f4cba434b6\") " pod="openshift-console/console-f9d7485db-d5zks" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.562582 4902 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/28116576-5069-4dd6-90f1-31582eda88df-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-jw7l8\" (UID: \"28116576-5069-4dd6-90f1-31582eda88df\") " pod="openshift-authentication/oauth-openshift-558db77b4-jw7l8" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.562598 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e714667b-061d-4127-8dd3-47e403ebe079-trusted-ca-bundle\") pod \"apiserver-76f77b778f-mzkkz\" (UID: \"e714667b-061d-4127-8dd3-47e403ebe079\") " pod="openshift-apiserver/apiserver-76f77b778f-mzkkz" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.562602 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-xd6rl"] Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.562630 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/b7308dad-19a5-4675-9874-ee0a814d8aed-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-4ffxb\" (UID: \"b7308dad-19a5-4675-9874-ee0a814d8aed\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4ffxb" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.562673 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/58b6a424-7606-420a-802d-1886adaa3e3d-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-4cxxv\" (UID: \"58b6a424-7606-420a-802d-1886adaa3e3d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4cxxv" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.562693 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/51ad1076-0ca9-4765-bd88-98f4cba434b6-console-config\") pod \"console-f9d7485db-d5zks\" (UID: \"51ad1076-0ca9-4765-bd88-98f4cba434b6\") " pod="openshift-console/console-f9d7485db-d5zks" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.562715 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/28116576-5069-4dd6-90f1-31582eda88df-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-jw7l8\" (UID: \"28116576-5069-4dd6-90f1-31582eda88df\") " pod="openshift-authentication/oauth-openshift-558db77b4-jw7l8" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.562754 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/28116576-5069-4dd6-90f1-31582eda88df-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-jw7l8\" (UID: \"28116576-5069-4dd6-90f1-31582eda88df\") " pod="openshift-authentication/oauth-openshift-558db77b4-jw7l8" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.562767 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-jcxjn" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.562789 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/58b6a424-7606-420a-802d-1886adaa3e3d-serving-cert\") pod \"apiserver-7bbb656c7d-4cxxv\" (UID: \"58b6a424-7606-420a-802d-1886adaa3e3d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4cxxv" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.562810 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/0964a12e-7b75-401b-9547-49e5a924ef0b-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-vdsh5\" (UID: \"0964a12e-7b75-401b-9547-49e5a924ef0b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-vdsh5" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.562848 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0964a12e-7b75-401b-9547-49e5a924ef0b-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-vdsh5\" (UID: \"0964a12e-7b75-401b-9547-49e5a924ef0b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-vdsh5" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.562867 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dfee5c28-dc5e-4f85-87d2-29925eeff49d-config\") pod \"machine-approver-56656f9798-rq4g9\" (UID: \"dfee5c28-dc5e-4f85-87d2-29925eeff49d\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rq4g9" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.562891 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/28116576-5069-4dd6-90f1-31582eda88df-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-jw7l8\" (UID: \"28116576-5069-4dd6-90f1-31582eda88df\") " pod="openshift-authentication/oauth-openshift-558db77b4-jw7l8" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.562930 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/51ad1076-0ca9-4765-bd88-98f4cba434b6-console-serving-cert\") pod \"console-f9d7485db-d5zks\" (UID: \"51ad1076-0ca9-4765-bd88-98f4cba434b6\") " pod="openshift-console/console-f9d7485db-d5zks" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.562950 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2pmxj\" (UniqueName: \"kubernetes.io/projected/51ad1076-0ca9-4765-bd88-98f4cba434b6-kube-api-access-2pmxj\") pod \"console-f9d7485db-d5zks\" (UID: \"51ad1076-0ca9-4765-bd88-98f4cba434b6\") " pod="openshift-console/console-f9d7485db-d5zks" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.562967 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b43c1099-b997-4be7-8390-a379e0dc5541-config\") pod \"controller-manager-879f6c89f-6rzkc\" (UID: \"b43c1099-b997-4be7-8390-a379e0dc5541\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6rzkc" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.562987 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"images\" (UniqueName: \"kubernetes.io/configmap/6f99a811-543c-4b99-a394-9d941401efff-images\") pod \"machine-api-operator-5694c8668f-bszj2\" (UID: \"6f99a811-543c-4b99-a394-9d941401efff\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-bszj2" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.563026 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/28116576-5069-4dd6-90f1-31582eda88df-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-jw7l8\" (UID: \"28116576-5069-4dd6-90f1-31582eda88df\") " pod="openshift-authentication/oauth-openshift-558db77b4-jw7l8" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.563044 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fd1d9312-7008-48ff-9437-af995ef9b88d-serving-cert\") pod \"route-controller-manager-6576b87f9c-ttk5q\" (UID: \"fd1d9312-7008-48ff-9437-af995ef9b88d\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ttk5q" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.563064 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff2ea387-8171-4259-ac56-f864a65f105f-config\") pod \"openshift-apiserver-operator-796bbdcf4f-8l7cj\" (UID: \"ff2ea387-8171-4259-ac56-f864a65f105f\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8l7cj" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.563121 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x7445\" (UniqueName: \"kubernetes.io/projected/b43c1099-b997-4be7-8390-a379e0dc5541-kube-api-access-x7445\") pod \"controller-manager-879f6c89f-6rzkc\" (UID: \"b43c1099-b997-4be7-8390-a379e0dc5541\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6rzkc" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.563142 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/28116576-5069-4dd6-90f1-31582eda88df-audit-dir\") pod \"oauth-openshift-558db77b4-jw7l8\" (UID: \"28116576-5069-4dd6-90f1-31582eda88df\") " pod="openshift-authentication/oauth-openshift-558db77b4-jw7l8" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.563183 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6f99a811-543c-4b99-a394-9d941401efff-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-bszj2\" (UID: \"6f99a811-543c-4b99-a394-9d941401efff\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-bszj2" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.563204 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dglzr\" (UniqueName: \"kubernetes.io/projected/fd1d9312-7008-48ff-9437-af995ef9b88d-kube-api-access-dglzr\") pod \"route-controller-manager-6576b87f9c-ttk5q\" (UID: \"fd1d9312-7008-48ff-9437-af995ef9b88d\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ttk5q" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.563221 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/e714667b-061d-4127-8dd3-47e403ebe079-encryption-config\") pod 
\"apiserver-76f77b778f-mzkkz\" (UID: \"e714667b-061d-4127-8dd3-47e403ebe079\") " pod="openshift-apiserver/apiserver-76f77b778f-mzkkz" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.563260 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n6vvh\" (UniqueName: \"kubernetes.io/projected/e714667b-061d-4127-8dd3-47e403ebe079-kube-api-access-n6vvh\") pod \"apiserver-76f77b778f-mzkkz\" (UID: \"e714667b-061d-4127-8dd3-47e403ebe079\") " pod="openshift-apiserver/apiserver-76f77b778f-mzkkz" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.563284 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/85e39402-3faf-4b34-a252-e4db0ac90909-serving-cert\") pod \"authentication-operator-69f744f599-lm5vg\" (UID: \"85e39402-3faf-4b34-a252-e4db0ac90909\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-lm5vg" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.563301 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0964a12e-7b75-401b-9547-49e5a924ef0b-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-vdsh5\" (UID: \"0964a12e-7b75-401b-9547-49e5a924ef0b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-vdsh5" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.563342 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hnfxk\" (UniqueName: \"kubernetes.io/projected/2425c7bf-9eeb-4255-bfd0-1d8ef07d835b-kube-api-access-hnfxk\") pod \"openshift-config-operator-7777fb866f-nhhpn\" (UID: \"2425c7bf-9eeb-4255-bfd0-1d8ef07d835b\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-nhhpn" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.563367 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/85e39402-3faf-4b34-a252-e4db0ac90909-service-ca-bundle\") pod \"authentication-operator-69f744f599-lm5vg\" (UID: \"85e39402-3faf-4b34-a252-e4db0ac90909\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-lm5vg" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.563383 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/dfee5c28-dc5e-4f85-87d2-29925eeff49d-machine-approver-tls\") pod \"machine-approver-56656f9798-rq4g9\" (UID: \"dfee5c28-dc5e-4f85-87d2-29925eeff49d\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rq4g9" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.563401 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/e714667b-061d-4127-8dd3-47e403ebe079-etcd-client\") pod \"apiserver-76f77b778f-mzkkz\" (UID: \"e714667b-061d-4127-8dd3-47e403ebe079\") " pod="openshift-apiserver/apiserver-76f77b778f-mzkkz" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.563432 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-snptm" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.563449 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e714667b-061d-4127-8dd3-47e403ebe079-serving-cert\") pod \"apiserver-76f77b778f-mzkkz\" (UID: \"e714667b-061d-4127-8dd3-47e403ebe079\") " pod="openshift-apiserver/apiserver-76f77b778f-mzkkz" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.563470 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kfl88\" (UniqueName: \"kubernetes.io/projected/85e39402-3faf-4b34-a252-e4db0ac90909-kube-api-access-kfl88\") pod \"authentication-operator-69f744f599-lm5vg\" (UID: \"85e39402-3faf-4b34-a252-e4db0ac90909\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-lm5vg" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.563487 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/dfee5c28-dc5e-4f85-87d2-29925eeff49d-auth-proxy-config\") pod \"machine-approver-56656f9798-rq4g9\" (UID: \"dfee5c28-dc5e-4f85-87d2-29925eeff49d\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rq4g9" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.563503 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e714667b-061d-4127-8dd3-47e403ebe079-audit-dir\") pod \"apiserver-76f77b778f-mzkkz\" (UID: \"e714667b-061d-4127-8dd3-47e403ebe079\") " pod="openshift-apiserver/apiserver-76f77b778f-mzkkz" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.563543 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l2m9v\" (UniqueName: \"kubernetes.io/projected/0964a12e-7b75-401b-9547-49e5a924ef0b-kube-api-access-l2m9v\") pod \"cluster-image-registry-operator-dc59b4c8b-vdsh5\" (UID: \"0964a12e-7b75-401b-9547-49e5a924ef0b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-vdsh5" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.563562 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e714667b-061d-4127-8dd3-47e403ebe079-config\") pod \"apiserver-76f77b778f-mzkkz\" (UID: \"e714667b-061d-4127-8dd3-47e403ebe079\") " pod="openshift-apiserver/apiserver-76f77b778f-mzkkz" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.563587 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/28116576-5069-4dd6-90f1-31582eda88df-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-jw7l8\" (UID: \"28116576-5069-4dd6-90f1-31582eda88df\") " pod="openshift-authentication/oauth-openshift-558db77b4-jw7l8" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.563631 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/28116576-5069-4dd6-90f1-31582eda88df-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-jw7l8\" (UID: \"28116576-5069-4dd6-90f1-31582eda88df\") " pod="openshift-authentication/oauth-openshift-558db77b4-jw7l8" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.563651 4902 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7bnhf\" (UniqueName: \"kubernetes.io/projected/6f99a811-543c-4b99-a394-9d941401efff-kube-api-access-7bnhf\") pod \"machine-api-operator-5694c8668f-bszj2\" (UID: \"6f99a811-543c-4b99-a394-9d941401efff\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-bszj2" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.563669 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0387a8b9-8804-48b5-8503-1734f2a15b45-serving-cert\") pod \"console-operator-58897d9998-ltv8d\" (UID: \"0387a8b9-8804-48b5-8503-1734f2a15b45\") " pod="openshift-console-operator/console-operator-58897d9998-ltv8d" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.563709 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/85e39402-3faf-4b34-a252-e4db0ac90909-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-lm5vg\" (UID: \"85e39402-3faf-4b34-a252-e4db0ac90909\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-lm5vg" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.563708 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/51ad1076-0ca9-4765-bd88-98f4cba434b6-trusted-ca-bundle\") pod \"console-f9d7485db-d5zks\" (UID: \"51ad1076-0ca9-4765-bd88-98f4cba434b6\") " pod="openshift-console/console-f9d7485db-d5zks" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.563730 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/e714667b-061d-4127-8dd3-47e403ebe079-audit\") pod \"apiserver-76f77b778f-mzkkz\" (UID: \"e714667b-061d-4127-8dd3-47e403ebe079\") " pod="openshift-apiserver/apiserver-76f77b778f-mzkkz" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.563886 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/58b6a424-7606-420a-802d-1886adaa3e3d-etcd-client\") pod \"apiserver-7bbb656c7d-4cxxv\" (UID: \"58b6a424-7606-420a-802d-1886adaa3e3d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4cxxv" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.563916 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/28116576-5069-4dd6-90f1-31582eda88df-audit-policies\") pod \"oauth-openshift-558db77b4-jw7l8\" (UID: \"28116576-5069-4dd6-90f1-31582eda88df\") " pod="openshift-authentication/oauth-openshift-558db77b4-jw7l8" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.563927 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-n2nl8"] Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.563940 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/58b6a424-7606-420a-802d-1886adaa3e3d-encryption-config\") pod \"apiserver-7bbb656c7d-4cxxv\" (UID: \"58b6a424-7606-420a-802d-1886adaa3e3d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4cxxv" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.563960 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" 
(UniqueName: \"kubernetes.io/secret/28116576-5069-4dd6-90f1-31582eda88df-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-jw7l8\" (UID: \"28116576-5069-4dd6-90f1-31582eda88df\") " pod="openshift-authentication/oauth-openshift-558db77b4-jw7l8" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.563984 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fd1d9312-7008-48ff-9437-af995ef9b88d-client-ca\") pod \"route-controller-manager-6576b87f9c-ttk5q\" (UID: \"fd1d9312-7008-48ff-9437-af995ef9b88d\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ttk5q" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.564000 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/58b6a424-7606-420a-802d-1886adaa3e3d-audit-policies\") pod \"apiserver-7bbb656c7d-4cxxv\" (UID: \"58b6a424-7606-420a-802d-1886adaa3e3d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4cxxv" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.564019 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b43c1099-b997-4be7-8390-a379e0dc5541-client-ca\") pod \"controller-manager-879f6c89f-6rzkc\" (UID: \"b43c1099-b997-4be7-8390-a379e0dc5541\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6rzkc" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.564038 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/e714667b-061d-4127-8dd3-47e403ebe079-node-pullsecrets\") pod \"apiserver-76f77b778f-mzkkz\" (UID: \"e714667b-061d-4127-8dd3-47e403ebe079\") " pod="openshift-apiserver/apiserver-76f77b778f-mzkkz" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.564062 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b43c1099-b997-4be7-8390-a379e0dc5541-serving-cert\") pod \"controller-manager-879f6c89f-6rzkc\" (UID: \"b43c1099-b997-4be7-8390-a379e0dc5541\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6rzkc" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.564130 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/58b6a424-7606-420a-802d-1886adaa3e3d-audit-dir\") pod \"apiserver-7bbb656c7d-4cxxv\" (UID: \"58b6a424-7606-420a-802d-1886adaa3e3d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4cxxv" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.564150 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/e714667b-061d-4127-8dd3-47e403ebe079-image-import-ca\") pod \"apiserver-76f77b778f-mzkkz\" (UID: \"e714667b-061d-4127-8dd3-47e403ebe079\") " pod="openshift-apiserver/apiserver-76f77b778f-mzkkz" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.564174 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0387a8b9-8804-48b5-8503-1734f2a15b45-trusted-ca\") pod \"console-operator-58897d9998-ltv8d\" (UID: \"0387a8b9-8804-48b5-8503-1734f2a15b45\") " pod="openshift-console-operator/console-operator-58897d9998-ltv8d" Oct 09 13:51:27 crc kubenswrapper[4902]: 
I1009 13:51:27.564194 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/28116576-5069-4dd6-90f1-31582eda88df-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-jw7l8\" (UID: \"28116576-5069-4dd6-90f1-31582eda88df\") " pod="openshift-authentication/oauth-openshift-558db77b4-jw7l8" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.564215 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zt6m8\" (UniqueName: \"kubernetes.io/projected/dfee5c28-dc5e-4f85-87d2-29925eeff49d-kube-api-access-zt6m8\") pod \"machine-approver-56656f9798-rq4g9\" (UID: \"dfee5c28-dc5e-4f85-87d2-29925eeff49d\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rq4g9" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.564235 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/2425c7bf-9eeb-4255-bfd0-1d8ef07d835b-available-featuregates\") pod \"openshift-config-operator-7777fb866f-nhhpn\" (UID: \"2425c7bf-9eeb-4255-bfd0-1d8ef07d835b\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-nhhpn" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.564471 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/e714667b-061d-4127-8dd3-47e403ebe079-audit\") pod \"apiserver-76f77b778f-mzkkz\" (UID: \"e714667b-061d-4127-8dd3-47e403ebe079\") " pod="openshift-apiserver/apiserver-76f77b778f-mzkkz" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.564742 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/2425c7bf-9eeb-4255-bfd0-1d8ef07d835b-available-featuregates\") pod \"openshift-config-operator-7777fb866f-nhhpn\" (UID: \"2425c7bf-9eeb-4255-bfd0-1d8ef07d835b\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-nhhpn" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.568103 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-bszj2"] Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.568140 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-4cxxv"] Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.568153 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-lm5vg"] Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.568360 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/e714667b-061d-4127-8dd3-47e403ebe079-node-pullsecrets\") pod \"apiserver-76f77b778f-mzkkz\" (UID: \"e714667b-061d-4127-8dd3-47e403ebe079\") " pod="openshift-apiserver/apiserver-76f77b778f-mzkkz" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.569314 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/28116576-5069-4dd6-90f1-31582eda88df-audit-policies\") pod \"oauth-openshift-558db77b4-jw7l8\" (UID: \"28116576-5069-4dd6-90f1-31582eda88df\") " pod="openshift-authentication/oauth-openshift-558db77b4-jw7l8" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.569398 4902 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/e714667b-061d-4127-8dd3-47e403ebe079-image-import-ca\") pod \"apiserver-76f77b778f-mzkkz\" (UID: \"e714667b-061d-4127-8dd3-47e403ebe079\") " pod="openshift-apiserver/apiserver-76f77b778f-mzkkz" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.570404 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fd1d9312-7008-48ff-9437-af995ef9b88d-config\") pod \"route-controller-manager-6576b87f9c-ttk5q\" (UID: \"fd1d9312-7008-48ff-9437-af995ef9b88d\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ttk5q" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.571283 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/28116576-5069-4dd6-90f1-31582eda88df-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-jw7l8\" (UID: \"28116576-5069-4dd6-90f1-31582eda88df\") " pod="openshift-authentication/oauth-openshift-558db77b4-jw7l8" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.571296 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2425c7bf-9eeb-4255-bfd0-1d8ef07d835b-serving-cert\") pod \"openshift-config-operator-7777fb866f-nhhpn\" (UID: \"2425c7bf-9eeb-4255-bfd0-1d8ef07d835b\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-nhhpn" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.572032 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-th8gh"] Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.572092 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-qs2nx"] Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.572108 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-fthsz"] Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.572713 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/85e39402-3faf-4b34-a252-e4db0ac90909-config\") pod \"authentication-operator-69f744f599-lm5vg\" (UID: \"85e39402-3faf-4b34-a252-e4db0ac90909\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-lm5vg" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.573472 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b43c1099-b997-4be7-8390-a379e0dc5541-config\") pod \"controller-manager-879f6c89f-6rzkc\" (UID: \"b43c1099-b997-4be7-8390-a379e0dc5541\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6rzkc" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.574009 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/58b6a424-7606-420a-802d-1886adaa3e3d-etcd-client\") pod \"apiserver-7bbb656c7d-4cxxv\" (UID: \"58b6a424-7606-420a-802d-1886adaa3e3d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4cxxv" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.574629 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/58b6a424-7606-420a-802d-1886adaa3e3d-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-4cxxv\" (UID: \"58b6a424-7606-420a-802d-1886adaa3e3d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4cxxv" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.575235 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/28116576-5069-4dd6-90f1-31582eda88df-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-jw7l8\" (UID: \"28116576-5069-4dd6-90f1-31582eda88df\") " pod="openshift-authentication/oauth-openshift-558db77b4-jw7l8" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.575733 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/51ad1076-0ca9-4765-bd88-98f4cba434b6-service-ca\") pod \"console-f9d7485db-d5zks\" (UID: \"51ad1076-0ca9-4765-bd88-98f4cba434b6\") " pod="openshift-console/console-f9d7485db-d5zks" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.586878 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/58b6a424-7606-420a-802d-1886adaa3e3d-audit-dir\") pod \"apiserver-7bbb656c7d-4cxxv\" (UID: \"58b6a424-7606-420a-802d-1886adaa3e3d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4cxxv" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.588744 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/51ad1076-0ca9-4765-bd88-98f4cba434b6-console-config\") pod \"console-f9d7485db-d5zks\" (UID: \"51ad1076-0ca9-4765-bd88-98f4cba434b6\") " pod="openshift-console/console-f9d7485db-d5zks" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.589075 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/58b6a424-7606-420a-802d-1886adaa3e3d-encryption-config\") pod \"apiserver-7bbb656c7d-4cxxv\" (UID: \"58b6a424-7606-420a-802d-1886adaa3e3d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4cxxv" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.589268 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/28116576-5069-4dd6-90f1-31582eda88df-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-jw7l8\" (UID: \"28116576-5069-4dd6-90f1-31582eda88df\") " pod="openshift-authentication/oauth-openshift-558db77b4-jw7l8" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.589499 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6f99a811-543c-4b99-a394-9d941401efff-images\") pod \"machine-api-operator-5694c8668f-bszj2\" (UID: \"6f99a811-543c-4b99-a394-9d941401efff\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-bszj2" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.589783 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/51ad1076-0ca9-4765-bd88-98f4cba434b6-console-oauth-config\") pod \"console-f9d7485db-d5zks\" (UID: \"51ad1076-0ca9-4765-bd88-98f4cba434b6\") " pod="openshift-console/console-f9d7485db-d5zks" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.590525 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"audit-dir\" (UniqueName: \"kubernetes.io/host-path/28116576-5069-4dd6-90f1-31582eda88df-audit-dir\") pod \"oauth-openshift-558db77b4-jw7l8\" (UID: \"28116576-5069-4dd6-90f1-31582eda88df\") " pod="openshift-authentication/oauth-openshift-558db77b4-jw7l8" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.591146 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/28116576-5069-4dd6-90f1-31582eda88df-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-jw7l8\" (UID: \"28116576-5069-4dd6-90f1-31582eda88df\") " pod="openshift-authentication/oauth-openshift-558db77b4-jw7l8" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.591776 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f99a811-543c-4b99-a394-9d941401efff-config\") pod \"machine-api-operator-5694c8668f-bszj2\" (UID: \"6f99a811-543c-4b99-a394-9d941401efff\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-bszj2" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.591905 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff2ea387-8171-4259-ac56-f864a65f105f-config\") pod \"openshift-apiserver-operator-796bbdcf4f-8l7cj\" (UID: \"ff2ea387-8171-4259-ac56-f864a65f105f\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8l7cj" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.593209 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.593837 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/85e39402-3faf-4b34-a252-e4db0ac90909-service-ca-bundle\") pod \"authentication-operator-69f744f599-lm5vg\" (UID: \"85e39402-3faf-4b34-a252-e4db0ac90909\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-lm5vg" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.594370 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e714667b-061d-4127-8dd3-47e403ebe079-config\") pod \"apiserver-76f77b778f-mzkkz\" (UID: \"e714667b-061d-4127-8dd3-47e403ebe079\") " pod="openshift-apiserver/apiserver-76f77b778f-mzkkz" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.594631 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e714667b-061d-4127-8dd3-47e403ebe079-trusted-ca-bundle\") pod \"apiserver-76f77b778f-mzkkz\" (UID: \"e714667b-061d-4127-8dd3-47e403ebe079\") " pod="openshift-apiserver/apiserver-76f77b778f-mzkkz" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.594984 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0387a8b9-8804-48b5-8503-1734f2a15b45-config\") pod \"console-operator-58897d9998-ltv8d\" (UID: \"0387a8b9-8804-48b5-8503-1734f2a15b45\") " pod="openshift-console-operator/console-operator-58897d9998-ltv8d" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.595109 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/51ad1076-0ca9-4765-bd88-98f4cba434b6-console-serving-cert\") pod 
\"console-f9d7485db-d5zks\" (UID: \"51ad1076-0ca9-4765-bd88-98f4cba434b6\") " pod="openshift-console/console-f9d7485db-d5zks" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.596346 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dfee5c28-dc5e-4f85-87d2-29925eeff49d-config\") pod \"machine-approver-56656f9798-rq4g9\" (UID: \"dfee5c28-dc5e-4f85-87d2-29925eeff49d\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rq4g9" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.596475 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b43c1099-b997-4be7-8390-a379e0dc5541-serving-cert\") pod \"controller-manager-879f6c89f-6rzkc\" (UID: \"b43c1099-b997-4be7-8390-a379e0dc5541\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6rzkc" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.596842 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/51ad1076-0ca9-4765-bd88-98f4cba434b6-oauth-serving-cert\") pod \"console-f9d7485db-d5zks\" (UID: \"51ad1076-0ca9-4765-bd88-98f4cba434b6\") " pod="openshift-console/console-f9d7485db-d5zks" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.596936 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/28116576-5069-4dd6-90f1-31582eda88df-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-jw7l8\" (UID: \"28116576-5069-4dd6-90f1-31582eda88df\") " pod="openshift-authentication/oauth-openshift-558db77b4-jw7l8" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.597098 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0387a8b9-8804-48b5-8503-1734f2a15b45-trusted-ca\") pod \"console-operator-58897d9998-ltv8d\" (UID: \"0387a8b9-8804-48b5-8503-1734f2a15b45\") " pod="openshift-console-operator/console-operator-58897d9998-ltv8d" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.597276 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0964a12e-7b75-401b-9547-49e5a924ef0b-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-vdsh5\" (UID: \"0964a12e-7b75-401b-9547-49e5a924ef0b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-vdsh5" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.597367 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/58b6a424-7606-420a-802d-1886adaa3e3d-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-4cxxv\" (UID: \"58b6a424-7606-420a-802d-1886adaa3e3d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4cxxv" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.595636 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/28116576-5069-4dd6-90f1-31582eda88df-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-jw7l8\" (UID: \"28116576-5069-4dd6-90f1-31582eda88df\") " pod="openshift-authentication/oauth-openshift-558db77b4-jw7l8" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.597862 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/dfee5c28-dc5e-4f85-87d2-29925eeff49d-auth-proxy-config\") pod \"machine-approver-56656f9798-rq4g9\" (UID: \"dfee5c28-dc5e-4f85-87d2-29925eeff49d\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rq4g9" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.597837 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/28116576-5069-4dd6-90f1-31582eda88df-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-jw7l8\" (UID: \"28116576-5069-4dd6-90f1-31582eda88df\") " pod="openshift-authentication/oauth-openshift-558db77b4-jw7l8" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.598330 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fd1d9312-7008-48ff-9437-af995ef9b88d-serving-cert\") pod \"route-controller-manager-6576b87f9c-ttk5q\" (UID: \"fd1d9312-7008-48ff-9437-af995ef9b88d\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ttk5q" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.598379 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b43c1099-b997-4be7-8390-a379e0dc5541-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-6rzkc\" (UID: \"b43c1099-b997-4be7-8390-a379e0dc5541\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6rzkc" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.598543 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/28116576-5069-4dd6-90f1-31582eda88df-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-jw7l8\" (UID: \"28116576-5069-4dd6-90f1-31582eda88df\") " pod="openshift-authentication/oauth-openshift-558db77b4-jw7l8" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.598725 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/28116576-5069-4dd6-90f1-31582eda88df-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-jw7l8\" (UID: \"28116576-5069-4dd6-90f1-31582eda88df\") " pod="openshift-authentication/oauth-openshift-558db77b4-jw7l8" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.598982 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e714667b-061d-4127-8dd3-47e403ebe079-serving-cert\") pod \"apiserver-76f77b778f-mzkkz\" (UID: \"e714667b-061d-4127-8dd3-47e403ebe079\") " pod="openshift-apiserver/apiserver-76f77b778f-mzkkz" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.599037 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/e714667b-061d-4127-8dd3-47e403ebe079-etcd-serving-ca\") pod \"apiserver-76f77b778f-mzkkz\" (UID: \"e714667b-061d-4127-8dd3-47e403ebe079\") " pod="openshift-apiserver/apiserver-76f77b778f-mzkkz" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.599093 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-b2mv7"] Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.599282 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-kxnmp"] Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.599397 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.600507 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/58b6a424-7606-420a-802d-1886adaa3e3d-audit-policies\") pod \"apiserver-7bbb656c7d-4cxxv\" (UID: \"58b6a424-7606-420a-802d-1886adaa3e3d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4cxxv" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.600542 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fd1d9312-7008-48ff-9437-af995ef9b88d-client-ca\") pod \"route-controller-manager-6576b87f9c-ttk5q\" (UID: \"fd1d9312-7008-48ff-9437-af995ef9b88d\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ttk5q" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.600803 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e714667b-061d-4127-8dd3-47e403ebe079-audit-dir\") pod \"apiserver-76f77b778f-mzkkz\" (UID: \"e714667b-061d-4127-8dd3-47e403ebe079\") " pod="openshift-apiserver/apiserver-76f77b778f-mzkkz" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.600790 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6f99a811-543c-4b99-a394-9d941401efff-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-bszj2\" (UID: \"6f99a811-543c-4b99-a394-9d941401efff\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-bszj2" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.601539 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b43c1099-b997-4be7-8390-a379e0dc5541-client-ca\") pod \"controller-manager-879f6c89f-6rzkc\" (UID: \"b43c1099-b997-4be7-8390-a379e0dc5541\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6rzkc" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.602540 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/85e39402-3faf-4b34-a252-e4db0ac90909-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-lm5vg\" (UID: \"85e39402-3faf-4b34-a252-e4db0ac90909\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-lm5vg" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.602671 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/e714667b-061d-4127-8dd3-47e403ebe079-etcd-client\") pod \"apiserver-76f77b778f-mzkkz\" (UID: \"e714667b-061d-4127-8dd3-47e403ebe079\") " pod="openshift-apiserver/apiserver-76f77b778f-mzkkz" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.602762 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/58b6a424-7606-420a-802d-1886adaa3e3d-serving-cert\") pod \"apiserver-7bbb656c7d-4cxxv\" (UID: \"58b6a424-7606-420a-802d-1886adaa3e3d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4cxxv" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.603513 4902 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/85e39402-3faf-4b34-a252-e4db0ac90909-serving-cert\") pod \"authentication-operator-69f744f599-lm5vg\" (UID: \"85e39402-3faf-4b34-a252-e4db0ac90909\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-lm5vg" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.604357 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/dfee5c28-dc5e-4f85-87d2-29925eeff49d-machine-approver-tls\") pod \"machine-approver-56656f9798-rq4g9\" (UID: \"dfee5c28-dc5e-4f85-87d2-29925eeff49d\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rq4g9" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.604756 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/0964a12e-7b75-401b-9547-49e5a924ef0b-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-vdsh5\" (UID: \"0964a12e-7b75-401b-9547-49e5a924ef0b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-vdsh5" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.604882 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/b7308dad-19a5-4675-9874-ee0a814d8aed-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-4ffxb\" (UID: \"b7308dad-19a5-4675-9874-ee0a814d8aed\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4ffxb" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.605242 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/e714667b-061d-4127-8dd3-47e403ebe079-encryption-config\") pod \"apiserver-76f77b778f-mzkkz\" (UID: \"e714667b-061d-4127-8dd3-47e403ebe079\") " pod="openshift-apiserver/apiserver-76f77b778f-mzkkz" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.605303 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-qb2lj"] Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.605313 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/28116576-5069-4dd6-90f1-31582eda88df-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-jw7l8\" (UID: \"28116576-5069-4dd6-90f1-31582eda88df\") " pod="openshift-authentication/oauth-openshift-558db77b4-jw7l8" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.606741 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ff2ea387-8171-4259-ac56-f864a65f105f-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-8l7cj\" (UID: \"ff2ea387-8171-4259-ac56-f864a65f105f\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8l7cj" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.607798 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/28116576-5069-4dd6-90f1-31582eda88df-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-jw7l8\" (UID: \"28116576-5069-4dd6-90f1-31582eda88df\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-jw7l8" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.607910 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-jcxjn"] Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.609528 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-pqbtl"] Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.609927 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-8db2k"] Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.610897 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0387a8b9-8804-48b5-8503-1734f2a15b45-serving-cert\") pod \"console-operator-58897d9998-ltv8d\" (UID: \"0387a8b9-8804-48b5-8503-1734f2a15b45\") " pod="openshift-console-operator/console-operator-58897d9998-ltv8d" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.610948 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-6rzkc"] Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.611958 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4rdw2"] Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.613087 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-fsvms"] Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.614134 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-snptm"] Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.615157 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2224d"] Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.616210 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-qvwhz"] Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.617289 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29333625-8hwfc"] Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.617964 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.618318 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-52czq"] Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.619248 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-f47nt"] Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.637290 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.657499 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.679526 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.697693 4902 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-ingress"/"router-metrics-certs-default" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.717308 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.737269 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.757108 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.778383 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.797611 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.817144 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.837836 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.857074 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.877207 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.897595 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.917593 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.938307 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.957840 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.978084 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Oct 09 13:51:27 crc kubenswrapper[4902]: I1009 13:51:27.997281 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Oct 09 13:51:28 crc kubenswrapper[4902]: I1009 13:51:28.018157 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Oct 09 13:51:28 crc kubenswrapper[4902]: I1009 13:51:28.037875 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Oct 09 13:51:28 crc kubenswrapper[4902]: I1009 13:51:28.068001 4902 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Oct 09 13:51:28 crc kubenswrapper[4902]: I1009 13:51:28.077856 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Oct 09 13:51:28 crc kubenswrapper[4902]: I1009 13:51:28.097653 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Oct 09 13:51:28 crc kubenswrapper[4902]: I1009 13:51:28.116990 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Oct 09 13:51:28 crc kubenswrapper[4902]: I1009 13:51:28.137801 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Oct 09 13:51:28 crc kubenswrapper[4902]: I1009 13:51:28.158270 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Oct 09 13:51:28 crc kubenswrapper[4902]: I1009 13:51:28.177672 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Oct 09 13:51:28 crc kubenswrapper[4902]: I1009 13:51:28.197396 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Oct 09 13:51:28 crc kubenswrapper[4902]: I1009 13:51:28.223617 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Oct 09 13:51:28 crc kubenswrapper[4902]: I1009 13:51:28.237466 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Oct 09 13:51:28 crc kubenswrapper[4902]: I1009 13:51:28.259352 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Oct 09 13:51:28 crc kubenswrapper[4902]: I1009 13:51:28.278384 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Oct 09 13:51:28 crc kubenswrapper[4902]: I1009 13:51:28.296955 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Oct 09 13:51:28 crc kubenswrapper[4902]: I1009 13:51:28.318260 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Oct 09 13:51:28 crc kubenswrapper[4902]: I1009 13:51:28.338968 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Oct 09 13:51:28 crc kubenswrapper[4902]: I1009 13:51:28.358959 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Oct 09 13:51:28 crc kubenswrapper[4902]: I1009 13:51:28.378556 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Oct 09 13:51:28 crc kubenswrapper[4902]: I1009 13:51:28.397807 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Oct 09 13:51:28 crc kubenswrapper[4902]: I1009 13:51:28.437727 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Oct 09 13:51:28 
crc kubenswrapper[4902]: I1009 13:51:28.458801 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Oct 09 13:51:28 crc kubenswrapper[4902]: I1009 13:51:28.477987 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Oct 09 13:51:28 crc kubenswrapper[4902]: I1009 13:51:28.499205 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Oct 09 13:51:28 crc kubenswrapper[4902]: I1009 13:51:28.516608 4902 request.go:700] Waited for 1.000718232s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-operator-lifecycle-manager/configmaps?fieldSelector=metadata.name%3Dopenshift-service-ca.crt&limit=500&resourceVersion=0 Oct 09 13:51:28 crc kubenswrapper[4902]: I1009 13:51:28.519145 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Oct 09 13:51:28 crc kubenswrapper[4902]: I1009 13:51:28.537938 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Oct 09 13:51:28 crc kubenswrapper[4902]: I1009 13:51:28.558756 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Oct 09 13:51:28 crc kubenswrapper[4902]: I1009 13:51:28.578059 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Oct 09 13:51:28 crc kubenswrapper[4902]: I1009 13:51:28.598569 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Oct 09 13:51:28 crc kubenswrapper[4902]: I1009 13:51:28.637939 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Oct 09 13:51:28 crc kubenswrapper[4902]: I1009 13:51:28.657778 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Oct 09 13:51:28 crc kubenswrapper[4902]: I1009 13:51:28.679612 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Oct 09 13:51:28 crc kubenswrapper[4902]: I1009 13:51:28.698308 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Oct 09 13:51:28 crc kubenswrapper[4902]: I1009 13:51:28.718370 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Oct 09 13:51:28 crc kubenswrapper[4902]: I1009 13:51:28.738897 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Oct 09 13:51:28 crc kubenswrapper[4902]: I1009 13:51:28.757827 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Oct 09 13:51:28 crc kubenswrapper[4902]: I1009 13:51:28.778176 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Oct 09 13:51:28 crc kubenswrapper[4902]: I1009 13:51:28.799147 4902 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Oct 09 13:51:28 crc kubenswrapper[4902]: I1009 13:51:28.819659 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Oct 09 13:51:28 crc kubenswrapper[4902]: I1009 13:51:28.838894 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Oct 09 13:51:28 crc kubenswrapper[4902]: I1009 13:51:28.858292 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 09 13:51:28 crc kubenswrapper[4902]: I1009 13:51:28.877993 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Oct 09 13:51:28 crc kubenswrapper[4902]: I1009 13:51:28.898180 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Oct 09 13:51:28 crc kubenswrapper[4902]: I1009 13:51:28.918515 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Oct 09 13:51:28 crc kubenswrapper[4902]: I1009 13:51:28.938728 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Oct 09 13:51:28 crc kubenswrapper[4902]: I1009 13:51:28.958020 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Oct 09 13:51:28 crc kubenswrapper[4902]: I1009 13:51:28.979852 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 09 13:51:28 crc kubenswrapper[4902]: I1009 13:51:28.998428 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Oct 09 13:51:29 crc kubenswrapper[4902]: I1009 13:51:29.017184 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Oct 09 13:51:29 crc kubenswrapper[4902]: I1009 13:51:29.038280 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Oct 09 13:51:29 crc kubenswrapper[4902]: I1009 13:51:29.057602 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Oct 09 13:51:29 crc kubenswrapper[4902]: I1009 13:51:29.077457 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Oct 09 13:51:29 crc kubenswrapper[4902]: I1009 13:51:29.098470 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Oct 09 13:51:29 crc kubenswrapper[4902]: I1009 13:51:29.117776 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Oct 09 13:51:29 crc kubenswrapper[4902]: I1009 13:51:29.137617 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Oct 09 13:51:29 crc kubenswrapper[4902]: I1009 13:51:29.157468 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Oct 09 13:51:29 crc 
kubenswrapper[4902]: I1009 13:51:29.177919 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Oct 09 13:51:29 crc kubenswrapper[4902]: I1009 13:51:29.185493 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 13:51:29 crc kubenswrapper[4902]: E1009 13:51:29.185761 4902 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 13:51:45.185715409 +0000 UTC m=+52.383574483 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 13:51:29 crc kubenswrapper[4902]: I1009 13:51:29.186139 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 13:51:29 crc kubenswrapper[4902]: I1009 13:51:29.186185 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 13:51:29 crc kubenswrapper[4902]: I1009 13:51:29.187345 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 13:51:29 crc kubenswrapper[4902]: I1009 13:51:29.194132 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 13:51:29 crc kubenswrapper[4902]: I1009 13:51:29.197506 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Oct 09 13:51:29 crc kubenswrapper[4902]: I1009 13:51:29.227524 4902 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-marketplace"/"marketplace-trusted-ca" Oct 09 13:51:29 crc kubenswrapper[4902]: I1009 13:51:29.238497 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Oct 09 13:51:29 crc kubenswrapper[4902]: I1009 13:51:29.258374 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Oct 09 13:51:29 crc kubenswrapper[4902]: I1009 13:51:29.278206 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Oct 09 13:51:29 crc kubenswrapper[4902]: I1009 13:51:29.287718 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 13:51:29 crc kubenswrapper[4902]: I1009 13:51:29.287781 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 13:51:29 crc kubenswrapper[4902]: I1009 13:51:29.297520 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 13:51:29 crc kubenswrapper[4902]: I1009 13:51:29.298333 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Oct 09 13:51:29 crc kubenswrapper[4902]: I1009 13:51:29.298471 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 13:51:29 crc kubenswrapper[4902]: I1009 13:51:29.304766 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 13:51:29 crc kubenswrapper[4902]: I1009 13:51:29.317706 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Oct 09 13:51:29 crc kubenswrapper[4902]: I1009 13:51:29.330159 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 13:51:29 crc kubenswrapper[4902]: I1009 13:51:29.338657 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Oct 09 13:51:29 crc kubenswrapper[4902]: I1009 13:51:29.356006 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 13:51:29 crc kubenswrapper[4902]: I1009 13:51:29.357774 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Oct 09 13:51:29 crc kubenswrapper[4902]: I1009 13:51:29.378629 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Oct 09 13:51:29 crc kubenswrapper[4902]: I1009 13:51:29.405190 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-sysctl-allowlist" Oct 09 13:51:29 crc kubenswrapper[4902]: I1009 13:51:29.436296 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fzvpl\" (UniqueName: \"kubernetes.io/projected/ff2ea387-8171-4259-ac56-f864a65f105f-kube-api-access-fzvpl\") pod \"openshift-apiserver-operator-796bbdcf4f-8l7cj\" (UID: \"ff2ea387-8171-4259-ac56-f864a65f105f\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8l7cj" Oct 09 13:51:29 crc kubenswrapper[4902]: I1009 13:51:29.438437 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Oct 09 13:51:29 crc kubenswrapper[4902]: I1009 13:51:29.458416 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Oct 09 13:51:29 crc kubenswrapper[4902]: I1009 13:51:29.495345 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7x8pn\" (UniqueName: \"kubernetes.io/projected/0387a8b9-8804-48b5-8503-1734f2a15b45-kube-api-access-7x8pn\") pod \"console-operator-58897d9998-ltv8d\" (UID: \"0387a8b9-8804-48b5-8503-1734f2a15b45\") " pod="openshift-console-operator/console-operator-58897d9998-ltv8d" Oct 09 13:51:29 crc kubenswrapper[4902]: I1009 13:51:29.495661 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8l7cj" Oct 09 13:51:29 crc kubenswrapper[4902]: I1009 13:51:29.498906 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Oct 09 13:51:29 crc kubenswrapper[4902]: I1009 13:51:29.517938 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Oct 09 13:51:29 crc kubenswrapper[4902]: I1009 13:51:29.536353 4902 request.go:700] Waited for 1.971400513s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/hostpath-provisioner/secrets?fieldSelector=metadata.name%3Dcsi-hostpath-provisioner-sa-dockercfg-qd74k&limit=500&resourceVersion=0 Oct 09 13:51:29 crc kubenswrapper[4902]: I1009 13:51:29.544361 4902 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Oct 09 13:51:29 crc kubenswrapper[4902]: W1009 13:51:29.544779 4902 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d751cbb_f2e2_430d_9754_c882a5e924a5.slice/crio-c4cbb394f3809e59ce95f503add434a50c7c8f13b72c430b09cd299f4c62fca0 WatchSource:0}: Error finding container c4cbb394f3809e59ce95f503add434a50c7c8f13b72c430b09cd299f4c62fca0: Status 404 returned error can't find the container with id c4cbb394f3809e59ce95f503add434a50c7c8f13b72c430b09cd299f4c62fca0 Oct 09 13:51:29 crc kubenswrapper[4902]: I1009 13:51:29.558042 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Oct 09 13:51:29 crc kubenswrapper[4902]: W1009 13:51:29.567371 4902 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b6479f0_333b_4a96_9adf_2099afdc2447.slice/crio-84a72dff2a6e36900e767b16de10bb5779ac56612ae73d8718b28118f6c6213f WatchSource:0}: Error finding container 84a72dff2a6e36900e767b16de10bb5779ac56612ae73d8718b28118f6c6213f: Status 404 returned error can't find the container with id 84a72dff2a6e36900e767b16de10bb5779ac56612ae73d8718b28118f6c6213f Oct 09 13:51:29 crc kubenswrapper[4902]: I1009 13:51:29.601206 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mh25q\" (UniqueName: \"kubernetes.io/projected/327b6d28-9130-4476-b8f2-edaf08da45ae-kube-api-access-mh25q\") pod \"downloads-7954f5f757-bhzpg\" (UID: \"327b6d28-9130-4476-b8f2-edaf08da45ae\") " pod="openshift-console/downloads-7954f5f757-bhzpg" Oct 09 13:51:29 crc kubenswrapper[4902]: I1009 13:51:29.649383 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2pmxj\" (UniqueName: \"kubernetes.io/projected/51ad1076-0ca9-4765-bd88-98f4cba434b6-kube-api-access-2pmxj\") pod \"console-f9d7485db-d5zks\" (UID: \"51ad1076-0ca9-4765-bd88-98f4cba434b6\") " pod="openshift-console/console-f9d7485db-d5zks" Oct 09 13:51:29 crc kubenswrapper[4902]: I1009 13:51:29.650651 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-ltv8d" Oct 09 13:51:29 crc kubenswrapper[4902]: I1009 13:51:29.654780 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d5srr\" (UniqueName: \"kubernetes.io/projected/58b6a424-7606-420a-802d-1886adaa3e3d-kube-api-access-d5srr\") pod \"apiserver-7bbb656c7d-4cxxv\" (UID: \"58b6a424-7606-420a-802d-1886adaa3e3d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4cxxv" Oct 09 13:51:29 crc kubenswrapper[4902]: W1009 13:51:29.659693 4902 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fe485a1_e14f_4c09_b5b9_f252bc42b7e8.slice/crio-8641c98263aaf2f90dafcf23a40dc760c87aa1135ba12cf152ea381cb9710025 WatchSource:0}: Error finding container 8641c98263aaf2f90dafcf23a40dc760c87aa1135ba12cf152ea381cb9710025: Status 404 returned error can't find the container with id 8641c98263aaf2f90dafcf23a40dc760c87aa1135ba12cf152ea381cb9710025 Oct 09 13:51:29 crc kubenswrapper[4902]: I1009 13:51:29.666034 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-47w44\" (UniqueName: \"kubernetes.io/projected/b7308dad-19a5-4675-9874-ee0a814d8aed-kube-api-access-47w44\") pod \"cluster-samples-operator-665b6dd947-4ffxb\" (UID: \"b7308dad-19a5-4675-9874-ee0a814d8aed\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4ffxb" Oct 09 13:51:29 crc kubenswrapper[4902]: I1009 13:51:29.688340 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zt6m8\" (UniqueName: \"kubernetes.io/projected/dfee5c28-dc5e-4f85-87d2-29925eeff49d-kube-api-access-zt6m8\") pod \"machine-approver-56656f9798-rq4g9\" (UID: \"dfee5c28-dc5e-4f85-87d2-29925eeff49d\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rq4g9" Oct 09 13:51:29 crc kubenswrapper[4902]: I1009 13:51:29.698657 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dglzr\" (UniqueName: \"kubernetes.io/projected/fd1d9312-7008-48ff-9437-af995ef9b88d-kube-api-access-dglzr\") pod \"route-controller-manager-6576b87f9c-ttk5q\" (UID: \"fd1d9312-7008-48ff-9437-af995ef9b88d\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ttk5q" Oct 09 13:51:29 crc kubenswrapper[4902]: I1009 13:51:29.713687 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n6vvh\" (UniqueName: \"kubernetes.io/projected/e714667b-061d-4127-8dd3-47e403ebe079-kube-api-access-n6vvh\") pod \"apiserver-76f77b778f-mzkkz\" (UID: \"e714667b-061d-4127-8dd3-47e403ebe079\") " pod="openshift-apiserver/apiserver-76f77b778f-mzkkz" Oct 09 13:51:29 crc kubenswrapper[4902]: I1009 13:51:29.741419 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0964a12e-7b75-401b-9547-49e5a924ef0b-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-vdsh5\" (UID: \"0964a12e-7b75-401b-9547-49e5a924ef0b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-vdsh5" Oct 09 13:51:29 crc kubenswrapper[4902]: I1009 13:51:29.743698 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4cxxv" Oct 09 13:51:29 crc kubenswrapper[4902]: I1009 13:51:29.746584 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4ffxb" Oct 09 13:51:29 crc kubenswrapper[4902]: I1009 13:51:29.756133 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-bhzpg" Oct 09 13:51:29 crc kubenswrapper[4902]: I1009 13:51:29.759738 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x7445\" (UniqueName: \"kubernetes.io/projected/b43c1099-b997-4be7-8390-a379e0dc5541-kube-api-access-x7445\") pod \"controller-manager-879f6c89f-6rzkc\" (UID: \"b43c1099-b997-4be7-8390-a379e0dc5541\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6rzkc" Oct 09 13:51:29 crc kubenswrapper[4902]: I1009 13:51:29.763667 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-d5zks" Oct 09 13:51:29 crc kubenswrapper[4902]: I1009 13:51:29.773791 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kfl88\" (UniqueName: \"kubernetes.io/projected/85e39402-3faf-4b34-a252-e4db0ac90909-kube-api-access-kfl88\") pod \"authentication-operator-69f744f599-lm5vg\" (UID: \"85e39402-3faf-4b34-a252-e4db0ac90909\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-lm5vg" Oct 09 13:51:29 crc kubenswrapper[4902]: I1009 13:51:29.788036 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-mzkkz" Oct 09 13:51:29 crc kubenswrapper[4902]: I1009 13:51:29.790021 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8l7cj"] Oct 09 13:51:29 crc kubenswrapper[4902]: I1009 13:51:29.792358 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-6rzkc" Oct 09 13:51:29 crc kubenswrapper[4902]: I1009 13:51:29.795505 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7bnhf\" (UniqueName: \"kubernetes.io/projected/6f99a811-543c-4b99-a394-9d941401efff-kube-api-access-7bnhf\") pod \"machine-api-operator-5694c8668f-bszj2\" (UID: \"6f99a811-543c-4b99-a394-9d941401efff\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-bszj2" Oct 09 13:51:29 crc kubenswrapper[4902]: I1009 13:51:29.812146 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hnfxk\" (UniqueName: \"kubernetes.io/projected/2425c7bf-9eeb-4255-bfd0-1d8ef07d835b-kube-api-access-hnfxk\") pod \"openshift-config-operator-7777fb866f-nhhpn\" (UID: \"2425c7bf-9eeb-4255-bfd0-1d8ef07d835b\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-nhhpn" Oct 09 13:51:29 crc kubenswrapper[4902]: I1009 13:51:29.833946 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8nwq5\" (UniqueName: \"kubernetes.io/projected/28116576-5069-4dd6-90f1-31582eda88df-kube-api-access-8nwq5\") pod \"oauth-openshift-558db77b4-jw7l8\" (UID: \"28116576-5069-4dd6-90f1-31582eda88df\") " pod="openshift-authentication/oauth-openshift-558db77b4-jw7l8" Oct 09 13:51:29 crc kubenswrapper[4902]: I1009 13:51:29.839077 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rq4g9" Oct 09 13:51:29 crc kubenswrapper[4902]: I1009 13:51:29.857991 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l2m9v\" (UniqueName: \"kubernetes.io/projected/0964a12e-7b75-401b-9547-49e5a924ef0b-kube-api-access-l2m9v\") pod \"cluster-image-registry-operator-dc59b4c8b-vdsh5\" (UID: \"0964a12e-7b75-401b-9547-49e5a924ef0b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-vdsh5" Oct 09 13:51:29 crc kubenswrapper[4902]: I1009 13:51:29.864458 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-jw7l8" Oct 09 13:51:29 crc kubenswrapper[4902]: I1009 13:51:29.866379 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8l7cj" event={"ID":"ff2ea387-8171-4259-ac56-f864a65f105f","Type":"ContainerStarted","Data":"769259e2d8ef5c71e34b39a774a036c262c202d1ab8f6ffcaf49f55bc690973a"} Oct 09 13:51:29 crc kubenswrapper[4902]: I1009 13:51:29.866459 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-ltv8d"] Oct 09 13:51:29 crc kubenswrapper[4902]: I1009 13:51:29.868049 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"700d1ae12e7d48e3cb31de7169543324efd385dad6ab6f2d4351386718a14bc7"} Oct 09 13:51:29 crc kubenswrapper[4902]: I1009 13:51:29.868105 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"84a72dff2a6e36900e767b16de10bb5779ac56612ae73d8718b28118f6c6213f"} Oct 09 13:51:29 crc kubenswrapper[4902]: I1009 13:51:29.868354 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 13:51:29 crc kubenswrapper[4902]: I1009 13:51:29.880288 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"76622bd2415e50ce5e2db0957245243a055c522e8309f557b6cdde66b31ace0d"} Oct 09 13:51:29 crc kubenswrapper[4902]: I1009 13:51:29.880344 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"8641c98263aaf2f90dafcf23a40dc760c87aa1135ba12cf152ea381cb9710025"} Oct 09 13:51:29 crc kubenswrapper[4902]: I1009 13:51:29.885016 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ttk5q" Oct 09 13:51:29 crc kubenswrapper[4902]: I1009 13:51:29.899991 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"9cb7b0c6864dd51bb1a47854726c1ea90a8ecbe67c22fb055bacde97bd38861c"} Oct 09 13:51:29 crc kubenswrapper[4902]: I1009 13:51:29.900048 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"c4cbb394f3809e59ce95f503add434a50c7c8f13b72c430b09cd299f4c62fca0"} Oct 09 13:51:29 crc kubenswrapper[4902]: I1009 13:51:29.901914 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4d3c809f-9893-4b55-bb22-759885ab8a31-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-kkvj6\" (UID: \"4d3c809f-9893-4b55-bb22-759885ab8a31\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-kkvj6" Oct 09 13:51:29 crc kubenswrapper[4902]: I1009 13:51:29.901945 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/22155c67-aed1-4a35-989a-2777d69b3ee5-bound-sa-token\") pod \"ingress-operator-5b745b69d9-fsvms\" (UID: \"22155c67-aed1-4a35-989a-2777d69b3ee5\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-fsvms" Oct 09 13:51:29 crc kubenswrapper[4902]: I1009 13:51:29.901968 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/836dca77-634b-42e7-bf76-74b582e0969d-registry-certificates\") pod \"image-registry-697d97f7c8-q76zz\" (UID: \"836dca77-634b-42e7-bf76-74b582e0969d\") " pod="openshift-image-registry/image-registry-697d97f7c8-q76zz" Oct 09 13:51:29 crc kubenswrapper[4902]: I1009 13:51:29.902012 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e8b3f1d0-56c3-4592-9346-4282c209c0b2-proxy-tls\") pod \"machine-config-operator-74547568cd-qs2nx\" (UID: \"e8b3f1d0-56c3-4592-9346-4282c209c0b2\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qs2nx" Oct 09 13:51:29 crc kubenswrapper[4902]: I1009 13:51:29.902036 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6jd79\" (UniqueName: \"kubernetes.io/projected/7ad909d3-8d75-49d3-83c1-3ef15b47d08d-kube-api-access-6jd79\") pod \"kube-storage-version-migrator-operator-b67b599dd-45ck4\" (UID: \"7ad909d3-8d75-49d3-83c1-3ef15b47d08d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-45ck4" Oct 09 13:51:29 crc kubenswrapper[4902]: I1009 13:51:29.902070 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1fd89f10-e85a-4a04-9483-cbf8152d1ab5-etcd-client\") pod \"etcd-operator-b45778765-b2mv7\" (UID: \"1fd89f10-e85a-4a04-9483-cbf8152d1ab5\") " pod="openshift-etcd-operator/etcd-operator-b45778765-b2mv7" Oct 09 13:51:29 crc kubenswrapper[4902]: I1009 
13:51:29.902094 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bde9070e-6d53-4cbe-9542-4dc5ba33a16d-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-xd6rl\" (UID: \"bde9070e-6d53-4cbe-9542-4dc5ba33a16d\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-xd6rl" Oct 09 13:51:29 crc kubenswrapper[4902]: I1009 13:51:29.902115 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/836dca77-634b-42e7-bf76-74b582e0969d-bound-sa-token\") pod \"image-registry-697d97f7c8-q76zz\" (UID: \"836dca77-634b-42e7-bf76-74b582e0969d\") " pod="openshift-image-registry/image-registry-697d97f7c8-q76zz" Oct 09 13:51:29 crc kubenswrapper[4902]: I1009 13:51:29.902139 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mfvsl\" (UniqueName: \"kubernetes.io/projected/6b7c78b1-850e-44f5-b50e-2e2ed4c0bb9c-kube-api-access-mfvsl\") pod \"router-default-5444994796-qhd5t\" (UID: \"6b7c78b1-850e-44f5-b50e-2e2ed4c0bb9c\") " pod="openshift-ingress/router-default-5444994796-qhd5t" Oct 09 13:51:29 crc kubenswrapper[4902]: I1009 13:51:29.902186 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/6b7c78b1-850e-44f5-b50e-2e2ed4c0bb9c-default-certificate\") pod \"router-default-5444994796-qhd5t\" (UID: \"6b7c78b1-850e-44f5-b50e-2e2ed4c0bb9c\") " pod="openshift-ingress/router-default-5444994796-qhd5t" Oct 09 13:51:29 crc kubenswrapper[4902]: I1009 13:51:29.902207 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2d7mk\" (UniqueName: \"kubernetes.io/projected/836dca77-634b-42e7-bf76-74b582e0969d-kube-api-access-2d7mk\") pod \"image-registry-697d97f7c8-q76zz\" (UID: \"836dca77-634b-42e7-bf76-74b582e0969d\") " pod="openshift-image-registry/image-registry-697d97f7c8-q76zz" Oct 09 13:51:29 crc kubenswrapper[4902]: I1009 13:51:29.902229 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ad909d3-8d75-49d3-83c1-3ef15b47d08d-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-45ck4\" (UID: \"7ad909d3-8d75-49d3-83c1-3ef15b47d08d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-45ck4" Oct 09 13:51:29 crc kubenswrapper[4902]: I1009 13:51:29.902263 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/1fd89f10-e85a-4a04-9483-cbf8152d1ab5-etcd-service-ca\") pod \"etcd-operator-b45778765-b2mv7\" (UID: \"1fd89f10-e85a-4a04-9483-cbf8152d1ab5\") " pod="openshift-etcd-operator/etcd-operator-b45778765-b2mv7" Oct 09 13:51:29 crc kubenswrapper[4902]: I1009 13:51:29.902298 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1fd89f10-e85a-4a04-9483-cbf8152d1ab5-config\") pod \"etcd-operator-b45778765-b2mv7\" (UID: \"1fd89f10-e85a-4a04-9483-cbf8152d1ab5\") " pod="openshift-etcd-operator/etcd-operator-b45778765-b2mv7" Oct 09 13:51:29 crc kubenswrapper[4902]: I1009 13:51:29.902364 4902 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1fd89f10-e85a-4a04-9483-cbf8152d1ab5-serving-cert\") pod \"etcd-operator-b45778765-b2mv7\" (UID: \"1fd89f10-e85a-4a04-9483-cbf8152d1ab5\") " pod="openshift-etcd-operator/etcd-operator-b45778765-b2mv7" Oct 09 13:51:29 crc kubenswrapper[4902]: I1009 13:51:29.902384 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7ad909d3-8d75-49d3-83c1-3ef15b47d08d-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-45ck4\" (UID: \"7ad909d3-8d75-49d3-83c1-3ef15b47d08d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-45ck4" Oct 09 13:51:29 crc kubenswrapper[4902]: I1009 13:51:29.902458 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q76zz\" (UID: \"836dca77-634b-42e7-bf76-74b582e0969d\") " pod="openshift-image-registry/image-registry-697d97f7c8-q76zz" Oct 09 13:51:29 crc kubenswrapper[4902]: I1009 13:51:29.902482 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/814706d7-474c-4b19-a3c2-5785ae74b60c-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-8db2k\" (UID: \"814706d7-474c-4b19-a3c2-5785ae74b60c\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-8db2k" Oct 09 13:51:29 crc kubenswrapper[4902]: I1009 13:51:29.902513 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/22155c67-aed1-4a35-989a-2777d69b3ee5-trusted-ca\") pod \"ingress-operator-5b745b69d9-fsvms\" (UID: \"22155c67-aed1-4a35-989a-2777d69b3ee5\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-fsvms" Oct 09 13:51:29 crc kubenswrapper[4902]: I1009 13:51:29.902531 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6b7c78b1-850e-44f5-b50e-2e2ed4c0bb9c-metrics-certs\") pod \"router-default-5444994796-qhd5t\" (UID: \"6b7c78b1-850e-44f5-b50e-2e2ed4c0bb9c\") " pod="openshift-ingress/router-default-5444994796-qhd5t" Oct 09 13:51:29 crc kubenswrapper[4902]: I1009 13:51:29.902550 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ltgvf\" (UniqueName: \"kubernetes.io/projected/22155c67-aed1-4a35-989a-2777d69b3ee5-kube-api-access-ltgvf\") pod \"ingress-operator-5b745b69d9-fsvms\" (UID: \"22155c67-aed1-4a35-989a-2777d69b3ee5\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-fsvms" Oct 09 13:51:29 crc kubenswrapper[4902]: I1009 13:51:29.902572 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d6d59de9-a767-42f2-ae5d-e31667c52966-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-8nknx\" (UID: \"d6d59de9-a767-42f2-ae5d-e31667c52966\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8nknx" Oct 09 13:51:29 
crc kubenswrapper[4902]: I1009 13:51:29.902622 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9x9qx\" (UniqueName: \"kubernetes.io/projected/e8b3f1d0-56c3-4592-9346-4282c209c0b2-kube-api-access-9x9qx\") pod \"machine-config-operator-74547568cd-qs2nx\" (UID: \"e8b3f1d0-56c3-4592-9346-4282c209c0b2\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qs2nx" Oct 09 13:51:29 crc kubenswrapper[4902]: I1009 13:51:29.902649 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/814706d7-474c-4b19-a3c2-5785ae74b60c-config\") pod \"kube-controller-manager-operator-78b949d7b-8db2k\" (UID: \"814706d7-474c-4b19-a3c2-5785ae74b60c\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-8db2k" Oct 09 13:51:29 crc kubenswrapper[4902]: I1009 13:51:29.902714 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2mwhf\" (UniqueName: \"kubernetes.io/projected/bde9070e-6d53-4cbe-9542-4dc5ba33a16d-kube-api-access-2mwhf\") pod \"openshift-controller-manager-operator-756b6f6bc6-xd6rl\" (UID: \"bde9070e-6d53-4cbe-9542-4dc5ba33a16d\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-xd6rl" Oct 09 13:51:29 crc kubenswrapper[4902]: I1009 13:51:29.902737 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bw7z5\" (UniqueName: \"kubernetes.io/projected/07f980b9-45c2-48c2-a84e-225ca4b82a77-kube-api-access-bw7z5\") pod \"migrator-59844c95c7-jgbgc\" (UID: \"07f980b9-45c2-48c2-a84e-225ca4b82a77\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-jgbgc" Oct 09 13:51:29 crc kubenswrapper[4902]: I1009 13:51:29.902793 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2kf5b\" (UniqueName: \"kubernetes.io/projected/36774cc0-04ea-4cab-9385-e9dc295cf63f-kube-api-access-2kf5b\") pod \"dns-operator-744455d44c-n2nl8\" (UID: \"36774cc0-04ea-4cab-9385-e9dc295cf63f\") " pod="openshift-dns-operator/dns-operator-744455d44c-n2nl8" Oct 09 13:51:29 crc kubenswrapper[4902]: I1009 13:51:29.902892 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/836dca77-634b-42e7-bf76-74b582e0969d-trusted-ca\") pod \"image-registry-697d97f7c8-q76zz\" (UID: \"836dca77-634b-42e7-bf76-74b582e0969d\") " pod="openshift-image-registry/image-registry-697d97f7c8-q76zz" Oct 09 13:51:29 crc kubenswrapper[4902]: I1009 13:51:29.902964 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/6b7c78b1-850e-44f5-b50e-2e2ed4c0bb9c-stats-auth\") pod \"router-default-5444994796-qhd5t\" (UID: \"6b7c78b1-850e-44f5-b50e-2e2ed4c0bb9c\") " pod="openshift-ingress/router-default-5444994796-qhd5t" Oct 09 13:51:29 crc kubenswrapper[4902]: E1009 13:51:29.906125 4902 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 13:51:30.406107146 +0000 UTC m=+37.603966210 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q76zz" (UID: "836dca77-634b-42e7-bf76-74b582e0969d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 13:51:29 crc kubenswrapper[4902]: I1009 13:51:29.906708 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/836dca77-634b-42e7-bf76-74b582e0969d-ca-trust-extracted\") pod \"image-registry-697d97f7c8-q76zz\" (UID: \"836dca77-634b-42e7-bf76-74b582e0969d\") " pod="openshift-image-registry/image-registry-697d97f7c8-q76zz" Oct 09 13:51:29 crc kubenswrapper[4902]: I1009 13:51:29.906948 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d6d59de9-a767-42f2-ae5d-e31667c52966-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-8nknx\" (UID: \"d6d59de9-a767-42f2-ae5d-e31667c52966\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8nknx" Oct 09 13:51:29 crc kubenswrapper[4902]: I1009 13:51:29.906987 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bde9070e-6d53-4cbe-9542-4dc5ba33a16d-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-xd6rl\" (UID: \"bde9070e-6d53-4cbe-9542-4dc5ba33a16d\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-xd6rl" Oct 09 13:51:29 crc kubenswrapper[4902]: I1009 13:51:29.908332 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/36774cc0-04ea-4cab-9385-e9dc295cf63f-metrics-tls\") pod \"dns-operator-744455d44c-n2nl8\" (UID: \"36774cc0-04ea-4cab-9385-e9dc295cf63f\") " pod="openshift-dns-operator/dns-operator-744455d44c-n2nl8" Oct 09 13:51:29 crc kubenswrapper[4902]: I1009 13:51:29.909380 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/836dca77-634b-42e7-bf76-74b582e0969d-registry-tls\") pod \"image-registry-697d97f7c8-q76zz\" (UID: \"836dca77-634b-42e7-bf76-74b582e0969d\") " pod="openshift-image-registry/image-registry-697d97f7c8-q76zz" Oct 09 13:51:29 crc kubenswrapper[4902]: I1009 13:51:29.909459 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/e8b3f1d0-56c3-4592-9346-4282c209c0b2-images\") pod \"machine-config-operator-74547568cd-qs2nx\" (UID: \"e8b3f1d0-56c3-4592-9346-4282c209c0b2\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qs2nx" Oct 09 13:51:29 crc kubenswrapper[4902]: I1009 13:51:29.909690 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/22155c67-aed1-4a35-989a-2777d69b3ee5-metrics-tls\") pod \"ingress-operator-5b745b69d9-fsvms\" (UID: \"22155c67-aed1-4a35-989a-2777d69b3ee5\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-fsvms" Oct 09 13:51:29 
crc kubenswrapper[4902]: I1009 13:51:29.909803 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/836dca77-634b-42e7-bf76-74b582e0969d-installation-pull-secrets\") pod \"image-registry-697d97f7c8-q76zz\" (UID: \"836dca77-634b-42e7-bf76-74b582e0969d\") " pod="openshift-image-registry/image-registry-697d97f7c8-q76zz" Oct 09 13:51:29 crc kubenswrapper[4902]: I1009 13:51:29.909905 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4d3c809f-9893-4b55-bb22-759885ab8a31-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-kkvj6\" (UID: \"4d3c809f-9893-4b55-bb22-759885ab8a31\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-kkvj6" Oct 09 13:51:29 crc kubenswrapper[4902]: I1009 13:51:29.910059 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/1fd89f10-e85a-4a04-9483-cbf8152d1ab5-etcd-ca\") pod \"etcd-operator-b45778765-b2mv7\" (UID: \"1fd89f10-e85a-4a04-9483-cbf8152d1ab5\") " pod="openshift-etcd-operator/etcd-operator-b45778765-b2mv7" Oct 09 13:51:29 crc kubenswrapper[4902]: I1009 13:51:29.910440 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/814706d7-474c-4b19-a3c2-5785ae74b60c-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-8db2k\" (UID: \"814706d7-474c-4b19-a3c2-5785ae74b60c\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-8db2k" Oct 09 13:51:29 crc kubenswrapper[4902]: I1009 13:51:29.910498 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6b7c78b1-850e-44f5-b50e-2e2ed4c0bb9c-service-ca-bundle\") pod \"router-default-5444994796-qhd5t\" (UID: \"6b7c78b1-850e-44f5-b50e-2e2ed4c0bb9c\") " pod="openshift-ingress/router-default-5444994796-qhd5t" Oct 09 13:51:29 crc kubenswrapper[4902]: I1009 13:51:29.910529 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d6d59de9-a767-42f2-ae5d-e31667c52966-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-8nknx\" (UID: \"d6d59de9-a767-42f2-ae5d-e31667c52966\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8nknx" Oct 09 13:51:29 crc kubenswrapper[4902]: I1009 13:51:29.910995 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e8b3f1d0-56c3-4592-9346-4282c209c0b2-auth-proxy-config\") pod \"machine-config-operator-74547568cd-qs2nx\" (UID: \"e8b3f1d0-56c3-4592-9346-4282c209c0b2\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qs2nx" Oct 09 13:51:29 crc kubenswrapper[4902]: I1009 13:51:29.911817 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4d3c809f-9893-4b55-bb22-759885ab8a31-config\") pod \"kube-apiserver-operator-766d6c64bb-kkvj6\" (UID: \"4d3c809f-9893-4b55-bb22-759885ab8a31\") " 
pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-kkvj6" Oct 09 13:51:29 crc kubenswrapper[4902]: I1009 13:51:29.911856 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mnx82\" (UniqueName: \"kubernetes.io/projected/1fd89f10-e85a-4a04-9483-cbf8152d1ab5-kube-api-access-mnx82\") pod \"etcd-operator-b45778765-b2mv7\" (UID: \"1fd89f10-e85a-4a04-9483-cbf8152d1ab5\") " pod="openshift-etcd-operator/etcd-operator-b45778765-b2mv7" Oct 09 13:51:29 crc kubenswrapper[4902]: I1009 13:51:29.912802 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-lm5vg" Oct 09 13:51:29 crc kubenswrapper[4902]: I1009 13:51:29.965901 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-bszj2" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.024454 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.024651 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/a2933475-7af8-41e3-9389-114c1969b030-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-qvwhz\" (UID: \"a2933475-7af8-41e3-9389-114c1969b030\") " pod="openshift-marketplace/marketplace-operator-79b997595-qvwhz" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.024673 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/4f73c993-5f7c-46c4-92b1-f0b2f2ee2a8b-node-bootstrap-token\") pod \"machine-config-server-997wv\" (UID: \"4f73c993-5f7c-46c4-92b1-f0b2f2ee2a8b\") " pod="openshift-machine-config-operator/machine-config-server-997wv" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.024697 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p78bs\" (UniqueName: \"kubernetes.io/projected/472a3084-0b59-487f-b179-bfe5fa35f4a9-kube-api-access-p78bs\") pod \"collect-profiles-29333625-8hwfc\" (UID: \"472a3084-0b59-487f-b179-bfe5fa35f4a9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333625-8hwfc" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.024756 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1fd89f10-e85a-4a04-9483-cbf8152d1ab5-config\") pod \"etcd-operator-b45778765-b2mv7\" (UID: \"1fd89f10-e85a-4a04-9483-cbf8152d1ab5\") " pod="openshift-etcd-operator/etcd-operator-b45778765-b2mv7" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.024776 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/b5ae0ea3-4302-4125-aee7-b6ee8276a000-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-2224d\" (UID: \"b5ae0ea3-4302-4125-aee7-b6ee8276a000\") " 
pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2224d" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.024795 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8fmk2\" (UniqueName: \"kubernetes.io/projected/71406502-8050-4f28-bc5f-b5bbeeafd52f-kube-api-access-8fmk2\") pod \"csi-hostpathplugin-snptm\" (UID: \"71406502-8050-4f28-bc5f-b5bbeeafd52f\") " pod="hostpath-provisioner/csi-hostpathplugin-snptm" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.024814 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a2933475-7af8-41e3-9389-114c1969b030-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-qvwhz\" (UID: \"a2933475-7af8-41e3-9389-114c1969b030\") " pod="openshift-marketplace/marketplace-operator-79b997595-qvwhz" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.024840 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1fd89f10-e85a-4a04-9483-cbf8152d1ab5-serving-cert\") pod \"etcd-operator-b45778765-b2mv7\" (UID: \"1fd89f10-e85a-4a04-9483-cbf8152d1ab5\") " pod="openshift-etcd-operator/etcd-operator-b45778765-b2mv7" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.024857 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7ad909d3-8d75-49d3-83c1-3ef15b47d08d-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-45ck4\" (UID: \"7ad909d3-8d75-49d3-83c1-3ef15b47d08d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-45ck4" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.024874 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4c13d916-e588-4488-af07-82d5990cba9e-metrics-tls\") pod \"dns-default-jcxjn\" (UID: \"4c13d916-e588-4488-af07-82d5990cba9e\") " pod="openshift-dns/dns-default-jcxjn" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.024899 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/814706d7-474c-4b19-a3c2-5785ae74b60c-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-8db2k\" (UID: \"814706d7-474c-4b19-a3c2-5785ae74b60c\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-8db2k" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.024916 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/71406502-8050-4f28-bc5f-b5bbeeafd52f-plugins-dir\") pod \"csi-hostpathplugin-snptm\" (UID: \"71406502-8050-4f28-bc5f-b5bbeeafd52f\") " pod="hostpath-provisioner/csi-hostpathplugin-snptm" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.024935 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/22155c67-aed1-4a35-989a-2777d69b3ee5-trusted-ca\") pod \"ingress-operator-5b745b69d9-fsvms\" (UID: \"22155c67-aed1-4a35-989a-2777d69b3ee5\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-fsvms" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.024970 4902 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6b7c78b1-850e-44f5-b50e-2e2ed4c0bb9c-metrics-certs\") pod \"router-default-5444994796-qhd5t\" (UID: \"6b7c78b1-850e-44f5-b50e-2e2ed4c0bb9c\") " pod="openshift-ingress/router-default-5444994796-qhd5t" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.024987 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ltgvf\" (UniqueName: \"kubernetes.io/projected/22155c67-aed1-4a35-989a-2777d69b3ee5-kube-api-access-ltgvf\") pod \"ingress-operator-5b745b69d9-fsvms\" (UID: \"22155c67-aed1-4a35-989a-2777d69b3ee5\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-fsvms" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.025002 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/84438d18-f42f-44cd-8129-7dfd9edfed87-srv-cert\") pod \"olm-operator-6b444d44fb-7wjtm\" (UID: \"84438d18-f42f-44cd-8129-7dfd9edfed87\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7wjtm" Oct 09 13:51:30 crc kubenswrapper[4902]: E1009 13:51:30.025421 4902 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 13:51:30.525400902 +0000 UTC m=+37.723259966 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.026925 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1fd89f10-e85a-4a04-9483-cbf8152d1ab5-config\") pod \"etcd-operator-b45778765-b2mv7\" (UID: \"1fd89f10-e85a-4a04-9483-cbf8152d1ab5\") " pod="openshift-etcd-operator/etcd-operator-b45778765-b2mv7" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.027289 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d6d59de9-a767-42f2-ae5d-e31667c52966-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-8nknx\" (UID: \"d6d59de9-a767-42f2-ae5d-e31667c52966\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8nknx" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.027321 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-96k49\" (UniqueName: \"kubernetes.io/projected/8621111f-521a-48d9-886b-20299f771b70-kube-api-access-96k49\") pod \"machine-config-controller-84d6567774-kxnmp\" (UID: \"8621111f-521a-48d9-886b-20299f771b70\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-kxnmp" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.027363 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: 
\"kubernetes.io/host-path/71406502-8050-4f28-bc5f-b5bbeeafd52f-mountpoint-dir\") pod \"csi-hostpathplugin-snptm\" (UID: \"71406502-8050-4f28-bc5f-b5bbeeafd52f\") " pod="hostpath-provisioner/csi-hostpathplugin-snptm" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.027394 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4251505b-3c9f-4b37-8805-8d984504a89e-cert\") pod \"ingress-canary-fthsz\" (UID: \"4251505b-3c9f-4b37-8805-8d984504a89e\") " pod="openshift-ingress-canary/ingress-canary-fthsz" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.027443 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9x9qx\" (UniqueName: \"kubernetes.io/projected/e8b3f1d0-56c3-4592-9346-4282c209c0b2-kube-api-access-9x9qx\") pod \"machine-config-operator-74547568cd-qs2nx\" (UID: \"e8b3f1d0-56c3-4592-9346-4282c209c0b2\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qs2nx" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.027459 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/814706d7-474c-4b19-a3c2-5785ae74b60c-config\") pod \"kube-controller-manager-operator-78b949d7b-8db2k\" (UID: \"814706d7-474c-4b19-a3c2-5785ae74b60c\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-8db2k" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.027477 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9z8b8\" (UniqueName: \"kubernetes.io/projected/4bc89116-ce17-406a-9f82-b40535555c7f-kube-api-access-9z8b8\") pod \"packageserver-d55dfcdfc-th8gh\" (UID: \"4bc89116-ce17-406a-9f82-b40535555c7f\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-th8gh" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.027524 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4ff922ea-aac7-4128-b619-658b4f44ce6e-config\") pod \"service-ca-operator-777779d784-52czq\" (UID: \"4ff922ea-aac7-4128-b619-658b4f44ce6e\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-52czq" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.027543 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hcbtb\" (UniqueName: \"kubernetes.io/projected/a2933475-7af8-41e3-9389-114c1969b030-kube-api-access-hcbtb\") pod \"marketplace-operator-79b997595-qvwhz\" (UID: \"a2933475-7af8-41e3-9389-114c1969b030\") " pod="openshift-marketplace/marketplace-operator-79b997595-qvwhz" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.027565 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6fdr9\" (UniqueName: \"kubernetes.io/projected/534001de-19e9-45bd-b2fe-42b9521447a0-kube-api-access-6fdr9\") pod \"service-ca-9c57cc56f-f47nt\" (UID: \"534001de-19e9-45bd-b2fe-42b9521447a0\") " pod="openshift-service-ca/service-ca-9c57cc56f-f47nt" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.027589 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bw7z5\" (UniqueName: \"kubernetes.io/projected/07f980b9-45c2-48c2-a84e-225ca4b82a77-kube-api-access-bw7z5\") pod 
\"migrator-59844c95c7-jgbgc\" (UID: \"07f980b9-45c2-48c2-a84e-225ca4b82a77\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-jgbgc" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.027640 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2mwhf\" (UniqueName: \"kubernetes.io/projected/bde9070e-6d53-4cbe-9542-4dc5ba33a16d-kube-api-access-2mwhf\") pod \"openshift-controller-manager-operator-756b6f6bc6-xd6rl\" (UID: \"bde9070e-6d53-4cbe-9542-4dc5ba33a16d\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-xd6rl" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.027672 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2kf5b\" (UniqueName: \"kubernetes.io/projected/36774cc0-04ea-4cab-9385-e9dc295cf63f-kube-api-access-2kf5b\") pod \"dns-operator-744455d44c-n2nl8\" (UID: \"36774cc0-04ea-4cab-9385-e9dc295cf63f\") " pod="openshift-dns-operator/dns-operator-744455d44c-n2nl8" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.027690 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6g72b\" (UniqueName: \"kubernetes.io/projected/93785db6-c7f9-4d9e-9407-b3653a9aa360-kube-api-access-6g72b\") pod \"cni-sysctl-allowlist-ds-d6wkv\" (UID: \"93785db6-c7f9-4d9e-9407-b3653a9aa360\") " pod="openshift-multus/cni-sysctl-allowlist-ds-d6wkv" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.027704 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/4bc89116-ce17-406a-9f82-b40535555c7f-tmpfs\") pod \"packageserver-d55dfcdfc-th8gh\" (UID: \"4bc89116-ce17-406a-9f82-b40535555c7f\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-th8gh" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.027726 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/836dca77-634b-42e7-bf76-74b582e0969d-trusted-ca\") pod \"image-registry-697d97f7c8-q76zz\" (UID: \"836dca77-634b-42e7-bf76-74b582e0969d\") " pod="openshift-image-registry/image-registry-697d97f7c8-q76zz" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.027744 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/6b7c78b1-850e-44f5-b50e-2e2ed4c0bb9c-stats-auth\") pod \"router-default-5444994796-qhd5t\" (UID: \"6b7c78b1-850e-44f5-b50e-2e2ed4c0bb9c\") " pod="openshift-ingress/router-default-5444994796-qhd5t" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.027764 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/836dca77-634b-42e7-bf76-74b582e0969d-ca-trust-extracted\") pod \"image-registry-697d97f7c8-q76zz\" (UID: \"836dca77-634b-42e7-bf76-74b582e0969d\") " pod="openshift-image-registry/image-registry-697d97f7c8-q76zz" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.027784 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/472a3084-0b59-487f-b179-bfe5fa35f4a9-config-volume\") pod \"collect-profiles-29333625-8hwfc\" (UID: \"472a3084-0b59-487f-b179-bfe5fa35f4a9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333625-8hwfc" 
Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.027884 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/534001de-19e9-45bd-b2fe-42b9521447a0-signing-key\") pod \"service-ca-9c57cc56f-f47nt\" (UID: \"534001de-19e9-45bd-b2fe-42b9521447a0\") " pod="openshift-service-ca/service-ca-9c57cc56f-f47nt" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.027903 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vbxlz\" (UniqueName: \"kubernetes.io/projected/70af937a-5c69-4893-a069-9d968f1b1b9c-kube-api-access-vbxlz\") pod \"catalog-operator-68c6474976-4rdw2\" (UID: \"70af937a-5c69-4893-a069-9d968f1b1b9c\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4rdw2" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.027919 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/93785db6-c7f9-4d9e-9407-b3653a9aa360-ready\") pod \"cni-sysctl-allowlist-ds-d6wkv\" (UID: \"93785db6-c7f9-4d9e-9407-b3653a9aa360\") " pod="openshift-multus/cni-sysctl-allowlist-ds-d6wkv" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.027934 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/4bc89116-ce17-406a-9f82-b40535555c7f-webhook-cert\") pod \"packageserver-d55dfcdfc-th8gh\" (UID: \"4bc89116-ce17-406a-9f82-b40535555c7f\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-th8gh" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.027966 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bde9070e-6d53-4cbe-9542-4dc5ba33a16d-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-xd6rl\" (UID: \"bde9070e-6d53-4cbe-9542-4dc5ba33a16d\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-xd6rl" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.027982 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d6d59de9-a767-42f2-ae5d-e31667c52966-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-8nknx\" (UID: \"d6d59de9-a767-42f2-ae5d-e31667c52966\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8nknx" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.028000 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/93785db6-c7f9-4d9e-9407-b3653a9aa360-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-d6wkv\" (UID: \"93785db6-c7f9-4d9e-9407-b3653a9aa360\") " pod="openshift-multus/cni-sysctl-allowlist-ds-d6wkv" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.028020 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/36774cc0-04ea-4cab-9385-e9dc295cf63f-metrics-tls\") pod \"dns-operator-744455d44c-n2nl8\" (UID: \"36774cc0-04ea-4cab-9385-e9dc295cf63f\") " pod="openshift-dns-operator/dns-operator-744455d44c-n2nl8" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.028065 4902 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f5kb6\" (UniqueName: \"kubernetes.io/projected/84438d18-f42f-44cd-8129-7dfd9edfed87-kube-api-access-f5kb6\") pod \"olm-operator-6b444d44fb-7wjtm\" (UID: \"84438d18-f42f-44cd-8129-7dfd9edfed87\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7wjtm" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.028084 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/4f73c993-5f7c-46c4-92b1-f0b2f2ee2a8b-certs\") pod \"machine-config-server-997wv\" (UID: \"4f73c993-5f7c-46c4-92b1-f0b2f2ee2a8b\") " pod="openshift-machine-config-operator/machine-config-server-997wv" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.028102 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/836dca77-634b-42e7-bf76-74b582e0969d-registry-tls\") pod \"image-registry-697d97f7c8-q76zz\" (UID: \"836dca77-634b-42e7-bf76-74b582e0969d\") " pod="openshift-image-registry/image-registry-697d97f7c8-q76zz" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.028119 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/e8b3f1d0-56c3-4592-9346-4282c209c0b2-images\") pod \"machine-config-operator-74547568cd-qs2nx\" (UID: \"e8b3f1d0-56c3-4592-9346-4282c209c0b2\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qs2nx" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.028135 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cslqr\" (UniqueName: \"kubernetes.io/projected/3c2d5abc-8893-4dda-b7a9-1de1560e9381-kube-api-access-cslqr\") pod \"multus-admission-controller-857f4d67dd-qb2lj\" (UID: \"3c2d5abc-8893-4dda-b7a9-1de1560e9381\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-qb2lj" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.028154 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gssmw\" (UniqueName: \"kubernetes.io/projected/b5ae0ea3-4302-4125-aee7-b6ee8276a000-kube-api-access-gssmw\") pod \"package-server-manager-789f6589d5-2224d\" (UID: \"b5ae0ea3-4302-4125-aee7-b6ee8276a000\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2224d" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.028174 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/22155c67-aed1-4a35-989a-2777d69b3ee5-metrics-tls\") pod \"ingress-operator-5b745b69d9-fsvms\" (UID: \"22155c67-aed1-4a35-989a-2777d69b3ee5\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-fsvms" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.028192 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/836dca77-634b-42e7-bf76-74b582e0969d-installation-pull-secrets\") pod \"image-registry-697d97f7c8-q76zz\" (UID: \"836dca77-634b-42e7-bf76-74b582e0969d\") " pod="openshift-image-registry/image-registry-697d97f7c8-q76zz" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.028209 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/4c13d916-e588-4488-af07-82d5990cba9e-config-volume\") pod \"dns-default-jcxjn\" (UID: \"4c13d916-e588-4488-af07-82d5990cba9e\") " pod="openshift-dns/dns-default-jcxjn" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.028225 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q29vg\" (UniqueName: \"kubernetes.io/projected/4ff922ea-aac7-4128-b619-658b4f44ce6e-kube-api-access-q29vg\") pod \"service-ca-operator-777779d784-52czq\" (UID: \"4ff922ea-aac7-4128-b619-658b4f44ce6e\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-52czq" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.028243 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4d3c809f-9893-4b55-bb22-759885ab8a31-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-kkvj6\" (UID: \"4d3c809f-9893-4b55-bb22-759885ab8a31\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-kkvj6" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.028260 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/1fd89f10-e85a-4a04-9483-cbf8152d1ab5-etcd-ca\") pod \"etcd-operator-b45778765-b2mv7\" (UID: \"1fd89f10-e85a-4a04-9483-cbf8152d1ab5\") " pod="openshift-etcd-operator/etcd-operator-b45778765-b2mv7" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.028276 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/3c2d5abc-8893-4dda-b7a9-1de1560e9381-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-qb2lj\" (UID: \"3c2d5abc-8893-4dda-b7a9-1de1560e9381\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-qb2lj" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.028293 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/71406502-8050-4f28-bc5f-b5bbeeafd52f-registration-dir\") pod \"csi-hostpathplugin-snptm\" (UID: \"71406502-8050-4f28-bc5f-b5bbeeafd52f\") " pod="hostpath-provisioner/csi-hostpathplugin-snptm" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.028321 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d6d59de9-a767-42f2-ae5d-e31667c52966-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-8nknx\" (UID: \"d6d59de9-a767-42f2-ae5d-e31667c52966\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8nknx" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.028336 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/534001de-19e9-45bd-b2fe-42b9521447a0-signing-cabundle\") pod \"service-ca-9c57cc56f-f47nt\" (UID: \"534001de-19e9-45bd-b2fe-42b9521447a0\") " pod="openshift-service-ca/service-ca-9c57cc56f-f47nt" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.028353 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/7ae518f0-243e-4916-89cb-0e621793d4db-control-plane-machine-set-operator-tls\") pod 
\"control-plane-machine-set-operator-78cbb6b69f-pqbtl\" (UID: \"7ae518f0-243e-4916-89cb-0e621793d4db\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-pqbtl" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.028383 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/814706d7-474c-4b19-a3c2-5785ae74b60c-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-8db2k\" (UID: \"814706d7-474c-4b19-a3c2-5785ae74b60c\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-8db2k" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.028398 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6b7c78b1-850e-44f5-b50e-2e2ed4c0bb9c-service-ca-bundle\") pod \"router-default-5444994796-qhd5t\" (UID: \"6b7c78b1-850e-44f5-b50e-2e2ed4c0bb9c\") " pod="openshift-ingress/router-default-5444994796-qhd5t" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.028419 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e8b3f1d0-56c3-4592-9346-4282c209c0b2-auth-proxy-config\") pod \"machine-config-operator-74547568cd-qs2nx\" (UID: \"e8b3f1d0-56c3-4592-9346-4282c209c0b2\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qs2nx" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.028491 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/93785db6-c7f9-4d9e-9407-b3653a9aa360-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-d6wkv\" (UID: \"93785db6-c7f9-4d9e-9407-b3653a9aa360\") " pod="openshift-multus/cni-sysctl-allowlist-ds-d6wkv" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.028514 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/472a3084-0b59-487f-b179-bfe5fa35f4a9-secret-volume\") pod \"collect-profiles-29333625-8hwfc\" (UID: \"472a3084-0b59-487f-b179-bfe5fa35f4a9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333625-8hwfc" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.028532 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dr8d5\" (UniqueName: \"kubernetes.io/projected/4c13d916-e588-4488-af07-82d5990cba9e-kube-api-access-dr8d5\") pod \"dns-default-jcxjn\" (UID: \"4c13d916-e588-4488-af07-82d5990cba9e\") " pod="openshift-dns/dns-default-jcxjn" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.028560 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4d3c809f-9893-4b55-bb22-759885ab8a31-config\") pod \"kube-apiserver-operator-766d6c64bb-kkvj6\" (UID: \"4d3c809f-9893-4b55-bb22-759885ab8a31\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-kkvj6" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.028575 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mnx82\" (UniqueName: \"kubernetes.io/projected/1fd89f10-e85a-4a04-9483-cbf8152d1ab5-kube-api-access-mnx82\") pod \"etcd-operator-b45778765-b2mv7\" (UID: \"1fd89f10-e85a-4a04-9483-cbf8152d1ab5\") " 
pod="openshift-etcd-operator/etcd-operator-b45778765-b2mv7" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.028592 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/8621111f-521a-48d9-886b-20299f771b70-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-kxnmp\" (UID: \"8621111f-521a-48d9-886b-20299f771b70\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-kxnmp" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.028610 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4d3c809f-9893-4b55-bb22-759885ab8a31-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-kkvj6\" (UID: \"4d3c809f-9893-4b55-bb22-759885ab8a31\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-kkvj6" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.028640 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/22155c67-aed1-4a35-989a-2777d69b3ee5-bound-sa-token\") pod \"ingress-operator-5b745b69d9-fsvms\" (UID: \"22155c67-aed1-4a35-989a-2777d69b3ee5\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-fsvms" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.028657 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/836dca77-634b-42e7-bf76-74b582e0969d-registry-certificates\") pod \"image-registry-697d97f7c8-q76zz\" (UID: \"836dca77-634b-42e7-bf76-74b582e0969d\") " pod="openshift-image-registry/image-registry-697d97f7c8-q76zz" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.028675 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e8b3f1d0-56c3-4592-9346-4282c209c0b2-proxy-tls\") pod \"machine-config-operator-74547568cd-qs2nx\" (UID: \"e8b3f1d0-56c3-4592-9346-4282c209c0b2\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qs2nx" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.028691 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/70af937a-5c69-4893-a069-9d968f1b1b9c-srv-cert\") pod \"catalog-operator-68c6474976-4rdw2\" (UID: \"70af937a-5c69-4893-a069-9d968f1b1b9c\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4rdw2" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.028709 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6jd79\" (UniqueName: \"kubernetes.io/projected/7ad909d3-8d75-49d3-83c1-3ef15b47d08d-kube-api-access-6jd79\") pod \"kube-storage-version-migrator-operator-b67b599dd-45ck4\" (UID: \"7ad909d3-8d75-49d3-83c1-3ef15b47d08d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-45ck4" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.028724 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/70af937a-5c69-4893-a069-9d968f1b1b9c-profile-collector-cert\") pod \"catalog-operator-68c6474976-4rdw2\" (UID: \"70af937a-5c69-4893-a069-9d968f1b1b9c\") " 
pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4rdw2" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.028741 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lmrqj\" (UniqueName: \"kubernetes.io/projected/7ae518f0-243e-4916-89cb-0e621793d4db-kube-api-access-lmrqj\") pod \"control-plane-machine-set-operator-78cbb6b69f-pqbtl\" (UID: \"7ae518f0-243e-4916-89cb-0e621793d4db\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-pqbtl" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.028758 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/71406502-8050-4f28-bc5f-b5bbeeafd52f-csi-data-dir\") pod \"csi-hostpathplugin-snptm\" (UID: \"71406502-8050-4f28-bc5f-b5bbeeafd52f\") " pod="hostpath-provisioner/csi-hostpathplugin-snptm" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.028776 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bde9070e-6d53-4cbe-9542-4dc5ba33a16d-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-xd6rl\" (UID: \"bde9070e-6d53-4cbe-9542-4dc5ba33a16d\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-xd6rl" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.029916 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1fd89f10-e85a-4a04-9483-cbf8152d1ab5-etcd-client\") pod \"etcd-operator-b45778765-b2mv7\" (UID: \"1fd89f10-e85a-4a04-9483-cbf8152d1ab5\") " pod="openshift-etcd-operator/etcd-operator-b45778765-b2mv7" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.029942 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4ff922ea-aac7-4128-b619-658b4f44ce6e-serving-cert\") pod \"service-ca-operator-777779d784-52czq\" (UID: \"4ff922ea-aac7-4128-b619-658b4f44ce6e\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-52czq" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.029963 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/71406502-8050-4f28-bc5f-b5bbeeafd52f-socket-dir\") pod \"csi-hostpathplugin-snptm\" (UID: \"71406502-8050-4f28-bc5f-b5bbeeafd52f\") " pod="hostpath-provisioner/csi-hostpathplugin-snptm" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.029982 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/84438d18-f42f-44cd-8129-7dfd9edfed87-profile-collector-cert\") pod \"olm-operator-6b444d44fb-7wjtm\" (UID: \"84438d18-f42f-44cd-8129-7dfd9edfed87\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7wjtm" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.030005 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/836dca77-634b-42e7-bf76-74b582e0969d-bound-sa-token\") pod \"image-registry-697d97f7c8-q76zz\" (UID: \"836dca77-634b-42e7-bf76-74b582e0969d\") " pod="openshift-image-registry/image-registry-697d97f7c8-q76zz" Oct 09 13:51:30 crc kubenswrapper[4902]: 
I1009 13:51:30.030025 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mfvsl\" (UniqueName: \"kubernetes.io/projected/6b7c78b1-850e-44f5-b50e-2e2ed4c0bb9c-kube-api-access-mfvsl\") pod \"router-default-5444994796-qhd5t\" (UID: \"6b7c78b1-850e-44f5-b50e-2e2ed4c0bb9c\") " pod="openshift-ingress/router-default-5444994796-qhd5t" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.030053 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gpscf\" (UniqueName: \"kubernetes.io/projected/4251505b-3c9f-4b37-8805-8d984504a89e-kube-api-access-gpscf\") pod \"ingress-canary-fthsz\" (UID: \"4251505b-3c9f-4b37-8805-8d984504a89e\") " pod="openshift-ingress-canary/ingress-canary-fthsz" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.030071 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8621111f-521a-48d9-886b-20299f771b70-proxy-tls\") pod \"machine-config-controller-84d6567774-kxnmp\" (UID: \"8621111f-521a-48d9-886b-20299f771b70\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-kxnmp" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.037719 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/6b7c78b1-850e-44f5-b50e-2e2ed4c0bb9c-default-certificate\") pod \"router-default-5444994796-qhd5t\" (UID: \"6b7c78b1-850e-44f5-b50e-2e2ed4c0bb9c\") " pod="openshift-ingress/router-default-5444994796-qhd5t" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.037874 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/814706d7-474c-4b19-a3c2-5785ae74b60c-config\") pod \"kube-controller-manager-operator-78b949d7b-8db2k\" (UID: \"814706d7-474c-4b19-a3c2-5785ae74b60c\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-8db2k" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.038002 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/836dca77-634b-42e7-bf76-74b582e0969d-ca-trust-extracted\") pod \"image-registry-697d97f7c8-q76zz\" (UID: \"836dca77-634b-42e7-bf76-74b582e0969d\") " pod="openshift-image-registry/image-registry-697d97f7c8-q76zz" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.039508 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1fd89f10-e85a-4a04-9483-cbf8152d1ab5-serving-cert\") pod \"etcd-operator-b45778765-b2mv7\" (UID: \"1fd89f10-e85a-4a04-9483-cbf8152d1ab5\") " pod="openshift-etcd-operator/etcd-operator-b45778765-b2mv7" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.042085 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/814706d7-474c-4b19-a3c2-5785ae74b60c-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-8db2k\" (UID: \"814706d7-474c-4b19-a3c2-5785ae74b60c\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-8db2k" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.047393 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: 
\"kubernetes.io/configmap/e8b3f1d0-56c3-4592-9346-4282c209c0b2-images\") pod \"machine-config-operator-74547568cd-qs2nx\" (UID: \"e8b3f1d0-56c3-4592-9346-4282c209c0b2\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qs2nx" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.048888 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/836dca77-634b-42e7-bf76-74b582e0969d-registry-tls\") pod \"image-registry-697d97f7c8-q76zz\" (UID: \"836dca77-634b-42e7-bf76-74b582e0969d\") " pod="openshift-image-registry/image-registry-697d97f7c8-q76zz" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.049569 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/1fd89f10-e85a-4a04-9483-cbf8152d1ab5-etcd-ca\") pod \"etcd-operator-b45778765-b2mv7\" (UID: \"1fd89f10-e85a-4a04-9483-cbf8152d1ab5\") " pod="openshift-etcd-operator/etcd-operator-b45778765-b2mv7" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.051243 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6b7c78b1-850e-44f5-b50e-2e2ed4c0bb9c-service-ca-bundle\") pod \"router-default-5444994796-qhd5t\" (UID: \"6b7c78b1-850e-44f5-b50e-2e2ed4c0bb9c\") " pod="openshift-ingress/router-default-5444994796-qhd5t" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.052851 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e8b3f1d0-56c3-4592-9346-4282c209c0b2-auth-proxy-config\") pod \"machine-config-operator-74547568cd-qs2nx\" (UID: \"e8b3f1d0-56c3-4592-9346-4282c209c0b2\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qs2nx" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.056023 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d6d59de9-a767-42f2-ae5d-e31667c52966-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-8nknx\" (UID: \"d6d59de9-a767-42f2-ae5d-e31667c52966\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8nknx" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.057185 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7ad909d3-8d75-49d3-83c1-3ef15b47d08d-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-45ck4\" (UID: \"7ad909d3-8d75-49d3-83c1-3ef15b47d08d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-45ck4" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.058099 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/6b7c78b1-850e-44f5-b50e-2e2ed4c0bb9c-stats-auth\") pod \"router-default-5444994796-qhd5t\" (UID: \"6b7c78b1-850e-44f5-b50e-2e2ed4c0bb9c\") " pod="openshift-ingress/router-default-5444994796-qhd5t" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.058851 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-vdsh5" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.061292 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/836dca77-634b-42e7-bf76-74b582e0969d-registry-certificates\") pod \"image-registry-697d97f7c8-q76zz\" (UID: \"836dca77-634b-42e7-bf76-74b582e0969d\") " pod="openshift-image-registry/image-registry-697d97f7c8-q76zz" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.061760 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/22155c67-aed1-4a35-989a-2777d69b3ee5-trusted-ca\") pod \"ingress-operator-5b745b69d9-fsvms\" (UID: \"22155c67-aed1-4a35-989a-2777d69b3ee5\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-fsvms" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.065154 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6b7c78b1-850e-44f5-b50e-2e2ed4c0bb9c-metrics-certs\") pod \"router-default-5444994796-qhd5t\" (UID: \"6b7c78b1-850e-44f5-b50e-2e2ed4c0bb9c\") " pod="openshift-ingress/router-default-5444994796-qhd5t" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.066353 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2d7mk\" (UniqueName: \"kubernetes.io/projected/836dca77-634b-42e7-bf76-74b582e0969d-kube-api-access-2d7mk\") pod \"image-registry-697d97f7c8-q76zz\" (UID: \"836dca77-634b-42e7-bf76-74b582e0969d\") " pod="openshift-image-registry/image-registry-697d97f7c8-q76zz" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.066398 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ss7jc\" (UniqueName: \"kubernetes.io/projected/4f73c993-5f7c-46c4-92b1-f0b2f2ee2a8b-kube-api-access-ss7jc\") pod \"machine-config-server-997wv\" (UID: \"4f73c993-5f7c-46c4-92b1-f0b2f2ee2a8b\") " pod="openshift-machine-config-operator/machine-config-server-997wv" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.066473 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ad909d3-8d75-49d3-83c1-3ef15b47d08d-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-45ck4\" (UID: \"7ad909d3-8d75-49d3-83c1-3ef15b47d08d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-45ck4" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.070308 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e8b3f1d0-56c3-4592-9346-4282c209c0b2-proxy-tls\") pod \"machine-config-operator-74547568cd-qs2nx\" (UID: \"e8b3f1d0-56c3-4592-9346-4282c209c0b2\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qs2nx" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.073051 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1fd89f10-e85a-4a04-9483-cbf8152d1ab5-etcd-client\") pod \"etcd-operator-b45778765-b2mv7\" (UID: \"1fd89f10-e85a-4a04-9483-cbf8152d1ab5\") " pod="openshift-etcd-operator/etcd-operator-b45778765-b2mv7" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.077012 4902 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-2kf5b\" (UniqueName: \"kubernetes.io/projected/36774cc0-04ea-4cab-9385-e9dc295cf63f-kube-api-access-2kf5b\") pod \"dns-operator-744455d44c-n2nl8\" (UID: \"36774cc0-04ea-4cab-9385-e9dc295cf63f\") " pod="openshift-dns-operator/dns-operator-744455d44c-n2nl8" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.078467 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bde9070e-6d53-4cbe-9542-4dc5ba33a16d-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-xd6rl\" (UID: \"bde9070e-6d53-4cbe-9542-4dc5ba33a16d\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-xd6rl" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.078993 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-nhhpn" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.079953 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/1fd89f10-e85a-4a04-9483-cbf8152d1ab5-etcd-service-ca\") pod \"etcd-operator-b45778765-b2mv7\" (UID: \"1fd89f10-e85a-4a04-9483-cbf8152d1ab5\") " pod="openshift-etcd-operator/etcd-operator-b45778765-b2mv7" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.084434 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4bc89116-ce17-406a-9f82-b40535555c7f-apiservice-cert\") pod \"packageserver-d55dfcdfc-th8gh\" (UID: \"4bc89116-ce17-406a-9f82-b40535555c7f\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-th8gh" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.080272 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4d3c809f-9893-4b55-bb22-759885ab8a31-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-kkvj6\" (UID: \"4d3c809f-9893-4b55-bb22-759885ab8a31\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-kkvj6" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.080609 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/1fd89f10-e85a-4a04-9483-cbf8152d1ab5-etcd-service-ca\") pod \"etcd-operator-b45778765-b2mv7\" (UID: \"1fd89f10-e85a-4a04-9483-cbf8152d1ab5\") " pod="openshift-etcd-operator/etcd-operator-b45778765-b2mv7" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.082654 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ad909d3-8d75-49d3-83c1-3ef15b47d08d-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-45ck4\" (UID: \"7ad909d3-8d75-49d3-83c1-3ef15b47d08d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-45ck4" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.085424 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4d3c809f-9893-4b55-bb22-759885ab8a31-config\") pod \"kube-apiserver-operator-766d6c64bb-kkvj6\" (UID: \"4d3c809f-9893-4b55-bb22-759885ab8a31\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-kkvj6" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 
13:51:30.087871 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/836dca77-634b-42e7-bf76-74b582e0969d-trusted-ca\") pod \"image-registry-697d97f7c8-q76zz\" (UID: \"836dca77-634b-42e7-bf76-74b582e0969d\") " pod="openshift-image-registry/image-registry-697d97f7c8-q76zz" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.087987 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/22155c67-aed1-4a35-989a-2777d69b3ee5-metrics-tls\") pod \"ingress-operator-5b745b69d9-fsvms\" (UID: \"22155c67-aed1-4a35-989a-2777d69b3ee5\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-fsvms" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.089674 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d6d59de9-a767-42f2-ae5d-e31667c52966-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-8nknx\" (UID: \"d6d59de9-a767-42f2-ae5d-e31667c52966\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8nknx" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.090821 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/36774cc0-04ea-4cab-9385-e9dc295cf63f-metrics-tls\") pod \"dns-operator-744455d44c-n2nl8\" (UID: \"36774cc0-04ea-4cab-9385-e9dc295cf63f\") " pod="openshift-dns-operator/dns-operator-744455d44c-n2nl8" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.095122 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2mwhf\" (UniqueName: \"kubernetes.io/projected/bde9070e-6d53-4cbe-9542-4dc5ba33a16d-kube-api-access-2mwhf\") pod \"openshift-controller-manager-operator-756b6f6bc6-xd6rl\" (UID: \"bde9070e-6d53-4cbe-9542-4dc5ba33a16d\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-xd6rl" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.102231 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bde9070e-6d53-4cbe-9542-4dc5ba33a16d-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-xd6rl\" (UID: \"bde9070e-6d53-4cbe-9542-4dc5ba33a16d\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-xd6rl" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.103944 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/6b7c78b1-850e-44f5-b50e-2e2ed4c0bb9c-default-certificate\") pod \"router-default-5444994796-qhd5t\" (UID: \"6b7c78b1-850e-44f5-b50e-2e2ed4c0bb9c\") " pod="openshift-ingress/router-default-5444994796-qhd5t" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.108803 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4ffxb"] Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.109053 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-n2nl8" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.112186 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/836dca77-634b-42e7-bf76-74b582e0969d-installation-pull-secrets\") pod \"image-registry-697d97f7c8-q76zz\" (UID: \"836dca77-634b-42e7-bf76-74b582e0969d\") " pod="openshift-image-registry/image-registry-697d97f7c8-q76zz" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.114958 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-d5zks"] Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.118472 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-xd6rl" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.133041 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d6d59de9-a767-42f2-ae5d-e31667c52966-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-8nknx\" (UID: \"d6d59de9-a767-42f2-ae5d-e31667c52966\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8nknx" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.137634 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4d3c809f-9893-4b55-bb22-759885ab8a31-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-kkvj6\" (UID: \"4d3c809f-9893-4b55-bb22-759885ab8a31\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-kkvj6" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.152154 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8nknx" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.159534 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9x9qx\" (UniqueName: \"kubernetes.io/projected/e8b3f1d0-56c3-4592-9346-4282c209c0b2-kube-api-access-9x9qx\") pod \"machine-config-operator-74547568cd-qs2nx\" (UID: \"e8b3f1d0-56c3-4592-9346-4282c209c0b2\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qs2nx" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.165315 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qs2nx" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.174933 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/814706d7-474c-4b19-a3c2-5785ae74b60c-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-8db2k\" (UID: \"814706d7-474c-4b19-a3c2-5785ae74b60c\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-8db2k" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.187989 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4ff922ea-aac7-4128-b619-658b4f44ce6e-serving-cert\") pod \"service-ca-operator-777779d784-52czq\" (UID: \"4ff922ea-aac7-4128-b619-658b4f44ce6e\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-52czq" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.188021 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/71406502-8050-4f28-bc5f-b5bbeeafd52f-socket-dir\") pod \"csi-hostpathplugin-snptm\" (UID: \"71406502-8050-4f28-bc5f-b5bbeeafd52f\") " pod="hostpath-provisioner/csi-hostpathplugin-snptm" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.188053 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/84438d18-f42f-44cd-8129-7dfd9edfed87-profile-collector-cert\") pod \"olm-operator-6b444d44fb-7wjtm\" (UID: \"84438d18-f42f-44cd-8129-7dfd9edfed87\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7wjtm" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.188072 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gpscf\" (UniqueName: \"kubernetes.io/projected/4251505b-3c9f-4b37-8805-8d984504a89e-kube-api-access-gpscf\") pod \"ingress-canary-fthsz\" (UID: \"4251505b-3c9f-4b37-8805-8d984504a89e\") " pod="openshift-ingress-canary/ingress-canary-fthsz" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.188092 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8621111f-521a-48d9-886b-20299f771b70-proxy-tls\") pod \"machine-config-controller-84d6567774-kxnmp\" (UID: \"8621111f-521a-48d9-886b-20299f771b70\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-kxnmp" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.188139 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ss7jc\" (UniqueName: \"kubernetes.io/projected/4f73c993-5f7c-46c4-92b1-f0b2f2ee2a8b-kube-api-access-ss7jc\") pod \"machine-config-server-997wv\" (UID: \"4f73c993-5f7c-46c4-92b1-f0b2f2ee2a8b\") " pod="openshift-machine-config-operator/machine-config-server-997wv" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.188166 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4bc89116-ce17-406a-9f82-b40535555c7f-apiservice-cert\") pod \"packageserver-d55dfcdfc-th8gh\" (UID: \"4bc89116-ce17-406a-9f82-b40535555c7f\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-th8gh" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.188188 4902 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/a2933475-7af8-41e3-9389-114c1969b030-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-qvwhz\" (UID: \"a2933475-7af8-41e3-9389-114c1969b030\") " pod="openshift-marketplace/marketplace-operator-79b997595-qvwhz" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.188206 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/4f73c993-5f7c-46c4-92b1-f0b2f2ee2a8b-node-bootstrap-token\") pod \"machine-config-server-997wv\" (UID: \"4f73c993-5f7c-46c4-92b1-f0b2f2ee2a8b\") " pod="openshift-machine-config-operator/machine-config-server-997wv" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.188228 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p78bs\" (UniqueName: \"kubernetes.io/projected/472a3084-0b59-487f-b179-bfe5fa35f4a9-kube-api-access-p78bs\") pod \"collect-profiles-29333625-8hwfc\" (UID: \"472a3084-0b59-487f-b179-bfe5fa35f4a9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333625-8hwfc" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.188250 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/b5ae0ea3-4302-4125-aee7-b6ee8276a000-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-2224d\" (UID: \"b5ae0ea3-4302-4125-aee7-b6ee8276a000\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2224d" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.188276 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8fmk2\" (UniqueName: \"kubernetes.io/projected/71406502-8050-4f28-bc5f-b5bbeeafd52f-kube-api-access-8fmk2\") pod \"csi-hostpathplugin-snptm\" (UID: \"71406502-8050-4f28-bc5f-b5bbeeafd52f\") " pod="hostpath-provisioner/csi-hostpathplugin-snptm" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.188298 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a2933475-7af8-41e3-9389-114c1969b030-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-qvwhz\" (UID: \"a2933475-7af8-41e3-9389-114c1969b030\") " pod="openshift-marketplace/marketplace-operator-79b997595-qvwhz" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.188326 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q76zz\" (UID: \"836dca77-634b-42e7-bf76-74b582e0969d\") " pod="openshift-image-registry/image-registry-697d97f7c8-q76zz" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.188351 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4c13d916-e588-4488-af07-82d5990cba9e-metrics-tls\") pod \"dns-default-jcxjn\" (UID: \"4c13d916-e588-4488-af07-82d5990cba9e\") " pod="openshift-dns/dns-default-jcxjn" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.188371 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: 
\"kubernetes.io/host-path/71406502-8050-4f28-bc5f-b5bbeeafd52f-plugins-dir\") pod \"csi-hostpathplugin-snptm\" (UID: \"71406502-8050-4f28-bc5f-b5bbeeafd52f\") " pod="hostpath-provisioner/csi-hostpathplugin-snptm" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.188390 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/84438d18-f42f-44cd-8129-7dfd9edfed87-srv-cert\") pod \"olm-operator-6b444d44fb-7wjtm\" (UID: \"84438d18-f42f-44cd-8129-7dfd9edfed87\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7wjtm" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.188417 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-96k49\" (UniqueName: \"kubernetes.io/projected/8621111f-521a-48d9-886b-20299f771b70-kube-api-access-96k49\") pod \"machine-config-controller-84d6567774-kxnmp\" (UID: \"8621111f-521a-48d9-886b-20299f771b70\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-kxnmp" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.188477 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/71406502-8050-4f28-bc5f-b5bbeeafd52f-mountpoint-dir\") pod \"csi-hostpathplugin-snptm\" (UID: \"71406502-8050-4f28-bc5f-b5bbeeafd52f\") " pod="hostpath-provisioner/csi-hostpathplugin-snptm" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.188500 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4251505b-3c9f-4b37-8805-8d984504a89e-cert\") pod \"ingress-canary-fthsz\" (UID: \"4251505b-3c9f-4b37-8805-8d984504a89e\") " pod="openshift-ingress-canary/ingress-canary-fthsz" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.188518 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9z8b8\" (UniqueName: \"kubernetes.io/projected/4bc89116-ce17-406a-9f82-b40535555c7f-kube-api-access-9z8b8\") pod \"packageserver-d55dfcdfc-th8gh\" (UID: \"4bc89116-ce17-406a-9f82-b40535555c7f\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-th8gh" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.188548 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4ff922ea-aac7-4128-b619-658b4f44ce6e-config\") pod \"service-ca-operator-777779d784-52czq\" (UID: \"4ff922ea-aac7-4128-b619-658b4f44ce6e\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-52czq" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.188567 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hcbtb\" (UniqueName: \"kubernetes.io/projected/a2933475-7af8-41e3-9389-114c1969b030-kube-api-access-hcbtb\") pod \"marketplace-operator-79b997595-qvwhz\" (UID: \"a2933475-7af8-41e3-9389-114c1969b030\") " pod="openshift-marketplace/marketplace-operator-79b997595-qvwhz" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.188585 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6fdr9\" (UniqueName: \"kubernetes.io/projected/534001de-19e9-45bd-b2fe-42b9521447a0-kube-api-access-6fdr9\") pod \"service-ca-9c57cc56f-f47nt\" (UID: \"534001de-19e9-45bd-b2fe-42b9521447a0\") " pod="openshift-service-ca/service-ca-9c57cc56f-f47nt" Oct 09 13:51:30 crc kubenswrapper[4902]: 
I1009 13:51:30.188615 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6g72b\" (UniqueName: \"kubernetes.io/projected/93785db6-c7f9-4d9e-9407-b3653a9aa360-kube-api-access-6g72b\") pod \"cni-sysctl-allowlist-ds-d6wkv\" (UID: \"93785db6-c7f9-4d9e-9407-b3653a9aa360\") " pod="openshift-multus/cni-sysctl-allowlist-ds-d6wkv" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.188633 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/4bc89116-ce17-406a-9f82-b40535555c7f-tmpfs\") pod \"packageserver-d55dfcdfc-th8gh\" (UID: \"4bc89116-ce17-406a-9f82-b40535555c7f\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-th8gh" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.188652 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/472a3084-0b59-487f-b179-bfe5fa35f4a9-config-volume\") pod \"collect-profiles-29333625-8hwfc\" (UID: \"472a3084-0b59-487f-b179-bfe5fa35f4a9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333625-8hwfc" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.188670 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/534001de-19e9-45bd-b2fe-42b9521447a0-signing-key\") pod \"service-ca-9c57cc56f-f47nt\" (UID: \"534001de-19e9-45bd-b2fe-42b9521447a0\") " pod="openshift-service-ca/service-ca-9c57cc56f-f47nt" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.188688 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/93785db6-c7f9-4d9e-9407-b3653a9aa360-ready\") pod \"cni-sysctl-allowlist-ds-d6wkv\" (UID: \"93785db6-c7f9-4d9e-9407-b3653a9aa360\") " pod="openshift-multus/cni-sysctl-allowlist-ds-d6wkv" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.188704 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/4bc89116-ce17-406a-9f82-b40535555c7f-webhook-cert\") pod \"packageserver-d55dfcdfc-th8gh\" (UID: \"4bc89116-ce17-406a-9f82-b40535555c7f\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-th8gh" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.188720 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vbxlz\" (UniqueName: \"kubernetes.io/projected/70af937a-5c69-4893-a069-9d968f1b1b9c-kube-api-access-vbxlz\") pod \"catalog-operator-68c6474976-4rdw2\" (UID: \"70af937a-5c69-4893-a069-9d968f1b1b9c\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4rdw2" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.188738 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/93785db6-c7f9-4d9e-9407-b3653a9aa360-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-d6wkv\" (UID: \"93785db6-c7f9-4d9e-9407-b3653a9aa360\") " pod="openshift-multus/cni-sysctl-allowlist-ds-d6wkv" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.188760 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f5kb6\" (UniqueName: \"kubernetes.io/projected/84438d18-f42f-44cd-8129-7dfd9edfed87-kube-api-access-f5kb6\") pod \"olm-operator-6b444d44fb-7wjtm\" (UID: 
\"84438d18-f42f-44cd-8129-7dfd9edfed87\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7wjtm" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.188779 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/4f73c993-5f7c-46c4-92b1-f0b2f2ee2a8b-certs\") pod \"machine-config-server-997wv\" (UID: \"4f73c993-5f7c-46c4-92b1-f0b2f2ee2a8b\") " pod="openshift-machine-config-operator/machine-config-server-997wv" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.188798 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cslqr\" (UniqueName: \"kubernetes.io/projected/3c2d5abc-8893-4dda-b7a9-1de1560e9381-kube-api-access-cslqr\") pod \"multus-admission-controller-857f4d67dd-qb2lj\" (UID: \"3c2d5abc-8893-4dda-b7a9-1de1560e9381\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-qb2lj" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.188813 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gssmw\" (UniqueName: \"kubernetes.io/projected/b5ae0ea3-4302-4125-aee7-b6ee8276a000-kube-api-access-gssmw\") pod \"package-server-manager-789f6589d5-2224d\" (UID: \"b5ae0ea3-4302-4125-aee7-b6ee8276a000\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2224d" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.188828 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4c13d916-e588-4488-af07-82d5990cba9e-config-volume\") pod \"dns-default-jcxjn\" (UID: \"4c13d916-e588-4488-af07-82d5990cba9e\") " pod="openshift-dns/dns-default-jcxjn" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.188846 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q29vg\" (UniqueName: \"kubernetes.io/projected/4ff922ea-aac7-4128-b619-658b4f44ce6e-kube-api-access-q29vg\") pod \"service-ca-operator-777779d784-52czq\" (UID: \"4ff922ea-aac7-4128-b619-658b4f44ce6e\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-52czq" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.188878 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/3c2d5abc-8893-4dda-b7a9-1de1560e9381-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-qb2lj\" (UID: \"3c2d5abc-8893-4dda-b7a9-1de1560e9381\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-qb2lj" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.188893 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/71406502-8050-4f28-bc5f-b5bbeeafd52f-registration-dir\") pod \"csi-hostpathplugin-snptm\" (UID: \"71406502-8050-4f28-bc5f-b5bbeeafd52f\") " pod="hostpath-provisioner/csi-hostpathplugin-snptm" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.188917 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/534001de-19e9-45bd-b2fe-42b9521447a0-signing-cabundle\") pod \"service-ca-9c57cc56f-f47nt\" (UID: \"534001de-19e9-45bd-b2fe-42b9521447a0\") " pod="openshift-service-ca/service-ca-9c57cc56f-f47nt" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.188941 4902 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/7ae518f0-243e-4916-89cb-0e621793d4db-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-pqbtl\" (UID: \"7ae518f0-243e-4916-89cb-0e621793d4db\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-pqbtl" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.188961 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/93785db6-c7f9-4d9e-9407-b3653a9aa360-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-d6wkv\" (UID: \"93785db6-c7f9-4d9e-9407-b3653a9aa360\") " pod="openshift-multus/cni-sysctl-allowlist-ds-d6wkv" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.188985 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dr8d5\" (UniqueName: \"kubernetes.io/projected/4c13d916-e588-4488-af07-82d5990cba9e-kube-api-access-dr8d5\") pod \"dns-default-jcxjn\" (UID: \"4c13d916-e588-4488-af07-82d5990cba9e\") " pod="openshift-dns/dns-default-jcxjn" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.189001 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/472a3084-0b59-487f-b179-bfe5fa35f4a9-secret-volume\") pod \"collect-profiles-29333625-8hwfc\" (UID: \"472a3084-0b59-487f-b179-bfe5fa35f4a9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333625-8hwfc" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.189026 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/8621111f-521a-48d9-886b-20299f771b70-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-kxnmp\" (UID: \"8621111f-521a-48d9-886b-20299f771b70\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-kxnmp" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.189050 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/70af937a-5c69-4893-a069-9d968f1b1b9c-srv-cert\") pod \"catalog-operator-68c6474976-4rdw2\" (UID: \"70af937a-5c69-4893-a069-9d968f1b1b9c\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4rdw2" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.189066 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/70af937a-5c69-4893-a069-9d968f1b1b9c-profile-collector-cert\") pod \"catalog-operator-68c6474976-4rdw2\" (UID: \"70af937a-5c69-4893-a069-9d968f1b1b9c\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4rdw2" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.189083 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lmrqj\" (UniqueName: \"kubernetes.io/projected/7ae518f0-243e-4916-89cb-0e621793d4db-kube-api-access-lmrqj\") pod \"control-plane-machine-set-operator-78cbb6b69f-pqbtl\" (UID: \"7ae518f0-243e-4916-89cb-0e621793d4db\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-pqbtl" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.189107 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: 
\"kubernetes.io/host-path/71406502-8050-4f28-bc5f-b5bbeeafd52f-csi-data-dir\") pod \"csi-hostpathplugin-snptm\" (UID: \"71406502-8050-4f28-bc5f-b5bbeeafd52f\") " pod="hostpath-provisioner/csi-hostpathplugin-snptm" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.189252 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/71406502-8050-4f28-bc5f-b5bbeeafd52f-csi-data-dir\") pod \"csi-hostpathplugin-snptm\" (UID: \"71406502-8050-4f28-bc5f-b5bbeeafd52f\") " pod="hostpath-provisioner/csi-hostpathplugin-snptm" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.190015 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/71406502-8050-4f28-bc5f-b5bbeeafd52f-socket-dir\") pod \"csi-hostpathplugin-snptm\" (UID: \"71406502-8050-4f28-bc5f-b5bbeeafd52f\") " pod="hostpath-provisioner/csi-hostpathplugin-snptm" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.190148 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/71406502-8050-4f28-bc5f-b5bbeeafd52f-mountpoint-dir\") pod \"csi-hostpathplugin-snptm\" (UID: \"71406502-8050-4f28-bc5f-b5bbeeafd52f\") " pod="hostpath-provisioner/csi-hostpathplugin-snptm" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.190014 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/71406502-8050-4f28-bc5f-b5bbeeafd52f-plugins-dir\") pod \"csi-hostpathplugin-snptm\" (UID: \"71406502-8050-4f28-bc5f-b5bbeeafd52f\") " pod="hostpath-provisioner/csi-hostpathplugin-snptm" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.190927 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4ff922ea-aac7-4128-b619-658b4f44ce6e-config\") pod \"service-ca-operator-777779d784-52czq\" (UID: \"4ff922ea-aac7-4128-b619-658b4f44ce6e\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-52czq" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.191057 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/93785db6-c7f9-4d9e-9407-b3653a9aa360-ready\") pod \"cni-sysctl-allowlist-ds-d6wkv\" (UID: \"93785db6-c7f9-4d9e-9407-b3653a9aa360\") " pod="openshift-multus/cni-sysctl-allowlist-ds-d6wkv" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.191166 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/4bc89116-ce17-406a-9f82-b40535555c7f-tmpfs\") pod \"packageserver-d55dfcdfc-th8gh\" (UID: \"4bc89116-ce17-406a-9f82-b40535555c7f\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-th8gh" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.191486 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/472a3084-0b59-487f-b179-bfe5fa35f4a9-config-volume\") pod \"collect-profiles-29333625-8hwfc\" (UID: \"472a3084-0b59-487f-b179-bfe5fa35f4a9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333625-8hwfc" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.191909 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: 
\"kubernetes.io/host-path/71406502-8050-4f28-bc5f-b5bbeeafd52f-registration-dir\") pod \"csi-hostpathplugin-snptm\" (UID: \"71406502-8050-4f28-bc5f-b5bbeeafd52f\") " pod="hostpath-provisioner/csi-hostpathplugin-snptm" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.192981 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/93785db6-c7f9-4d9e-9407-b3653a9aa360-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-d6wkv\" (UID: \"93785db6-c7f9-4d9e-9407-b3653a9aa360\") " pod="openshift-multus/cni-sysctl-allowlist-ds-d6wkv" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.193510 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/534001de-19e9-45bd-b2fe-42b9521447a0-signing-cabundle\") pod \"service-ca-9c57cc56f-f47nt\" (UID: \"534001de-19e9-45bd-b2fe-42b9521447a0\") " pod="openshift-service-ca/service-ca-9c57cc56f-f47nt" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.193584 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/93785db6-c7f9-4d9e-9407-b3653a9aa360-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-d6wkv\" (UID: \"93785db6-c7f9-4d9e-9407-b3653a9aa360\") " pod="openshift-multus/cni-sysctl-allowlist-ds-d6wkv" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.198265 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/4bc89116-ce17-406a-9f82-b40535555c7f-webhook-cert\") pod \"packageserver-d55dfcdfc-th8gh\" (UID: \"4bc89116-ce17-406a-9f82-b40535555c7f\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-th8gh" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.199230 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/8621111f-521a-48d9-886b-20299f771b70-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-kxnmp\" (UID: \"8621111f-521a-48d9-886b-20299f771b70\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-kxnmp" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.201189 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a2933475-7af8-41e3-9389-114c1969b030-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-qvwhz\" (UID: \"a2933475-7af8-41e3-9389-114c1969b030\") " pod="openshift-marketplace/marketplace-operator-79b997595-qvwhz" Oct 09 13:51:30 crc kubenswrapper[4902]: E1009 13:51:30.201221 4902 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 13:51:30.701187742 +0000 UTC m=+37.899046806 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q76zz" (UID: "836dca77-634b-42e7-bf76-74b582e0969d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.201647 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4c13d916-e588-4488-af07-82d5990cba9e-config-volume\") pod \"dns-default-jcxjn\" (UID: \"4c13d916-e588-4488-af07-82d5990cba9e\") " pod="openshift-dns/dns-default-jcxjn" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.202204 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/7ae518f0-243e-4916-89cb-0e621793d4db-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-pqbtl\" (UID: \"7ae518f0-243e-4916-89cb-0e621793d4db\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-pqbtl" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.202620 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/472a3084-0b59-487f-b179-bfe5fa35f4a9-secret-volume\") pod \"collect-profiles-29333625-8hwfc\" (UID: \"472a3084-0b59-487f-b179-bfe5fa35f4a9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333625-8hwfc" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.213329 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/a2933475-7af8-41e3-9389-114c1969b030-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-qvwhz\" (UID: \"a2933475-7af8-41e3-9389-114c1969b030\") " pod="openshift-marketplace/marketplace-operator-79b997595-qvwhz" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.214251 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/4f73c993-5f7c-46c4-92b1-f0b2f2ee2a8b-node-bootstrap-token\") pod \"machine-config-server-997wv\" (UID: \"4f73c993-5f7c-46c4-92b1-f0b2f2ee2a8b\") " pod="openshift-machine-config-operator/machine-config-server-997wv" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.216148 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/84438d18-f42f-44cd-8129-7dfd9edfed87-srv-cert\") pod \"olm-operator-6b444d44fb-7wjtm\" (UID: \"84438d18-f42f-44cd-8129-7dfd9edfed87\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7wjtm" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.218284 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/3c2d5abc-8893-4dda-b7a9-1de1560e9381-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-qb2lj\" (UID: \"3c2d5abc-8893-4dda-b7a9-1de1560e9381\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-qb2lj" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.221651 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/4ff922ea-aac7-4128-b619-658b4f44ce6e-serving-cert\") pod \"service-ca-operator-777779d784-52czq\" (UID: \"4ff922ea-aac7-4128-b619-658b4f44ce6e\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-52czq" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.224346 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/84438d18-f42f-44cd-8129-7dfd9edfed87-profile-collector-cert\") pod \"olm-operator-6b444d44fb-7wjtm\" (UID: \"84438d18-f42f-44cd-8129-7dfd9edfed87\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7wjtm" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.228668 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8621111f-521a-48d9-886b-20299f771b70-proxy-tls\") pod \"machine-config-controller-84d6567774-kxnmp\" (UID: \"8621111f-521a-48d9-886b-20299f771b70\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-kxnmp" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.229102 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/b5ae0ea3-4302-4125-aee7-b6ee8276a000-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-2224d\" (UID: \"b5ae0ea3-4302-4125-aee7-b6ee8276a000\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2224d" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.229351 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/534001de-19e9-45bd-b2fe-42b9521447a0-signing-key\") pod \"service-ca-9c57cc56f-f47nt\" (UID: \"534001de-19e9-45bd-b2fe-42b9521447a0\") " pod="openshift-service-ca/service-ca-9c57cc56f-f47nt" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.229837 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4251505b-3c9f-4b37-8805-8d984504a89e-cert\") pod \"ingress-canary-fthsz\" (UID: \"4251505b-3c9f-4b37-8805-8d984504a89e\") " pod="openshift-ingress-canary/ingress-canary-fthsz" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.232019 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4bc89116-ce17-406a-9f82-b40535555c7f-apiservice-cert\") pod \"packageserver-d55dfcdfc-th8gh\" (UID: \"4bc89116-ce17-406a-9f82-b40535555c7f\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-th8gh" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.232868 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ltgvf\" (UniqueName: \"kubernetes.io/projected/22155c67-aed1-4a35-989a-2777d69b3ee5-kube-api-access-ltgvf\") pod \"ingress-operator-5b745b69d9-fsvms\" (UID: \"22155c67-aed1-4a35-989a-2777d69b3ee5\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-fsvms" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.233295 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/70af937a-5c69-4893-a069-9d968f1b1b9c-profile-collector-cert\") pod \"catalog-operator-68c6474976-4rdw2\" (UID: \"70af937a-5c69-4893-a069-9d968f1b1b9c\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4rdw2" 
Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.233649 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bw7z5\" (UniqueName: \"kubernetes.io/projected/07f980b9-45c2-48c2-a84e-225ca4b82a77-kube-api-access-bw7z5\") pod \"migrator-59844c95c7-jgbgc\" (UID: \"07f980b9-45c2-48c2-a84e-225ca4b82a77\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-jgbgc" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.234145 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/70af937a-5c69-4893-a069-9d968f1b1b9c-srv-cert\") pod \"catalog-operator-68c6474976-4rdw2\" (UID: \"70af937a-5c69-4893-a069-9d968f1b1b9c\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4rdw2" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.234183 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/4f73c993-5f7c-46c4-92b1-f0b2f2ee2a8b-certs\") pod \"machine-config-server-997wv\" (UID: \"4f73c993-5f7c-46c4-92b1-f0b2f2ee2a8b\") " pod="openshift-machine-config-operator/machine-config-server-997wv" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.237258 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6jd79\" (UniqueName: \"kubernetes.io/projected/7ad909d3-8d75-49d3-83c1-3ef15b47d08d-kube-api-access-6jd79\") pod \"kube-storage-version-migrator-operator-b67b599dd-45ck4\" (UID: \"7ad909d3-8d75-49d3-83c1-3ef15b47d08d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-45ck4" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.261262 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/836dca77-634b-42e7-bf76-74b582e0969d-bound-sa-token\") pod \"image-registry-697d97f7c8-q76zz\" (UID: \"836dca77-634b-42e7-bf76-74b582e0969d\") " pod="openshift-image-registry/image-registry-697d97f7c8-q76zz" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.276776 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4c13d916-e588-4488-af07-82d5990cba9e-metrics-tls\") pod \"dns-default-jcxjn\" (UID: \"4c13d916-e588-4488-af07-82d5990cba9e\") " pod="openshift-dns/dns-default-jcxjn" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.292129 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 13:51:30 crc kubenswrapper[4902]: E1009 13:51:30.292570 4902 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 13:51:30.792546108 +0000 UTC m=+37.990405172 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.292656 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q76zz\" (UID: \"836dca77-634b-42e7-bf76-74b582e0969d\") " pod="openshift-image-registry/image-registry-697d97f7c8-q76zz" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.293798 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/22155c67-aed1-4a35-989a-2777d69b3ee5-bound-sa-token\") pod \"ingress-operator-5b745b69d9-fsvms\" (UID: \"22155c67-aed1-4a35-989a-2777d69b3ee5\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-fsvms" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.296641 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mnx82\" (UniqueName: \"kubernetes.io/projected/1fd89f10-e85a-4a04-9483-cbf8152d1ab5-kube-api-access-mnx82\") pod \"etcd-operator-b45778765-b2mv7\" (UID: \"1fd89f10-e85a-4a04-9483-cbf8152d1ab5\") " pod="openshift-etcd-operator/etcd-operator-b45778765-b2mv7" Oct 09 13:51:30 crc kubenswrapper[4902]: E1009 13:51:30.302458 4902 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 13:51:30.802433898 +0000 UTC m=+38.000292962 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q76zz" (UID: "836dca77-634b-42e7-bf76-74b582e0969d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.327821 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-jw7l8"] Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.331524 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-4cxxv"] Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.338541 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2d7mk\" (UniqueName: \"kubernetes.io/projected/836dca77-634b-42e7-bf76-74b582e0969d-kube-api-access-2d7mk\") pod \"image-registry-697d97f7c8-q76zz\" (UID: \"836dca77-634b-42e7-bf76-74b582e0969d\") " pod="openshift-image-registry/image-registry-697d97f7c8-q76zz" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.358163 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mfvsl\" (UniqueName: \"kubernetes.io/projected/6b7c78b1-850e-44f5-b50e-2e2ed4c0bb9c-kube-api-access-mfvsl\") pod \"router-default-5444994796-qhd5t\" (UID: \"6b7c78b1-850e-44f5-b50e-2e2ed4c0bb9c\") " pod="openshift-ingress/router-default-5444994796-qhd5t" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.376888 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-lm5vg"] Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.392858 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-bhzpg"] Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.394529 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-mzkkz"] Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.395044 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-6rzkc"] Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.395824 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9z8b8\" (UniqueName: \"kubernetes.io/projected/4bc89116-ce17-406a-9f82-b40535555c7f-kube-api-access-9z8b8\") pod \"packageserver-d55dfcdfc-th8gh\" (UID: \"4bc89116-ce17-406a-9f82-b40535555c7f\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-th8gh" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.401026 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gpscf\" (UniqueName: \"kubernetes.io/projected/4251505b-3c9f-4b37-8805-8d984504a89e-kube-api-access-gpscf\") pod \"ingress-canary-fthsz\" (UID: \"4251505b-3c9f-4b37-8805-8d984504a89e\") " pod="openshift-ingress-canary/ingress-canary-fthsz" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.403491 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 13:51:30 crc kubenswrapper[4902]: E1009 13:51:30.404089 4902 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 13:51:30.904070996 +0000 UTC m=+38.101930060 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.414926 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q29vg\" (UniqueName: \"kubernetes.io/projected/4ff922ea-aac7-4128-b619-658b4f44ce6e-kube-api-access-q29vg\") pod \"service-ca-operator-777779d784-52czq\" (UID: \"4ff922ea-aac7-4128-b619-658b4f44ce6e\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-52czq" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.426328 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-qhd5t" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.434071 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-kkvj6" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.435251 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ss7jc\" (UniqueName: \"kubernetes.io/projected/4f73c993-5f7c-46c4-92b1-f0b2f2ee2a8b-kube-api-access-ss7jc\") pod \"machine-config-server-997wv\" (UID: \"4f73c993-5f7c-46c4-92b1-f0b2f2ee2a8b\") " pod="openshift-machine-config-operator/machine-config-server-997wv" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.441906 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-45ck4" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.462007 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-fsvms" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.470552 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-ttk5q"] Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.470861 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6fdr9\" (UniqueName: \"kubernetes.io/projected/534001de-19e9-45bd-b2fe-42b9521447a0-kube-api-access-6fdr9\") pod \"service-ca-9c57cc56f-f47nt\" (UID: \"534001de-19e9-45bd-b2fe-42b9521447a0\") " pod="openshift-service-ca/service-ca-9c57cc56f-f47nt" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.473916 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-8db2k" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.484473 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dr8d5\" (UniqueName: \"kubernetes.io/projected/4c13d916-e588-4488-af07-82d5990cba9e-kube-api-access-dr8d5\") pod \"dns-default-jcxjn\" (UID: \"4c13d916-e588-4488-af07-82d5990cba9e\") " pod="openshift-dns/dns-default-jcxjn" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.499442 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-jgbgc" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.505760 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q76zz\" (UID: \"836dca77-634b-42e7-bf76-74b582e0969d\") " pod="openshift-image-registry/image-registry-697d97f7c8-q76zz" Oct 09 13:51:30 crc kubenswrapper[4902]: E1009 13:51:30.506128 4902 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 13:51:31.006114896 +0000 UTC m=+38.203973960 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q76zz" (UID: "836dca77-634b-42e7-bf76-74b582e0969d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.514881 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f5kb6\" (UniqueName: \"kubernetes.io/projected/84438d18-f42f-44cd-8129-7dfd9edfed87-kube-api-access-f5kb6\") pod \"olm-operator-6b444d44fb-7wjtm\" (UID: \"84438d18-f42f-44cd-8129-7dfd9edfed87\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7wjtm" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.519005 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hcbtb\" (UniqueName: \"kubernetes.io/projected/a2933475-7af8-41e3-9389-114c1969b030-kube-api-access-hcbtb\") pod \"marketplace-operator-79b997595-qvwhz\" (UID: \"a2933475-7af8-41e3-9389-114c1969b030\") " pod="openshift-marketplace/marketplace-operator-79b997595-qvwhz" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.533353 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-b2mv7" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.539032 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-th8gh" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.552674 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6g72b\" (UniqueName: \"kubernetes.io/projected/93785db6-c7f9-4d9e-9407-b3653a9aa360-kube-api-access-6g72b\") pod \"cni-sysctl-allowlist-ds-d6wkv\" (UID: \"93785db6-c7f9-4d9e-9407-b3653a9aa360\") " pod="openshift-multus/cni-sysctl-allowlist-ds-d6wkv" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.567124 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vbxlz\" (UniqueName: \"kubernetes.io/projected/70af937a-5c69-4893-a069-9d968f1b1b9c-kube-api-access-vbxlz\") pod \"catalog-operator-68c6474976-4rdw2\" (UID: \"70af937a-5c69-4893-a069-9d968f1b1b9c\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4rdw2" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.580921 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-52czq" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.587079 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-f47nt" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.587356 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-96k49\" (UniqueName: \"kubernetes.io/projected/8621111f-521a-48d9-886b-20299f771b70-kube-api-access-96k49\") pod \"machine-config-controller-84d6567774-kxnmp\" (UID: \"8621111f-521a-48d9-886b-20299f771b70\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-kxnmp" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.595387 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4rdw2" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.600744 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-997wv" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.601414 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8fmk2\" (UniqueName: \"kubernetes.io/projected/71406502-8050-4f28-bc5f-b5bbeeafd52f-kube-api-access-8fmk2\") pod \"csi-hostpathplugin-snptm\" (UID: \"71406502-8050-4f28-bc5f-b5bbeeafd52f\") " pod="hostpath-provisioner/csi-hostpathplugin-snptm" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.607594 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.608588 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-qvwhz" Oct 09 13:51:30 crc kubenswrapper[4902]: W1009 13:51:30.609194 4902 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod85e39402_3faf_4b34_a252_e4db0ac90909.slice/crio-59d49dbf665fa158f91d50e89347bc2e37ec2e784bb7796732941130d60392e2 WatchSource:0}: Error finding container 59d49dbf665fa158f91d50e89347bc2e37ec2e784bb7796732941130d60392e2: Status 404 returned error can't find the container with id 59d49dbf665fa158f91d50e89347bc2e37ec2e784bb7796732941130d60392e2 Oct 09 13:51:30 crc kubenswrapper[4902]: E1009 13:51:30.610770 4902 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 13:51:31.110742131 +0000 UTC m=+38.308601215 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.610999 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q76zz\" (UID: \"836dca77-634b-42e7-bf76-74b582e0969d\") " pod="openshift-image-registry/image-registry-697d97f7c8-q76zz" Oct 09 13:51:30 crc kubenswrapper[4902]: E1009 13:51:30.611406 4902 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 13:51:31.111394561 +0000 UTC m=+38.309253625 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q76zz" (UID: "836dca77-634b-42e7-bf76-74b582e0969d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.614219 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-fthsz" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.616536 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cslqr\" (UniqueName: \"kubernetes.io/projected/3c2d5abc-8893-4dda-b7a9-1de1560e9381-kube-api-access-cslqr\") pod \"multus-admission-controller-857f4d67dd-qb2lj\" (UID: \"3c2d5abc-8893-4dda-b7a9-1de1560e9381\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-qb2lj" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.625959 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-d6wkv" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.632775 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-jcxjn" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.653852 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-qs2nx"] Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.654190 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-snptm" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.655987 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-bszj2"] Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.660012 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lmrqj\" (UniqueName: \"kubernetes.io/projected/7ae518f0-243e-4916-89cb-0e621793d4db-kube-api-access-lmrqj\") pod \"control-plane-machine-set-operator-78cbb6b69f-pqbtl\" (UID: \"7ae518f0-243e-4916-89cb-0e621793d4db\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-pqbtl" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.663387 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gssmw\" (UniqueName: \"kubernetes.io/projected/b5ae0ea3-4302-4125-aee7-b6ee8276a000-kube-api-access-gssmw\") pod \"package-server-manager-789f6589d5-2224d\" (UID: \"b5ae0ea3-4302-4125-aee7-b6ee8276a000\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2224d" Oct 09 13:51:30 crc kubenswrapper[4902]: W1009 13:51:30.684700 4902 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfd1d9312_7008_48ff_9437_af995ef9b88d.slice/crio-e29a77e9c3f9c038472bd75fbbb248c31c1f4ed1dee2f5114ea639bc3f6353e0 WatchSource:0}: Error finding container e29a77e9c3f9c038472bd75fbbb248c31c1f4ed1dee2f5114ea639bc3f6353e0: Status 404 returned error can't find the container with id e29a77e9c3f9c038472bd75fbbb248c31c1f4ed1dee2f5114ea639bc3f6353e0 Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.687999 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p78bs\" (UniqueName: \"kubernetes.io/projected/472a3084-0b59-487f-b179-bfe5fa35f4a9-kube-api-access-p78bs\") pod \"collect-profiles-29333625-8hwfc\" (UID: \"472a3084-0b59-487f-b179-bfe5fa35f4a9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333625-8hwfc" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.712899 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 13:51:30 crc kubenswrapper[4902]: E1009 13:51:30.713325 4902 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 13:51:31.213309477 +0000 UTC m=+38.411168551 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.740734 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-n2nl8"] Oct 09 13:51:30 crc kubenswrapper[4902]: W1009 13:51:30.748469 4902 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6f99a811_543c_4b99_a394_9d941401efff.slice/crio-d36305fec150c2f18f3bde8a14c4eb23f6d98bfa1c67b1373eb748569084e7ce WatchSource:0}: Error finding container d36305fec150c2f18f3bde8a14c4eb23f6d98bfa1c67b1373eb748569084e7ce: Status 404 returned error can't find the container with id d36305fec150c2f18f3bde8a14c4eb23f6d98bfa1c67b1373eb748569084e7ce Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.782402 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7wjtm" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.794617 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-nhhpn"] Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.797532 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-kxnmp" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.816710 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-qb2lj" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.817299 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q76zz\" (UID: \"836dca77-634b-42e7-bf76-74b582e0969d\") " pod="openshift-image-registry/image-registry-697d97f7c8-q76zz" Oct 09 13:51:30 crc kubenswrapper[4902]: E1009 13:51:30.817606 4902 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 13:51:31.317593852 +0000 UTC m=+38.515452916 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q76zz" (UID: "836dca77-634b-42e7-bf76-74b582e0969d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.835676 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-8db2k"] Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.848010 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2224d" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.849772 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-vdsh5"] Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.862706 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29333625-8hwfc" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.865984 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8nknx"] Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.878323 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-pqbtl" Oct 09 13:51:30 crc kubenswrapper[4902]: W1009 13:51:30.898307 4902 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2425c7bf_9eeb_4255_bfd0_1d8ef07d835b.slice/crio-e0e11f7dbd7283a1a86225ccb42aadf2edf7b1000e7fbf75904411cc870755af WatchSource:0}: Error finding container e0e11f7dbd7283a1a86225ccb42aadf2edf7b1000e7fbf75904411cc870755af: Status 404 returned error can't find the container with id e0e11f7dbd7283a1a86225ccb42aadf2edf7b1000e7fbf75904411cc870755af Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.902138 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-kkvj6"] Oct 09 13:51:30 crc kubenswrapper[4902]: W1009 13:51:30.911282 4902 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod36774cc0_04ea_4cab_9385_e9dc295cf63f.slice/crio-e64b2c4e013dc712d1292cf7cd7b5dfa4ccc2d660ba60e123fddaf6bc326582c WatchSource:0}: Error finding container e64b2c4e013dc712d1292cf7cd7b5dfa4ccc2d660ba60e123fddaf6bc326582c: Status 404 returned error can't find the container with id e64b2c4e013dc712d1292cf7cd7b5dfa4ccc2d660ba60e123fddaf6bc326582c Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.911906 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ttk5q" event={"ID":"fd1d9312-7008-48ff-9437-af995ef9b88d","Type":"ContainerStarted","Data":"e29a77e9c3f9c038472bd75fbbb248c31c1f4ed1dee2f5114ea639bc3f6353e0"} Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.918909 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 13:51:30 crc kubenswrapper[4902]: E1009 13:51:30.919394 4902 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 13:51:31.419374054 +0000 UTC m=+38.617233118 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.919790 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-xd6rl"] Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.922608 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-ltv8d" event={"ID":"0387a8b9-8804-48b5-8503-1734f2a15b45","Type":"ContainerStarted","Data":"26316f0ea4996f243f0609192158652712a7fc6d0ad264a038826bbbb03c2603"} Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.922649 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-ltv8d" event={"ID":"0387a8b9-8804-48b5-8503-1734f2a15b45","Type":"ContainerStarted","Data":"7e4c7cecca20b4422dfe0baea723bc5d2f7a66e8ce2277a11b90bb7f3506e8c5"} Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.923213 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-ltv8d" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.928657 4902 patch_prober.go:28] interesting pod/console-operator-58897d9998-ltv8d container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.8:8443/readyz\": dial tcp 10.217.0.8:8443: connect: connection refused" start-of-body= Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.928749 4902 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-ltv8d" podUID="0387a8b9-8804-48b5-8503-1734f2a15b45" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.8:8443/readyz\": dial tcp 10.217.0.8:8443: connect: connection refused" Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.953960 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4cxxv" event={"ID":"58b6a424-7606-420a-802d-1886adaa3e3d","Type":"ContainerStarted","Data":"3ec331aa095e6ef4b8b88178ef7c300d02ccced574a0f75205e79505e629767e"} Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.954018 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4cxxv" 
event={"ID":"58b6a424-7606-420a-802d-1886adaa3e3d","Type":"ContainerStarted","Data":"1e62132b4ef63caf26c7fea7bbfaac890fbdc1de3c9093cb30b26eb8c0892f37"} Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.959217 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-jw7l8" event={"ID":"28116576-5069-4dd6-90f1-31582eda88df","Type":"ContainerStarted","Data":"9ec4731140976201e54c610e1fb018ac9a2ee063757bc2c754114f0afa68d10b"} Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.963339 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rq4g9" event={"ID":"dfee5c28-dc5e-4f85-87d2-29925eeff49d","Type":"ContainerStarted","Data":"7de08a3dbc0e02eeb969e5614f2f0f80f0f57fe60f3ca397c8b1782bd8e58e7e"} Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.963400 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rq4g9" event={"ID":"dfee5c28-dc5e-4f85-87d2-29925eeff49d","Type":"ContainerStarted","Data":"1605c6bae44240fa556d042f6c5336e9f67c45c198d07ed4362b34bb541e28d3"} Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.964370 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-bhzpg" event={"ID":"327b6d28-9130-4476-b8f2-edaf08da45ae","Type":"ContainerStarted","Data":"ebfd104ad9811ca00698eaba92ae09259579511ed9de851dfe44e1f395694e9c"} Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.965684 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qs2nx" event={"ID":"e8b3f1d0-56c3-4592-9346-4282c209c0b2","Type":"ContainerStarted","Data":"8f5110f64da2178d292031a9aae2b334adf65fa6147fdf404f5e2ee30464d4df"} Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.966640 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-qhd5t" event={"ID":"6b7c78b1-850e-44f5-b50e-2e2ed4c0bb9c","Type":"ContainerStarted","Data":"261e0e4811a0f1d0f6b3415ef0ed15bfa4aacc7bd14532924160b0a610642804"} Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.969233 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-mzkkz" event={"ID":"e714667b-061d-4127-8dd3-47e403ebe079","Type":"ContainerStarted","Data":"19982897ac06238ef02727c1291a8bf9ef0494a34cf9108478727c9b8ae8efa5"} Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.971538 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-d5zks" event={"ID":"51ad1076-0ca9-4765-bd88-98f4cba434b6","Type":"ContainerStarted","Data":"f7b2863887f2c23890735d64819810c8b350e374dedc10c16a304713d0fd8e0a"} Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.971581 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-d5zks" event={"ID":"51ad1076-0ca9-4765-bd88-98f4cba434b6","Type":"ContainerStarted","Data":"ecd988aa7f75df92b186274f83a5f514320bfb0e65e232fd39ab3c7a139e5d48"} Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.975562 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-lm5vg" event={"ID":"85e39402-3faf-4b34-a252-e4db0ac90909","Type":"ContainerStarted","Data":"59d49dbf665fa158f91d50e89347bc2e37ec2e784bb7796732941130d60392e2"} Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.977756 4902 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-6rzkc" event={"ID":"b43c1099-b997-4be7-8390-a379e0dc5541","Type":"ContainerStarted","Data":"0e96b9083f5147e79fa7f1872f3a9277e8731145d68738a744a6b4e4f50cbc02"} Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.979484 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8l7cj" event={"ID":"ff2ea387-8171-4259-ac56-f864a65f105f","Type":"ContainerStarted","Data":"1fec25a8a7a733694213213950321721e21d07de085ece67a3637afc15271845"} Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.982954 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4ffxb" event={"ID":"b7308dad-19a5-4675-9874-ee0a814d8aed","Type":"ContainerStarted","Data":"162b9d5f4dfdb6a22d3863d301236ea010f2651fdc3e2fed3d977f2a7161b1b8"} Oct 09 13:51:30 crc kubenswrapper[4902]: I1009 13:51:30.984707 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-bszj2" event={"ID":"6f99a811-543c-4b99-a394-9d941401efff","Type":"ContainerStarted","Data":"d36305fec150c2f18f3bde8a14c4eb23f6d98bfa1c67b1373eb748569084e7ce"} Oct 09 13:51:31 crc kubenswrapper[4902]: I1009 13:51:31.024349 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q76zz\" (UID: \"836dca77-634b-42e7-bf76-74b582e0969d\") " pod="openshift-image-registry/image-registry-697d97f7c8-q76zz" Oct 09 13:51:31 crc kubenswrapper[4902]: E1009 13:51:31.024445 4902 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 13:51:31.524384411 +0000 UTC m=+38.722243475 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q76zz" (UID: "836dca77-634b-42e7-bf76-74b582e0969d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 13:51:31 crc kubenswrapper[4902]: I1009 13:51:31.120151 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-jgbgc"] Oct 09 13:51:31 crc kubenswrapper[4902]: I1009 13:51:31.128118 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 13:51:31 crc kubenswrapper[4902]: E1009 13:51:31.128208 4902 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-09 13:51:31.628157702 +0000 UTC m=+38.826016766 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 13:51:31 crc kubenswrapper[4902]: I1009 13:51:31.128603 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q76zz\" (UID: \"836dca77-634b-42e7-bf76-74b582e0969d\") " pod="openshift-image-registry/image-registry-697d97f7c8-q76zz" Oct 09 13:51:31 crc kubenswrapper[4902]: E1009 13:51:31.129034 4902 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 13:51:31.629013657 +0000 UTC m=+38.826872731 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q76zz" (UID: "836dca77-634b-42e7-bf76-74b582e0969d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 13:51:31 crc kubenswrapper[4902]: I1009 13:51:31.149306 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-45ck4"] Oct 09 13:51:31 crc kubenswrapper[4902]: W1009 13:51:31.165987 4902 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7ad909d3_8d75_49d3_83c1_3ef15b47d08d.slice/crio-c6b49905331f67f8512f093c407f3f334d2e359a084ce41ab4d73a228e20add8 WatchSource:0}: Error finding container c6b49905331f67f8512f093c407f3f334d2e359a084ce41ab4d73a228e20add8: Status 404 returned error can't find the container with id c6b49905331f67f8512f093c407f3f334d2e359a084ce41ab4d73a228e20add8 Oct 09 13:51:31 crc kubenswrapper[4902]: I1009 13:51:31.181948 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-b2mv7"] Oct 09 13:51:31 crc kubenswrapper[4902]: I1009 13:51:31.185908 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-fsvms"] Oct 09 13:51:31 crc kubenswrapper[4902]: I1009 13:51:31.223830 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-th8gh"] Oct 09 13:51:31 crc kubenswrapper[4902]: I1009 13:51:31.229943 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 
09 13:51:31 crc kubenswrapper[4902]: E1009 13:51:31.230455 4902 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 13:51:31.730416088 +0000 UTC m=+38.928275152 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 13:51:31 crc kubenswrapper[4902]: I1009 13:51:31.331757 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q76zz\" (UID: \"836dca77-634b-42e7-bf76-74b582e0969d\") " pod="openshift-image-registry/image-registry-697d97f7c8-q76zz" Oct 09 13:51:31 crc kubenswrapper[4902]: E1009 13:51:31.332104 4902 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 13:51:31.832091047 +0000 UTC m=+39.029950111 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q76zz" (UID: "836dca77-634b-42e7-bf76-74b582e0969d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 13:51:31 crc kubenswrapper[4902]: I1009 13:51:31.435363 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 13:51:31 crc kubenswrapper[4902]: E1009 13:51:31.436119 4902 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 13:51:31.936095675 +0000 UTC m=+39.133954739 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 13:51:31 crc kubenswrapper[4902]: I1009 13:51:31.537850 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q76zz\" (UID: \"836dca77-634b-42e7-bf76-74b582e0969d\") " pod="openshift-image-registry/image-registry-697d97f7c8-q76zz" Oct 09 13:51:31 crc kubenswrapper[4902]: E1009 13:51:31.538252 4902 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 13:51:32.038233197 +0000 UTC m=+39.236092261 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q76zz" (UID: "836dca77-634b-42e7-bf76-74b582e0969d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 13:51:31 crc kubenswrapper[4902]: I1009 13:51:31.581292 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-jcxjn"] Oct 09 13:51:31 crc kubenswrapper[4902]: E1009 13:51:31.639350 4902 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 13:51:32.139316259 +0000 UTC m=+39.337175323 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 13:51:31 crc kubenswrapper[4902]: I1009 13:51:31.639410 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 13:51:31 crc kubenswrapper[4902]: I1009 13:51:31.648215 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q76zz\" (UID: \"836dca77-634b-42e7-bf76-74b582e0969d\") " pod="openshift-image-registry/image-registry-697d97f7c8-q76zz" Oct 09 13:51:31 crc kubenswrapper[4902]: E1009 13:51:31.648854 4902 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 13:51:32.148834208 +0000 UTC m=+39.346693322 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q76zz" (UID: "836dca77-634b-42e7-bf76-74b582e0969d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 13:51:31 crc kubenswrapper[4902]: I1009 13:51:31.777954 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 13:51:31 crc kubenswrapper[4902]: E1009 13:51:31.778157 4902 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 13:51:32.278120086 +0000 UTC m=+39.475979150 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 13:51:31 crc kubenswrapper[4902]: I1009 13:51:31.778368 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q76zz\" (UID: \"836dca77-634b-42e7-bf76-74b582e0969d\") " pod="openshift-image-registry/image-registry-697d97f7c8-q76zz" Oct 09 13:51:31 crc kubenswrapper[4902]: E1009 13:51:31.779135 4902 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 13:51:32.279107095 +0000 UTC m=+39.476966319 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q76zz" (UID: "836dca77-634b-42e7-bf76-74b582e0969d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 13:51:31 crc kubenswrapper[4902]: I1009 13:51:31.879339 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 13:51:31 crc kubenswrapper[4902]: E1009 13:51:31.879838 4902 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 13:51:32.379773905 +0000 UTC m=+39.577632979 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 13:51:31 crc kubenswrapper[4902]: I1009 13:51:31.880123 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q76zz\" (UID: \"836dca77-634b-42e7-bf76-74b582e0969d\") " pod="openshift-image-registry/image-registry-697d97f7c8-q76zz" Oct 09 13:51:31 crc kubenswrapper[4902]: E1009 13:51:31.880554 4902 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 13:51:32.380537657 +0000 UTC m=+39.578396721 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q76zz" (UID: "836dca77-634b-42e7-bf76-74b582e0969d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 13:51:31 crc kubenswrapper[4902]: I1009 13:51:31.991777 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 13:51:31 crc kubenswrapper[4902]: E1009 13:51:31.992262 4902 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 13:51:32.49224166 +0000 UTC m=+39.690100724 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 13:51:32 crc kubenswrapper[4902]: I1009 13:51:32.049806 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-jgbgc" event={"ID":"07f980b9-45c2-48c2-a84e-225ca4b82a77","Type":"ContainerStarted","Data":"9423edce303bb62f8e6d89afffc348ae71b1930a21bab81be39ee8062bb58692"} Oct 09 13:51:32 crc kubenswrapper[4902]: I1009 13:51:32.053136 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ttk5q" event={"ID":"fd1d9312-7008-48ff-9437-af995ef9b88d","Type":"ContainerStarted","Data":"8f7c06a18eb1809ce43b08167a468da978d5ba38c4227208e8096aab8f98a8f7"} Oct 09 13:51:32 crc kubenswrapper[4902]: I1009 13:51:32.053543 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ttk5q" Oct 09 13:51:32 crc kubenswrapper[4902]: I1009 13:51:32.055889 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-nhhpn" event={"ID":"2425c7bf-9eeb-4255-bfd0-1d8ef07d835b","Type":"ContainerStarted","Data":"e0e11f7dbd7283a1a86225ccb42aadf2edf7b1000e7fbf75904411cc870755af"} Oct 09 13:51:32 crc kubenswrapper[4902]: I1009 13:51:32.057211 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-qhd5t" event={"ID":"6b7c78b1-850e-44f5-b50e-2e2ed4c0bb9c","Type":"ContainerStarted","Data":"a721be05715c4da526c31ec89aeebf7f105fc0ce537fd1c55a7bb6b4b5a099f4"} Oct 09 13:51:32 crc kubenswrapper[4902]: I1009 13:51:32.065981 4902 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-ttk5q container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" start-of-body= Oct 09 13:51:32 crc kubenswrapper[4902]: I1009 13:51:32.066051 4902 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ttk5q" podUID="fd1d9312-7008-48ff-9437-af995ef9b88d" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" Oct 09 13:51:32 crc kubenswrapper[4902]: I1009 13:51:32.072639 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-45ck4" event={"ID":"7ad909d3-8d75-49d3-83c1-3ef15b47d08d","Type":"ContainerStarted","Data":"c6b49905331f67f8512f093c407f3f334d2e359a084ce41ab4d73a228e20add8"} Oct 09 13:51:32 crc kubenswrapper[4902]: I1009 13:51:32.075327 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-xd6rl" 
event={"ID":"bde9070e-6d53-4cbe-9542-4dc5ba33a16d","Type":"ContainerStarted","Data":"5252801b655fd3c1533e7275565f8d3a56eda623cd8f56ea4c14d2b84463bfa0"} Oct 09 13:51:32 crc kubenswrapper[4902]: I1009 13:51:32.081275 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-th8gh" event={"ID":"4bc89116-ce17-406a-9f82-b40535555c7f","Type":"ContainerStarted","Data":"598549ea2b793ff9b6311a2a169e7da1ced50c248dbda7639d9ee4b062648fd6"} Oct 09 13:51:32 crc kubenswrapper[4902]: I1009 13:51:32.093483 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q76zz\" (UID: \"836dca77-634b-42e7-bf76-74b582e0969d\") " pod="openshift-image-registry/image-registry-697d97f7c8-q76zz" Oct 09 13:51:32 crc kubenswrapper[4902]: E1009 13:51:32.093807 4902 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 13:51:32.593790965 +0000 UTC m=+39.791650029 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q76zz" (UID: "836dca77-634b-42e7-bf76-74b582e0969d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 13:51:32 crc kubenswrapper[4902]: I1009 13:51:32.151025 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-kkvj6" event={"ID":"4d3c809f-9893-4b55-bb22-759885ab8a31","Type":"ContainerStarted","Data":"a072c2600956623b4231a1f87d65ee661f02c1efbc29f2d3a742205f4093fd60"} Oct 09 13:51:32 crc kubenswrapper[4902]: I1009 13:51:32.153713 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-jcxjn" event={"ID":"4c13d916-e588-4488-af07-82d5990cba9e","Type":"ContainerStarted","Data":"69e05c1a75d10df84b97d4415d15648c7857ab6dd3abb87af5e8f0ff5674bcb4"} Oct 09 13:51:32 crc kubenswrapper[4902]: I1009 13:51:32.184459 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-b2mv7" event={"ID":"1fd89f10-e85a-4a04-9483-cbf8152d1ab5","Type":"ContainerStarted","Data":"0f5620f3ea6ed6301ea1d075de03a8de8d76c5f1376ef639ce3503c909e9c691"} Oct 09 13:51:32 crc kubenswrapper[4902]: I1009 13:51:32.192948 4902 generic.go:334] "Generic (PLEG): container finished" podID="58b6a424-7606-420a-802d-1886adaa3e3d" containerID="3ec331aa095e6ef4b8b88178ef7c300d02ccced574a0f75205e79505e629767e" exitCode=0 Oct 09 13:51:32 crc kubenswrapper[4902]: I1009 13:51:32.193058 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4cxxv" event={"ID":"58b6a424-7606-420a-802d-1886adaa3e3d","Type":"ContainerDied","Data":"3ec331aa095e6ef4b8b88178ef7c300d02ccced574a0f75205e79505e629767e"} Oct 09 13:51:32 crc kubenswrapper[4902]: I1009 13:51:32.194064 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 13:51:32 crc kubenswrapper[4902]: E1009 13:51:32.194358 4902 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 13:51:32.694317921 +0000 UTC m=+39.892176985 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 13:51:32 crc kubenswrapper[4902]: I1009 13:51:32.194796 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q76zz\" (UID: \"836dca77-634b-42e7-bf76-74b582e0969d\") " pod="openshift-image-registry/image-registry-697d97f7c8-q76zz" Oct 09 13:51:32 crc kubenswrapper[4902]: I1009 13:51:32.194921 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/76bff3cb-cf9e-42cc-8b73-846ad6b38202-metrics-certs\") pod \"network-metrics-daemon-5tnbn\" (UID: \"76bff3cb-cf9e-42cc-8b73-846ad6b38202\") " pod="openshift-multus/network-metrics-daemon-5tnbn" Oct 09 13:51:32 crc kubenswrapper[4902]: E1009 13:51:32.196398 4902 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 13:51:32.696386081 +0000 UTC m=+39.894245335 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q76zz" (UID: "836dca77-634b-42e7-bf76-74b582e0969d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 13:51:32 crc kubenswrapper[4902]: I1009 13:51:32.196987 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-lm5vg" event={"ID":"85e39402-3faf-4b34-a252-e4db0ac90909","Type":"ContainerStarted","Data":"7835d2c3f0411a4ff6ee9def6f179e5ef94d4caf348ba54144612ff4b13d21c1"} Oct 09 13:51:32 crc kubenswrapper[4902]: I1009 13:51:32.209966 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-997wv" event={"ID":"4f73c993-5f7c-46c4-92b1-f0b2f2ee2a8b","Type":"ContainerStarted","Data":"cd0002b80d82c62040494c7c8772ce43db6c2e72426c832e8a51fd38b531096c"} Oct 09 13:51:32 crc kubenswrapper[4902]: I1009 13:51:32.219019 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/76bff3cb-cf9e-42cc-8b73-846ad6b38202-metrics-certs\") pod \"network-metrics-daemon-5tnbn\" (UID: \"76bff3cb-cf9e-42cc-8b73-846ad6b38202\") " pod="openshift-multus/network-metrics-daemon-5tnbn" Oct 09 13:51:32 crc kubenswrapper[4902]: I1009 13:51:32.222491 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-fsvms" event={"ID":"22155c67-aed1-4a35-989a-2777d69b3ee5","Type":"ContainerStarted","Data":"1ea4ef1c9642366ab3e042fdea05b5df553d9368bbacf9ec9ff8785a3210da8a"} Oct 09 13:51:32 crc kubenswrapper[4902]: I1009 13:51:32.251620 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-8db2k" event={"ID":"814706d7-474c-4b19-a3c2-5785ae74b60c","Type":"ContainerStarted","Data":"f6b02049f848361f08ba9d51318a7c9f2cae9f52405086cb59881f60581bf624"} Oct 09 13:51:32 crc kubenswrapper[4902]: I1009 13:51:32.265280 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ttk5q" podStartSLOduration=18.26526034 podStartE2EDuration="18.26526034s" podCreationTimestamp="2025-10-09 13:51:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 13:51:32.263696464 +0000 UTC m=+39.461555528" watchObservedRunningTime="2025-10-09 13:51:32.26526034 +0000 UTC m=+39.463119404" Oct 09 13:51:32 crc kubenswrapper[4902]: I1009 13:51:32.284397 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-6rzkc" Oct 09 13:51:32 crc kubenswrapper[4902]: I1009 13:51:32.308915 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 13:51:32 crc kubenswrapper[4902]: E1009 13:51:32.310231 4902 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 13:51:32.810216247 +0000 UTC m=+40.008075311 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 13:51:32 crc kubenswrapper[4902]: I1009 13:51:32.328307 4902 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-6rzkc container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.35:8443/healthz\": dial tcp 10.217.0.35:8443: connect: connection refused" start-of-body= Oct 09 13:51:32 crc kubenswrapper[4902]: I1009 13:51:32.328343 4902 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-6rzkc" podUID="b43c1099-b997-4be7-8390-a379e0dc5541" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.35:8443/healthz\": dial tcp 10.217.0.35:8443: connect: connection refused" Oct 09 13:51:32 crc kubenswrapper[4902]: I1009 13:51:32.328470 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5tnbn" Oct 09 13:51:32 crc kubenswrapper[4902]: I1009 13:51:32.339515 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-f47nt"] Oct 09 13:51:32 crc kubenswrapper[4902]: I1009 13:51:32.356569 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-lm5vg" podStartSLOduration=19.356421321 podStartE2EDuration="19.356421321s" podCreationTimestamp="2025-10-09 13:51:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 13:51:32.354820764 +0000 UTC m=+39.552679858" watchObservedRunningTime="2025-10-09 13:51:32.356421321 +0000 UTC m=+39.554280395" Oct 09 13:51:32 crc kubenswrapper[4902]: I1009 13:51:32.400784 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-d6wkv" event={"ID":"93785db6-c7f9-4d9e-9407-b3653a9aa360","Type":"ContainerStarted","Data":"511e953cb4ca50142681d4f1852d0ddeab44c811a6e29a1e90561d5d0f6250e3"} Oct 09 13:51:32 crc kubenswrapper[4902]: I1009 13:51:32.411295 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q76zz\" (UID: \"836dca77-634b-42e7-bf76-74b582e0969d\") " pod="openshift-image-registry/image-registry-697d97f7c8-q76zz" Oct 09 13:51:32 crc kubenswrapper[4902]: E1009 13:51:32.412792 4902 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-10-09 13:51:32.912774942 +0000 UTC m=+40.110633996 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q76zz" (UID: "836dca77-634b-42e7-bf76-74b582e0969d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 13:51:32 crc kubenswrapper[4902]: I1009 13:51:32.420251 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-52czq"] Oct 09 13:51:32 crc kubenswrapper[4902]: I1009 13:51:32.428969 4902 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-qhd5t" Oct 09 13:51:32 crc kubenswrapper[4902]: I1009 13:51:32.429015 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-bszj2" event={"ID":"6f99a811-543c-4b99-a394-9d941401efff","Type":"ContainerStarted","Data":"833676efb2f4c25378cd7898baf78c48d0dff00d13813f5563a263fb7c661176"} Oct 09 13:51:32 crc kubenswrapper[4902]: I1009 13:51:32.430406 4902 patch_prober.go:28] interesting pod/router-default-5444994796-qhd5t container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body= Oct 09 13:51:32 crc kubenswrapper[4902]: I1009 13:51:32.430496 4902 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-qhd5t" podUID="6b7c78b1-850e-44f5-b50e-2e2ed4c0bb9c" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" Oct 09 13:51:32 crc kubenswrapper[4902]: I1009 13:51:32.432695 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-n2nl8" event={"ID":"36774cc0-04ea-4cab-9385-e9dc295cf63f","Type":"ContainerStarted","Data":"e64b2c4e013dc712d1292cf7cd7b5dfa4ccc2d660ba60e123fddaf6bc326582c"} Oct 09 13:51:32 crc kubenswrapper[4902]: I1009 13:51:32.435700 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8nknx" event={"ID":"d6d59de9-a767-42f2-ae5d-e31667c52966","Type":"ContainerStarted","Data":"87e193fd3c08183967e9f87e62aded6de151f6d4bfe8fa5c1d1b3a19ac879152"} Oct 09 13:51:32 crc kubenswrapper[4902]: I1009 13:51:32.452328 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-fthsz"] Oct 09 13:51:32 crc kubenswrapper[4902]: I1009 13:51:32.457498 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-qhd5t" podStartSLOduration=19.45741271 podStartE2EDuration="19.45741271s" podCreationTimestamp="2025-10-09 13:51:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 13:51:32.429942725 +0000 UTC m=+39.627801799" watchObservedRunningTime="2025-10-09 13:51:32.45741271 +0000 UTC m=+39.655271774" Oct 09 13:51:32 crc kubenswrapper[4902]: I1009 13:51:32.482649 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2224d"] Oct 09 13:51:32 crc kubenswrapper[4902]: I1009 13:51:32.486671 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-pqbtl"] Oct 09 13:51:32 crc kubenswrapper[4902]: I1009 13:51:32.486724 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-vdsh5" event={"ID":"0964a12e-7b75-401b-9547-49e5a924ef0b","Type":"ContainerStarted","Data":"52f314235f66bb9342b22bd4a92c28bc6036256381ff01c8f760b5d30b5eaf37"} Oct 09 13:51:32 crc kubenswrapper[4902]: I1009 13:51:32.489312 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-qb2lj"] Oct 09 13:51:32 crc kubenswrapper[4902]: I1009 13:51:32.492174 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-snptm"] Oct 09 13:51:32 crc kubenswrapper[4902]: I1009 13:51:32.517720 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 13:51:32 crc kubenswrapper[4902]: E1009 13:51:32.517959 4902 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 13:51:33.017912432 +0000 UTC m=+40.215771506 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 13:51:32 crc kubenswrapper[4902]: I1009 13:51:32.518322 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q76zz\" (UID: \"836dca77-634b-42e7-bf76-74b582e0969d\") " pod="openshift-image-registry/image-registry-697d97f7c8-q76zz" Oct 09 13:51:32 crc kubenswrapper[4902]: E1009 13:51:32.518794 4902 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 13:51:33.018775898 +0000 UTC m=+40.216634962 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q76zz" (UID: "836dca77-634b-42e7-bf76-74b582e0969d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 13:51:32 crc kubenswrapper[4902]: I1009 13:51:32.525777 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-ltv8d" Oct 09 13:51:32 crc kubenswrapper[4902]: I1009 13:51:32.544995 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8l7cj" podStartSLOduration=19.544966085 podStartE2EDuration="19.544966085s" podCreationTimestamp="2025-10-09 13:51:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 13:51:32.524894547 +0000 UTC m=+39.722753621" watchObservedRunningTime="2025-10-09 13:51:32.544966085 +0000 UTC m=+39.742825149" Oct 09 13:51:32 crc kubenswrapper[4902]: I1009 13:51:32.549543 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-qvwhz"] Oct 09 13:51:32 crc kubenswrapper[4902]: I1009 13:51:32.549622 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-kxnmp"] Oct 09 13:51:32 crc kubenswrapper[4902]: I1009 13:51:32.559956 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7wjtm"] Oct 09 13:51:32 crc kubenswrapper[4902]: I1009 13:51:32.563962 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4rdw2"] Oct 09 13:51:32 crc kubenswrapper[4902]: I1009 13:51:32.564334 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29333625-8hwfc"] Oct 09 13:51:32 crc kubenswrapper[4902]: I1009 13:51:32.587490 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-d5zks" podStartSLOduration=19.587439359 podStartE2EDuration="19.587439359s" podCreationTimestamp="2025-10-09 13:51:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 13:51:32.57416034 +0000 UTC m=+39.772019424" watchObservedRunningTime="2025-10-09 13:51:32.587439359 +0000 UTC m=+39.785298423" Oct 09 13:51:32 crc kubenswrapper[4902]: I1009 13:51:32.619062 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 13:51:32 crc kubenswrapper[4902]: E1009 13:51:32.622647 4902 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-09 13:51:33.12262916 +0000 UTC m=+40.320488224 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 13:51:32 crc kubenswrapper[4902]: I1009 13:51:32.624233 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-ltv8d" podStartSLOduration=19.624206337 podStartE2EDuration="19.624206337s" podCreationTimestamp="2025-10-09 13:51:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 13:51:32.621104366 +0000 UTC m=+39.818963430" watchObservedRunningTime="2025-10-09 13:51:32.624206337 +0000 UTC m=+39.822065681" Oct 09 13:51:32 crc kubenswrapper[4902]: W1009 13:51:32.632946 4902 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3c2d5abc_8893_4dda_b7a9_1de1560e9381.slice/crio-dcf8899390c5685fb88f25d7152ea9c46d86521901ee8a1c6c28141b5e4f361d WatchSource:0}: Error finding container dcf8899390c5685fb88f25d7152ea9c46d86521901ee8a1c6c28141b5e4f361d: Status 404 returned error can't find the container with id dcf8899390c5685fb88f25d7152ea9c46d86521901ee8a1c6c28141b5e4f361d Oct 09 13:51:32 crc kubenswrapper[4902]: I1009 13:51:32.673962 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-6rzkc" podStartSLOduration=19.673923794 podStartE2EDuration="19.673923794s" podCreationTimestamp="2025-10-09 13:51:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 13:51:32.666337201 +0000 UTC m=+39.864196285" watchObservedRunningTime="2025-10-09 13:51:32.673923794 +0000 UTC m=+39.871782858" Oct 09 13:51:32 crc kubenswrapper[4902]: W1009 13:51:32.675169 4902 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod70af937a_5c69_4893_a069_9d968f1b1b9c.slice/crio-0558284aff3820291a7baca4326a503c4fb6da49f84cd543ba59c8396fef4d1b WatchSource:0}: Error finding container 0558284aff3820291a7baca4326a503c4fb6da49f84cd543ba59c8396fef4d1b: Status 404 returned error can't find the container with id 0558284aff3820291a7baca4326a503c4fb6da49f84cd543ba59c8396fef4d1b Oct 09 13:51:32 crc kubenswrapper[4902]: W1009 13:51:32.676197 4902 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod71406502_8050_4f28_bc5f_b5bbeeafd52f.slice/crio-f61a83662d24f300b220b47418b6ba7b389baad300f7756be43c24a6366730d5 WatchSource:0}: Error finding container f61a83662d24f300b220b47418b6ba7b389baad300f7756be43c24a6366730d5: Status 404 returned error can't find the container with id f61a83662d24f300b220b47418b6ba7b389baad300f7756be43c24a6366730d5 Oct 09 13:51:32 crc kubenswrapper[4902]: W1009 13:51:32.700795 4902 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod472a3084_0b59_487f_b179_bfe5fa35f4a9.slice/crio-c74bdfcade9917f2a6dbe6e35e4542a8d559f019741e8db460ee0b1432772bbf WatchSource:0}: Error finding container c74bdfcade9917f2a6dbe6e35e4542a8d559f019741e8db460ee0b1432772bbf: Status 404 returned error can't find the container with id c74bdfcade9917f2a6dbe6e35e4542a8d559f019741e8db460ee0b1432772bbf Oct 09 13:51:32 crc kubenswrapper[4902]: I1009 13:51:32.722690 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q76zz\" (UID: \"836dca77-634b-42e7-bf76-74b582e0969d\") " pod="openshift-image-registry/image-registry-697d97f7c8-q76zz" Oct 09 13:51:32 crc kubenswrapper[4902]: E1009 13:51:32.723158 4902 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 13:51:33.223130415 +0000 UTC m=+40.420989479 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q76zz" (UID: "836dca77-634b-42e7-bf76-74b582e0969d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 13:51:32 crc kubenswrapper[4902]: I1009 13:51:32.824620 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 13:51:32 crc kubenswrapper[4902]: E1009 13:51:32.825372 4902 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 13:51:33.325320049 +0000 UTC m=+40.523179113 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 13:51:32 crc kubenswrapper[4902]: I1009 13:51:32.825531 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q76zz\" (UID: \"836dca77-634b-42e7-bf76-74b582e0969d\") " pod="openshift-image-registry/image-registry-697d97f7c8-q76zz" Oct 09 13:51:32 crc kubenswrapper[4902]: E1009 13:51:32.826012 4902 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 13:51:33.325989149 +0000 UTC m=+40.523848213 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q76zz" (UID: "836dca77-634b-42e7-bf76-74b582e0969d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 13:51:32 crc kubenswrapper[4902]: I1009 13:51:32.939856 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 13:51:32 crc kubenswrapper[4902]: E1009 13:51:32.940801 4902 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 13:51:33.440777852 +0000 UTC m=+40.638636916 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 13:51:33 crc kubenswrapper[4902]: I1009 13:51:33.041733 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q76zz\" (UID: \"836dca77-634b-42e7-bf76-74b582e0969d\") " pod="openshift-image-registry/image-registry-697d97f7c8-q76zz" Oct 09 13:51:33 crc kubenswrapper[4902]: E1009 13:51:33.042545 4902 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 13:51:33.542529464 +0000 UTC m=+40.740388528 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q76zz" (UID: "836dca77-634b-42e7-bf76-74b582e0969d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 13:51:33 crc kubenswrapper[4902]: I1009 13:51:33.171333 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 13:51:33 crc kubenswrapper[4902]: E1009 13:51:33.171961 4902 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 13:51:33.671930875 +0000 UTC m=+40.869790109 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 13:51:33 crc kubenswrapper[4902]: I1009 13:51:33.272656 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q76zz\" (UID: \"836dca77-634b-42e7-bf76-74b582e0969d\") " pod="openshift-image-registry/image-registry-697d97f7c8-q76zz" Oct 09 13:51:33 crc kubenswrapper[4902]: E1009 13:51:33.273941 4902 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 13:51:33.773928814 +0000 UTC m=+40.971787878 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q76zz" (UID: "836dca77-634b-42e7-bf76-74b582e0969d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 13:51:33 crc kubenswrapper[4902]: I1009 13:51:33.318354 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-5tnbn"] Oct 09 13:51:33 crc kubenswrapper[4902]: I1009 13:51:33.376183 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 13:51:33 crc kubenswrapper[4902]: E1009 13:51:33.376618 4902 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 13:51:33.876601312 +0000 UTC m=+41.074460376 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 13:51:33 crc kubenswrapper[4902]: W1009 13:51:33.387230 4902 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod76bff3cb_cf9e_42cc_8b73_846ad6b38202.slice/crio-4bdc69a14238195d21de1b3f4adcaea40ef13ec4846ca48fe43379c5a5d13b0c WatchSource:0}: Error finding container 4bdc69a14238195d21de1b3f4adcaea40ef13ec4846ca48fe43379c5a5d13b0c: Status 404 returned error can't find the container with id 4bdc69a14238195d21de1b3f4adcaea40ef13ec4846ca48fe43379c5a5d13b0c Oct 09 13:51:33 crc kubenswrapper[4902]: I1009 13:51:33.443734 4902 patch_prober.go:28] interesting pod/router-default-5444994796-qhd5t container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 09 13:51:33 crc kubenswrapper[4902]: [-]has-synced failed: reason withheld Oct 09 13:51:33 crc kubenswrapper[4902]: [+]process-running ok Oct 09 13:51:33 crc kubenswrapper[4902]: healthz check failed Oct 09 13:51:33 crc kubenswrapper[4902]: I1009 13:51:33.444002 4902 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-qhd5t" podUID="6b7c78b1-850e-44f5-b50e-2e2ed4c0bb9c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 09 13:51:33 crc kubenswrapper[4902]: I1009 13:51:33.479378 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q76zz\" (UID: \"836dca77-634b-42e7-bf76-74b582e0969d\") " pod="openshift-image-registry/image-registry-697d97f7c8-q76zz" Oct 09 13:51:33 crc kubenswrapper[4902]: E1009 13:51:33.479841 4902 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 13:51:33.979823837 +0000 UTC m=+41.177682901 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q76zz" (UID: "836dca77-634b-42e7-bf76-74b582e0969d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 13:51:33 crc kubenswrapper[4902]: I1009 13:51:33.583903 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 13:51:33 crc kubenswrapper[4902]: E1009 13:51:33.584203 4902 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 13:51:34.084184825 +0000 UTC m=+41.282043889 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 13:51:33 crc kubenswrapper[4902]: I1009 13:51:33.638701 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-45ck4" event={"ID":"7ad909d3-8d75-49d3-83c1-3ef15b47d08d","Type":"ContainerStarted","Data":"9656e0fe6b017ab9b30fbb41aef7fc7a842b6eeea7a7784a431b4cc6fc29e3a4"} Oct 09 13:51:33 crc kubenswrapper[4902]: I1009 13:51:33.676406 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-b2mv7" event={"ID":"1fd89f10-e85a-4a04-9483-cbf8152d1ab5","Type":"ContainerStarted","Data":"0c9d734a9b2c925b58b2a795f85ecf431a6a5b50e19fce4e136d3a4137df586b"} Oct 09 13:51:33 crc kubenswrapper[4902]: I1009 13:51:33.692971 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q76zz\" (UID: \"836dca77-634b-42e7-bf76-74b582e0969d\") " pod="openshift-image-registry/image-registry-697d97f7c8-q76zz" Oct 09 13:51:33 crc kubenswrapper[4902]: E1009 13:51:33.695630 4902 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 13:51:34.19561571 +0000 UTC m=+41.393474774 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q76zz" (UID: "836dca77-634b-42e7-bf76-74b582e0969d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 13:51:33 crc kubenswrapper[4902]: I1009 13:51:33.724891 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4ffxb" event={"ID":"b7308dad-19a5-4675-9874-ee0a814d8aed","Type":"ContainerStarted","Data":"e04daa393c0506c6eeaa59358e8855a70b34ef92275779c51b61ce3b510e4b21"} Oct 09 13:51:33 crc kubenswrapper[4902]: I1009 13:51:33.752464 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-fthsz" event={"ID":"4251505b-3c9f-4b37-8805-8d984504a89e","Type":"ContainerStarted","Data":"e6c318e8a30f7b68f7e6284971d882764173f005fce90ab93f08c47f8f245770"} Oct 09 13:51:33 crc kubenswrapper[4902]: I1009 13:51:33.793174 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-n2nl8" event={"ID":"36774cc0-04ea-4cab-9385-e9dc295cf63f","Type":"ContainerStarted","Data":"85bd2d41aa4ffc331d309329211a25e4c4647e66b77c4d6b82f524fc6169ce05"} Oct 09 13:51:33 crc kubenswrapper[4902]: I1009 13:51:33.794704 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 13:51:33 crc kubenswrapper[4902]: E1009 13:51:33.795246 4902 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 13:51:34.295217287 +0000 UTC m=+41.493076351 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 13:51:33 crc kubenswrapper[4902]: I1009 13:51:33.814099 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29333625-8hwfc" event={"ID":"472a3084-0b59-487f-b179-bfe5fa35f4a9","Type":"ContainerStarted","Data":"c74bdfcade9917f2a6dbe6e35e4542a8d559f019741e8db460ee0b1432772bbf"} Oct 09 13:51:33 crc kubenswrapper[4902]: I1009 13:51:33.862768 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-bhzpg" event={"ID":"327b6d28-9130-4476-b8f2-edaf08da45ae","Type":"ContainerStarted","Data":"3116d5a50ae6fbb7e9239a6d88a477e8b5658c0b7391ee9bc532bb77c050814a"} Oct 09 13:51:33 crc kubenswrapper[4902]: I1009 13:51:33.863966 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-bhzpg" Oct 09 13:51:33 crc kubenswrapper[4902]: I1009 13:51:33.871313 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-snptm" event={"ID":"71406502-8050-4f28-bc5f-b5bbeeafd52f","Type":"ContainerStarted","Data":"f61a83662d24f300b220b47418b6ba7b389baad300f7756be43c24a6366730d5"} Oct 09 13:51:33 crc kubenswrapper[4902]: I1009 13:51:33.900460 4902 patch_prober.go:28] interesting pod/downloads-7954f5f757-bhzpg container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" start-of-body= Oct 09 13:51:33 crc kubenswrapper[4902]: I1009 13:51:33.900536 4902 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-bhzpg" podUID="327b6d28-9130-4476-b8f2-edaf08da45ae" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" Oct 09 13:51:33 crc kubenswrapper[4902]: I1009 13:51:33.903468 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q76zz\" (UID: \"836dca77-634b-42e7-bf76-74b582e0969d\") " pod="openshift-image-registry/image-registry-697d97f7c8-q76zz" Oct 09 13:51:33 crc kubenswrapper[4902]: E1009 13:51:33.905713 4902 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 13:51:34.405700694 +0000 UTC m=+41.603559748 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q76zz" (UID: "836dca77-634b-42e7-bf76-74b582e0969d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 13:51:33 crc kubenswrapper[4902]: I1009 13:51:33.911770 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-th8gh" event={"ID":"4bc89116-ce17-406a-9f82-b40535555c7f","Type":"ContainerStarted","Data":"80c6127cda7e480ae491f582bb65bdc346f68031036a816599f59949662ab49e"} Oct 09 13:51:33 crc kubenswrapper[4902]: I1009 13:51:33.911829 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-th8gh" Oct 09 13:51:33 crc kubenswrapper[4902]: I1009 13:51:33.923480 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-jgbgc" event={"ID":"07f980b9-45c2-48c2-a84e-225ca4b82a77","Type":"ContainerStarted","Data":"ce85d240d5fc77d5622f6c59b76c82c31efe6b1197f91db6f91986c1f8194b63"} Oct 09 13:51:33 crc kubenswrapper[4902]: I1009 13:51:33.939369 4902 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-th8gh container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.29:5443/healthz\": dial tcp 10.217.0.29:5443: connect: connection refused" start-of-body= Oct 09 13:51:33 crc kubenswrapper[4902]: I1009 13:51:33.939451 4902 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-th8gh" podUID="4bc89116-ce17-406a-9f82-b40535555c7f" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.29:5443/healthz\": dial tcp 10.217.0.29:5443: connect: connection refused" Oct 09 13:51:33 crc kubenswrapper[4902]: I1009 13:51:33.943322 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-997wv" event={"ID":"4f73c993-5f7c-46c4-92b1-f0b2f2ee2a8b","Type":"ContainerStarted","Data":"406cad1a5904d3fcad5c2b5275e4e2fd542dd3fa9dc29e82044a84378a885fbd"} Oct 09 13:51:33 crc kubenswrapper[4902]: I1009 13:51:33.953458 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-6rzkc" event={"ID":"b43c1099-b997-4be7-8390-a379e0dc5541","Type":"ContainerStarted","Data":"884a23e1af7e9f186b35ecc934e438f38ca8a4bf640c2107bd94a46f6169dd49"} Oct 09 13:51:33 crc kubenswrapper[4902]: I1009 13:51:33.955358 4902 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-6rzkc container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.35:8443/healthz\": dial tcp 10.217.0.35:8443: connect: connection refused" start-of-body= Oct 09 13:51:33 crc kubenswrapper[4902]: I1009 13:51:33.955446 4902 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-6rzkc" podUID="b43c1099-b997-4be7-8390-a379e0dc5541" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.35:8443/healthz\": dial tcp 10.217.0.35:8443: connect: connection refused" Oct 09 13:51:33 
crc kubenswrapper[4902]: I1009 13:51:33.957003 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-5tnbn" event={"ID":"76bff3cb-cf9e-42cc-8b73-846ad6b38202","Type":"ContainerStarted","Data":"4bdc69a14238195d21de1b3f4adcaea40ef13ec4846ca48fe43379c5a5d13b0c"} Oct 09 13:51:33 crc kubenswrapper[4902]: I1009 13:51:33.958557 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-qvwhz" event={"ID":"a2933475-7af8-41e3-9389-114c1969b030","Type":"ContainerStarted","Data":"0df32c7dca5eb011f8316801dcc478f0821eca5b355b56704fa9b1a9960dcca4"} Oct 09 13:51:33 crc kubenswrapper[4902]: I1009 13:51:33.963500 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-8db2k" event={"ID":"814706d7-474c-4b19-a3c2-5785ae74b60c","Type":"ContainerStarted","Data":"d19151221a504b7adf6214dd93217378ea4ae5d5cdbdf10b9082c83c147dfd29"} Oct 09 13:51:33 crc kubenswrapper[4902]: I1009 13:51:33.976087 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-kxnmp" event={"ID":"8621111f-521a-48d9-886b-20299f771b70","Type":"ContainerStarted","Data":"81e7d3d5a1775cf09595e483a170f007556210bd3dae01f9896d216725010eed"} Oct 09 13:51:33 crc kubenswrapper[4902]: I1009 13:51:33.990011 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-bszj2" event={"ID":"6f99a811-543c-4b99-a394-9d941401efff","Type":"ContainerStarted","Data":"871098493ef0b97292c9d9cfec907dd22c155e64f68b773faf49b0b47ccfe0c4"} Oct 09 13:51:34 crc kubenswrapper[4902]: I1009 13:51:34.013316 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 13:51:34 crc kubenswrapper[4902]: E1009 13:51:34.013463 4902 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 13:51:34.51340952 +0000 UTC m=+41.711268584 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 13:51:34 crc kubenswrapper[4902]: I1009 13:51:34.013724 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q76zz\" (UID: \"836dca77-634b-42e7-bf76-74b582e0969d\") " pod="openshift-image-registry/image-registry-697d97f7c8-q76zz" Oct 09 13:51:34 crc kubenswrapper[4902]: I1009 13:51:34.014996 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-fsvms" event={"ID":"22155c67-aed1-4a35-989a-2777d69b3ee5","Type":"ContainerStarted","Data":"58922a7c20a037678f4d0034e10f1e3c0373e4cb7d8e2ca686f31759eb669148"} Oct 09 13:51:34 crc kubenswrapper[4902]: E1009 13:51:34.018311 4902 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 13:51:34.515288045 +0000 UTC m=+41.713147109 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q76zz" (UID: "836dca77-634b-42e7-bf76-74b582e0969d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 13:51:34 crc kubenswrapper[4902]: I1009 13:51:34.032513 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4rdw2" event={"ID":"70af937a-5c69-4893-a069-9d968f1b1b9c","Type":"ContainerStarted","Data":"0558284aff3820291a7baca4326a503c4fb6da49f84cd543ba59c8396fef4d1b"} Oct 09 13:51:34 crc kubenswrapper[4902]: I1009 13:51:34.034449 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4rdw2" Oct 09 13:51:34 crc kubenswrapper[4902]: I1009 13:51:34.046009 4902 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-4rdw2 container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.34:8443/healthz\": dial tcp 10.217.0.34:8443: connect: connection refused" start-of-body= Oct 09 13:51:34 crc kubenswrapper[4902]: I1009 13:51:34.046072 4902 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4rdw2" podUID="70af937a-5c69-4893-a069-9d968f1b1b9c" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.34:8443/healthz\": dial tcp 10.217.0.34:8443: connect: connection refused" Oct 09 13:51:34 crc kubenswrapper[4902]: I1009 13:51:34.061978 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-service-ca-operator/service-ca-operator-777779d784-52czq" event={"ID":"4ff922ea-aac7-4128-b619-658b4f44ce6e","Type":"ContainerStarted","Data":"df1471c10631514b01f5e11144ceb669b543ee31c84fa5098b46001848dc12f8"} Oct 09 13:51:34 crc kubenswrapper[4902]: I1009 13:51:34.089293 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-xd6rl" event={"ID":"bde9070e-6d53-4cbe-9542-4dc5ba33a16d","Type":"ContainerStarted","Data":"95282f23c743381bee568e6b1656f832652f71f7084bf52bf948307a45cd2e64"} Oct 09 13:51:34 crc kubenswrapper[4902]: I1009 13:51:34.121042 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 13:51:34 crc kubenswrapper[4902]: E1009 13:51:34.121148 4902 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 13:51:34.621124806 +0000 UTC m=+41.818983870 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 13:51:34 crc kubenswrapper[4902]: I1009 13:51:34.124737 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q76zz\" (UID: \"836dca77-634b-42e7-bf76-74b582e0969d\") " pod="openshift-image-registry/image-registry-697d97f7c8-q76zz" Oct 09 13:51:34 crc kubenswrapper[4902]: E1009 13:51:34.125313 4902 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 13:51:34.625296328 +0000 UTC m=+41.823155392 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q76zz" (UID: "836dca77-634b-42e7-bf76-74b582e0969d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 13:51:34 crc kubenswrapper[4902]: I1009 13:51:34.133219 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-vdsh5" event={"ID":"0964a12e-7b75-401b-9547-49e5a924ef0b","Type":"ContainerStarted","Data":"b8b71edd3a925f71b121381dbe4f0703cdc89cc8ef159ce4fe0bce9c2fdc308f"} Oct 09 13:51:34 crc kubenswrapper[4902]: I1009 13:51:34.164138 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-d6wkv" event={"ID":"93785db6-c7f9-4d9e-9407-b3653a9aa360","Type":"ContainerStarted","Data":"e9815120efe1d478b3ec0bd85d0bd2203ec391a217e75129a79aaa7221dbca07"} Oct 09 13:51:34 crc kubenswrapper[4902]: I1009 13:51:34.164531 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-multus/cni-sysctl-allowlist-ds-d6wkv" Oct 09 13:51:34 crc kubenswrapper[4902]: I1009 13:51:34.170524 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-f47nt" event={"ID":"534001de-19e9-45bd-b2fe-42b9521447a0","Type":"ContainerStarted","Data":"a846a3292f0784dff28419529a04b064c1dc62b566c4b8f0ebaf9b8c5d68df43"} Oct 09 13:51:34 crc kubenswrapper[4902]: I1009 13:51:34.194460 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4cxxv" event={"ID":"58b6a424-7606-420a-802d-1886adaa3e3d","Type":"ContainerStarted","Data":"3387fc23f5cccf5eb625a46e1a1b0c010679d602a16b71b60927c3dfc8bf903e"} Oct 09 13:51:34 crc kubenswrapper[4902]: I1009 13:51:34.206789 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-pqbtl" event={"ID":"7ae518f0-243e-4916-89cb-0e621793d4db","Type":"ContainerStarted","Data":"09c75092a9765f117c8654bdd43d9a7568288d28353b1ace6022152808aea89d"} Oct 09 13:51:34 crc kubenswrapper[4902]: I1009 13:51:34.217696 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-qb2lj" event={"ID":"3c2d5abc-8893-4dda-b7a9-1de1560e9381","Type":"ContainerStarted","Data":"dcf8899390c5685fb88f25d7152ea9c46d86521901ee8a1c6c28141b5e4f361d"} Oct 09 13:51:34 crc kubenswrapper[4902]: I1009 13:51:34.226446 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 13:51:34 crc kubenswrapper[4902]: I1009 13:51:34.228826 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-multus/cni-sysctl-allowlist-ds-d6wkv" Oct 09 13:51:34 crc kubenswrapper[4902]: E1009 13:51:34.236009 4902 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-09 13:51:34.728681608 +0000 UTC m=+41.926540662 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 13:51:34 crc kubenswrapper[4902]: I1009 13:51:34.241054 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2224d" event={"ID":"b5ae0ea3-4302-4125-aee7-b6ee8276a000","Type":"ContainerStarted","Data":"7feab552e78e7054d3a46c560a721bcc88127fb51835b1b9c8ef188235277e3a"} Oct 09 13:51:34 crc kubenswrapper[4902]: I1009 13:51:34.245561 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rq4g9" event={"ID":"dfee5c28-dc5e-4f85-87d2-29925eeff49d","Type":"ContainerStarted","Data":"2a39aa06fb7200161bbcb474cded9da85f0e32c5c035a65bac18f2f6e2949712"} Oct 09 13:51:34 crc kubenswrapper[4902]: I1009 13:51:34.259754 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-f47nt" podStartSLOduration=20.259733457 podStartE2EDuration="20.259733457s" podCreationTimestamp="2025-10-09 13:51:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 13:51:34.222983811 +0000 UTC m=+41.420842885" watchObservedRunningTime="2025-10-09 13:51:34.259733457 +0000 UTC m=+41.457592521" Oct 09 13:51:34 crc kubenswrapper[4902]: I1009 13:51:34.260836 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-vdsh5" podStartSLOduration=21.260824789 podStartE2EDuration="21.260824789s" podCreationTimestamp="2025-10-09 13:51:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 13:51:34.257629936 +0000 UTC m=+41.455489020" watchObservedRunningTime="2025-10-09 13:51:34.260824789 +0000 UTC m=+41.458683853" Oct 09 13:51:34 crc kubenswrapper[4902]: I1009 13:51:34.276573 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7wjtm" event={"ID":"84438d18-f42f-44cd-8129-7dfd9edfed87","Type":"ContainerStarted","Data":"48a674f2553894f87b177083446e95eb4ca6e0b2c1f5426ae5185a8144b33e26"} Oct 09 13:51:34 crc kubenswrapper[4902]: I1009 13:51:34.277954 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7wjtm" Oct 09 13:51:34 crc kubenswrapper[4902]: I1009 13:51:34.290676 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-997wv" podStartSLOduration=7.290645093 podStartE2EDuration="7.290645093s" podCreationTimestamp="2025-10-09 13:51:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 13:51:34.280164896 +0000 UTC m=+41.478023960" watchObservedRunningTime="2025-10-09 13:51:34.290645093 +0000 
UTC m=+41.488504157" Oct 09 13:51:34 crc kubenswrapper[4902]: I1009 13:51:34.290864 4902 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-7wjtm container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.18:8443/healthz\": dial tcp 10.217.0.18:8443: connect: connection refused" start-of-body= Oct 09 13:51:34 crc kubenswrapper[4902]: I1009 13:51:34.290919 4902 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7wjtm" podUID="84438d18-f42f-44cd-8129-7dfd9edfed87" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.18:8443/healthz\": dial tcp 10.217.0.18:8443: connect: connection refused" Oct 09 13:51:34 crc kubenswrapper[4902]: I1009 13:51:34.325033 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-jcxjn" event={"ID":"4c13d916-e588-4488-af07-82d5990cba9e","Type":"ContainerStarted","Data":"7bee6f7607ff3b65e15f8c56e2e88ce0c11854ee5f6c8f07340a880367062088"} Oct 09 13:51:34 crc kubenswrapper[4902]: I1009 13:51:34.327850 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-jcxjn" Oct 09 13:51:34 crc kubenswrapper[4902]: I1009 13:51:34.336889 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4ffxb" podStartSLOduration=21.336857447 podStartE2EDuration="21.336857447s" podCreationTimestamp="2025-10-09 13:51:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 13:51:34.322471226 +0000 UTC m=+41.520330300" watchObservedRunningTime="2025-10-09 13:51:34.336857447 +0000 UTC m=+41.534716531" Oct 09 13:51:34 crc kubenswrapper[4902]: I1009 13:51:34.359061 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q76zz\" (UID: \"836dca77-634b-42e7-bf76-74b582e0969d\") " pod="openshift-image-registry/image-registry-697d97f7c8-q76zz" Oct 09 13:51:34 crc kubenswrapper[4902]: I1009 13:51:34.359798 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-8db2k" podStartSLOduration=20.359764458 podStartE2EDuration="20.359764458s" podCreationTimestamp="2025-10-09 13:51:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 13:51:34.358966445 +0000 UTC m=+41.556825529" watchObservedRunningTime="2025-10-09 13:51:34.359764458 +0000 UTC m=+41.557623522" Oct 09 13:51:34 crc kubenswrapper[4902]: E1009 13:51:34.360943 4902 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 13:51:34.860923182 +0000 UTC m=+42.058782326 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q76zz" (UID: "836dca77-634b-42e7-bf76-74b582e0969d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 13:51:34 crc kubenswrapper[4902]: I1009 13:51:34.364790 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-jw7l8" event={"ID":"28116576-5069-4dd6-90f1-31582eda88df","Type":"ContainerStarted","Data":"3646583eb5c8362dab8b66fe78c8d41ebe941b346172d7b693e9cbdddaf9e908"} Oct 09 13:51:34 crc kubenswrapper[4902]: I1009 13:51:34.365824 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-jw7l8" Oct 09 13:51:34 crc kubenswrapper[4902]: I1009 13:51:34.372605 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-kkvj6" event={"ID":"4d3c809f-9893-4b55-bb22-759885ab8a31","Type":"ContainerStarted","Data":"d364c34af071474cd3146e493a3633eedfac8cd93f0597178ea12123fe39041d"} Oct 09 13:51:34 crc kubenswrapper[4902]: I1009 13:51:34.416453 4902 generic.go:334] "Generic (PLEG): container finished" podID="2425c7bf-9eeb-4255-bfd0-1d8ef07d835b" containerID="711787d815a8cf371bbb30fd223ba13c6aa278ca79353edc12da4461e97fd4f9" exitCode=0 Oct 09 13:51:34 crc kubenswrapper[4902]: I1009 13:51:34.416514 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-nhhpn" event={"ID":"2425c7bf-9eeb-4255-bfd0-1d8ef07d835b","Type":"ContainerDied","Data":"711787d815a8cf371bbb30fd223ba13c6aa278ca79353edc12da4461e97fd4f9"} Oct 09 13:51:34 crc kubenswrapper[4902]: I1009 13:51:34.436809 4902 generic.go:334] "Generic (PLEG): container finished" podID="e714667b-061d-4127-8dd3-47e403ebe079" containerID="6ac768990de766c15893f1b3bfa3fcdf3e24329446b3e1124d8d74ac34388db5" exitCode=0 Oct 09 13:51:34 crc kubenswrapper[4902]: I1009 13:51:34.436901 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-mzkkz" event={"ID":"e714667b-061d-4127-8dd3-47e403ebe079","Type":"ContainerDied","Data":"6ac768990de766c15893f1b3bfa3fcdf3e24329446b3e1124d8d74ac34388db5"} Oct 09 13:51:34 crc kubenswrapper[4902]: I1009 13:51:34.437661 4902 patch_prober.go:28] interesting pod/router-default-5444994796-qhd5t container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 09 13:51:34 crc kubenswrapper[4902]: [-]has-synced failed: reason withheld Oct 09 13:51:34 crc kubenswrapper[4902]: [+]process-running ok Oct 09 13:51:34 crc kubenswrapper[4902]: healthz check failed Oct 09 13:51:34 crc kubenswrapper[4902]: I1009 13:51:34.437710 4902 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-qhd5t" podUID="6b7c78b1-850e-44f5-b50e-2e2ed4c0bb9c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 09 13:51:34 crc kubenswrapper[4902]: I1009 13:51:34.459883 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 13:51:34 crc kubenswrapper[4902]: E1009 13:51:34.460230 4902 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 13:51:34.960214562 +0000 UTC m=+42.158073626 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 13:51:34 crc kubenswrapper[4902]: I1009 13:51:34.463660 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qs2nx" event={"ID":"e8b3f1d0-56c3-4592-9346-4282c209c0b2","Type":"ContainerStarted","Data":"7f54f812317c281b42d76dc91200a8b6a79d200abb559b808292bfedb2d016bc"} Oct 09 13:51:34 crc kubenswrapper[4902]: I1009 13:51:34.477163 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4rdw2" podStartSLOduration=20.477143288 podStartE2EDuration="20.477143288s" podCreationTimestamp="2025-10-09 13:51:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 13:51:34.412859494 +0000 UTC m=+41.610718568" watchObservedRunningTime="2025-10-09 13:51:34.477143288 +0000 UTC m=+41.675002352" Oct 09 13:51:34 crc kubenswrapper[4902]: I1009 13:51:34.477555 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ttk5q" Oct 09 13:51:34 crc kubenswrapper[4902]: I1009 13:51:34.478642 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4cxxv" podStartSLOduration=20.478636111 podStartE2EDuration="20.478636111s" podCreationTimestamp="2025-10-09 13:51:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 13:51:34.474880571 +0000 UTC m=+41.672739625" watchObservedRunningTime="2025-10-09 13:51:34.478636111 +0000 UTC m=+41.676495175" Oct 09 13:51:34 crc kubenswrapper[4902]: I1009 13:51:34.515294 4902 scope.go:117] "RemoveContainer" containerID="801afd8fea3db4a9b2864ac031b128c77ffb050da8e948d01378e85a22f075a0" Oct 09 13:51:34 crc kubenswrapper[4902]: I1009 13:51:34.531110 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-52czq" podStartSLOduration=20.531091268 podStartE2EDuration="20.531091268s" podCreationTimestamp="2025-10-09 13:51:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 13:51:34.510124414 +0000 UTC m=+41.707983478" watchObservedRunningTime="2025-10-09 13:51:34.531091268 +0000 UTC 
m=+41.728950322" Oct 09 13:51:34 crc kubenswrapper[4902]: I1009 13:51:34.564379 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q76zz\" (UID: \"836dca77-634b-42e7-bf76-74b582e0969d\") " pod="openshift-image-registry/image-registry-697d97f7c8-q76zz" Oct 09 13:51:34 crc kubenswrapper[4902]: E1009 13:51:34.572563 4902 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 13:51:35.072538673 +0000 UTC m=+42.270397737 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q76zz" (UID: "836dca77-634b-42e7-bf76-74b582e0969d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 13:51:34 crc kubenswrapper[4902]: I1009 13:51:34.607515 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-bszj2" podStartSLOduration=20.607491007 podStartE2EDuration="20.607491007s" podCreationTimestamp="2025-10-09 13:51:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 13:51:34.597118373 +0000 UTC m=+41.794977437" watchObservedRunningTime="2025-10-09 13:51:34.607491007 +0000 UTC m=+41.805350071" Oct 09 13:51:34 crc kubenswrapper[4902]: I1009 13:51:34.609313 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-jh6wc" Oct 09 13:51:34 crc kubenswrapper[4902]: I1009 13:51:34.609514 4902 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 09 13:51:34 crc kubenswrapper[4902]: I1009 13:51:34.667374 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 13:51:34 crc kubenswrapper[4902]: E1009 13:51:34.667778 4902 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 13:51:35.167759103 +0000 UTC m=+42.365618167 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 13:51:34 crc kubenswrapper[4902]: I1009 13:51:34.670724 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8nknx" podStartSLOduration=20.670681198 podStartE2EDuration="20.670681198s" podCreationTimestamp="2025-10-09 13:51:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 13:51:34.659198952 +0000 UTC m=+41.857058026" watchObservedRunningTime="2025-10-09 13:51:34.670681198 +0000 UTC m=+41.868540272" Oct 09 13:51:34 crc kubenswrapper[4902]: I1009 13:51:34.745289 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4cxxv" Oct 09 13:51:34 crc kubenswrapper[4902]: I1009 13:51:34.749815 4902 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4cxxv" Oct 09 13:51:34 crc kubenswrapper[4902]: I1009 13:51:34.768764 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-jh6wc" Oct 09 13:51:34 crc kubenswrapper[4902]: I1009 13:51:34.770754 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q76zz\" (UID: \"836dca77-634b-42e7-bf76-74b582e0969d\") " pod="openshift-image-registry/image-registry-697d97f7c8-q76zz" Oct 09 13:51:34 crc kubenswrapper[4902]: I1009 13:51:34.770872 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-n2nl8" podStartSLOduration=21.770861554 podStartE2EDuration="21.770861554s" podCreationTimestamp="2025-10-09 13:51:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 13:51:34.726629998 +0000 UTC m=+41.924489062" watchObservedRunningTime="2025-10-09 13:51:34.770861554 +0000 UTC m=+41.968720618" Oct 09 13:51:34 crc kubenswrapper[4902]: E1009 13:51:34.771227 4902 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 13:51:35.271207254 +0000 UTC m=+42.469066318 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q76zz" (UID: "836dca77-634b-42e7-bf76-74b582e0969d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 13:51:34 crc kubenswrapper[4902]: I1009 13:51:34.780266 4902 patch_prober.go:28] interesting pod/apiserver-7bbb656c7d-4cxxv container/oauth-apiserver namespace/openshift-oauth-apiserver: Startup probe status=failure output="Get \"https://10.217.0.20:8443/livez\": dial tcp 10.217.0.20:8443: connect: connection refused" start-of-body= Oct 09 13:51:34 crc kubenswrapper[4902]: I1009 13:51:34.780342 4902 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4cxxv" podUID="58b6a424-7606-420a-802d-1886adaa3e3d" containerName="oauth-apiserver" probeResult="failure" output="Get \"https://10.217.0.20:8443/livez\": dial tcp 10.217.0.20:8443: connect: connection refused" Oct 09 13:51:34 crc kubenswrapper[4902]: I1009 13:51:34.780825 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-jw7l8" Oct 09 13:51:34 crc kubenswrapper[4902]: I1009 13:51:34.826299 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-th8gh" podStartSLOduration=20.826279527 podStartE2EDuration="20.826279527s" podCreationTimestamp="2025-10-09 13:51:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 13:51:34.769895095 +0000 UTC m=+41.967754169" watchObservedRunningTime="2025-10-09 13:51:34.826279527 +0000 UTC m=+42.024138591" Oct 09 13:51:34 crc kubenswrapper[4902]: I1009 13:51:34.835775 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-xd6rl" podStartSLOduration=21.835747375 podStartE2EDuration="21.835747375s" podCreationTimestamp="2025-10-09 13:51:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 13:51:34.823188037 +0000 UTC m=+42.021047121" watchObservedRunningTime="2025-10-09 13:51:34.835747375 +0000 UTC m=+42.033606439" Oct 09 13:51:34 crc kubenswrapper[4902]: I1009 13:51:34.847249 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-b2mv7" podStartSLOduration=21.847233081 podStartE2EDuration="21.847233081s" podCreationTimestamp="2025-10-09 13:51:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 13:51:34.846816549 +0000 UTC m=+42.044675623" watchObservedRunningTime="2025-10-09 13:51:34.847233081 +0000 UTC m=+42.045092145" Oct 09 13:51:34 crc kubenswrapper[4902]: I1009 13:51:34.871760 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 13:51:34 crc kubenswrapper[4902]: E1009 13:51:34.872045 4902 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 13:51:35.372029948 +0000 UTC m=+42.569889012 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 13:51:34 crc kubenswrapper[4902]: I1009 13:51:34.885383 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/cni-sysctl-allowlist-ds-d6wkv" podStartSLOduration=7.885365059 podStartE2EDuration="7.885365059s" podCreationTimestamp="2025-10-09 13:51:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 13:51:34.883575576 +0000 UTC m=+42.081434650" watchObservedRunningTime="2025-10-09 13:51:34.885365059 +0000 UTC m=+42.083224123" Oct 09 13:51:34 crc kubenswrapper[4902]: I1009 13:51:34.920632 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-bhzpg" podStartSLOduration=21.920616272 podStartE2EDuration="21.920616272s" podCreationTimestamp="2025-10-09 13:51:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 13:51:34.918474719 +0000 UTC m=+42.116333793" watchObservedRunningTime="2025-10-09 13:51:34.920616272 +0000 UTC m=+42.118475336" Oct 09 13:51:34 crc kubenswrapper[4902]: I1009 13:51:34.964065 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-45ck4" podStartSLOduration=20.964049144 podStartE2EDuration="20.964049144s" podCreationTimestamp="2025-10-09 13:51:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 13:51:34.944046058 +0000 UTC m=+42.141905142" watchObservedRunningTime="2025-10-09 13:51:34.964049144 +0000 UTC m=+42.161908208" Oct 09 13:51:35 crc kubenswrapper[4902]: I1009 13:51:34.996743 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q76zz\" (UID: \"836dca77-634b-42e7-bf76-74b582e0969d\") " pod="openshift-image-registry/image-registry-697d97f7c8-q76zz" Oct 09 13:51:35 crc kubenswrapper[4902]: E1009 13:51:35.008114 4902 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 13:51:35.508085324 +0000 UTC m=+42.705944388 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q76zz" (UID: "836dca77-634b-42e7-bf76-74b582e0969d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 13:51:35 crc kubenswrapper[4902]: I1009 13:51:35.071487 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rq4g9" podStartSLOduration=22.071458111 podStartE2EDuration="22.071458111s" podCreationTimestamp="2025-10-09 13:51:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 13:51:35.070276417 +0000 UTC m=+42.268135481" watchObservedRunningTime="2025-10-09 13:51:35.071458111 +0000 UTC m=+42.269317175" Oct 09 13:51:35 crc kubenswrapper[4902]: I1009 13:51:35.097723 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 13:51:35 crc kubenswrapper[4902]: E1009 13:51:35.098048 4902 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 13:51:35.59803115 +0000 UTC m=+42.795890214 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 13:51:35 crc kubenswrapper[4902]: I1009 13:51:35.164153 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-jw7l8" podStartSLOduration=22.164135007 podStartE2EDuration="22.164135007s" podCreationTimestamp="2025-10-09 13:51:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 13:51:35.127620337 +0000 UTC m=+42.325479411" watchObservedRunningTime="2025-10-09 13:51:35.164135007 +0000 UTC m=+42.361994061" Oct 09 13:51:35 crc kubenswrapper[4902]: I1009 13:51:35.208723 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q76zz\" (UID: \"836dca77-634b-42e7-bf76-74b582e0969d\") " pod="openshift-image-registry/image-registry-697d97f7c8-q76zz" Oct 09 13:51:35 crc kubenswrapper[4902]: E1009 13:51:35.209261 4902 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 13:51:35.709241788 +0000 UTC m=+42.907100852 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q76zz" (UID: "836dca77-634b-42e7-bf76-74b582e0969d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 13:51:35 crc kubenswrapper[4902]: I1009 13:51:35.223534 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-kkvj6" podStartSLOduration=21.223515457 podStartE2EDuration="21.223515457s" podCreationTimestamp="2025-10-09 13:51:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 13:51:35.197437233 +0000 UTC m=+42.395296307" watchObservedRunningTime="2025-10-09 13:51:35.223515457 +0000 UTC m=+42.421374521" Oct 09 13:51:35 crc kubenswrapper[4902]: I1009 13:51:35.224938 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-pqbtl" podStartSLOduration=21.224932098 podStartE2EDuration="21.224932098s" podCreationTimestamp="2025-10-09 13:51:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 13:51:35.222290641 +0000 UTC m=+42.420149705" watchObservedRunningTime="2025-10-09 13:51:35.224932098 +0000 UTC m=+42.422791162" Oct 09 13:51:35 crc kubenswrapper[4902]: I1009 13:51:35.251045 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7wjtm" podStartSLOduration=21.251017892 podStartE2EDuration="21.251017892s" podCreationTimestamp="2025-10-09 13:51:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 13:51:35.248329114 +0000 UTC m=+42.446188198" watchObservedRunningTime="2025-10-09 13:51:35.251017892 +0000 UTC m=+42.448876956" Oct 09 13:51:35 crc kubenswrapper[4902]: I1009 13:51:35.301372 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qs2nx" podStartSLOduration=21.301355307 podStartE2EDuration="21.301355307s" podCreationTimestamp="2025-10-09 13:51:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 13:51:35.300450581 +0000 UTC m=+42.498309645" watchObservedRunningTime="2025-10-09 13:51:35.301355307 +0000 UTC m=+42.499214371" Oct 09 13:51:35 crc kubenswrapper[4902]: I1009 13:51:35.311026 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 13:51:35 crc kubenswrapper[4902]: E1009 13:51:35.311434 4902 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-09 13:51:35.811403392 +0000 UTC m=+43.009262456 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 13:51:35 crc kubenswrapper[4902]: I1009 13:51:35.327915 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-jcxjn" podStartSLOduration=8.327896775 podStartE2EDuration="8.327896775s" podCreationTimestamp="2025-10-09 13:51:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 13:51:35.325573217 +0000 UTC m=+42.523432281" watchObservedRunningTime="2025-10-09 13:51:35.327896775 +0000 UTC m=+42.525755839" Oct 09 13:51:35 crc kubenswrapper[4902]: I1009 13:51:35.415653 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q76zz\" (UID: \"836dca77-634b-42e7-bf76-74b582e0969d\") " pod="openshift-image-registry/image-registry-697d97f7c8-q76zz" Oct 09 13:51:35 crc kubenswrapper[4902]: E1009 13:51:35.416199 4902 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 13:51:35.916188532 +0000 UTC m=+43.114047596 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q76zz" (UID: "836dca77-634b-42e7-bf76-74b582e0969d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 13:51:35 crc kubenswrapper[4902]: I1009 13:51:35.443004 4902 patch_prober.go:28] interesting pod/router-default-5444994796-qhd5t container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 09 13:51:35 crc kubenswrapper[4902]: [-]has-synced failed: reason withheld Oct 09 13:51:35 crc kubenswrapper[4902]: [+]process-running ok Oct 09 13:51:35 crc kubenswrapper[4902]: healthz check failed Oct 09 13:51:35 crc kubenswrapper[4902]: I1009 13:51:35.443072 4902 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-qhd5t" podUID="6b7c78b1-850e-44f5-b50e-2e2ed4c0bb9c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 09 13:51:35 crc kubenswrapper[4902]: I1009 13:51:35.508559 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-kxnmp" event={"ID":"8621111f-521a-48d9-886b-20299f771b70","Type":"ContainerStarted","Data":"dc0eae6401205882fa4fb215414be76d2768f0708c3e6085e217d94116737c9a"} Oct 09 13:51:35 crc kubenswrapper[4902]: I1009 13:51:35.508610 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-kxnmp" event={"ID":"8621111f-521a-48d9-886b-20299f771b70","Type":"ContainerStarted","Data":"caf47a4f4aebca7497dfd981c3e164af3507d1c78b6f11b5a7db66f1bdd81800"} Oct 09 13:51:35 crc kubenswrapper[4902]: I1009 13:51:35.519586 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 13:51:35 crc kubenswrapper[4902]: E1009 13:51:35.519969 4902 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 13:51:36.019951462 +0000 UTC m=+43.217810526 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 13:51:35 crc kubenswrapper[4902]: I1009 13:51:35.535897 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-pqbtl" event={"ID":"7ae518f0-243e-4916-89cb-0e621793d4db","Type":"ContainerStarted","Data":"11e3670144ebee669c41befc43c505cedd350f907a46aae74afddd77f0e5fb21"} Oct 09 13:51:35 crc kubenswrapper[4902]: I1009 13:51:35.584116 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-kxnmp" podStartSLOduration=21.584079441 podStartE2EDuration="21.584079441s" podCreationTimestamp="2025-10-09 13:51:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 13:51:35.583983418 +0000 UTC m=+42.781842502" watchObservedRunningTime="2025-10-09 13:51:35.584079441 +0000 UTC m=+42.781938505" Oct 09 13:51:35 crc kubenswrapper[4902]: I1009 13:51:35.587703 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qs2nx" event={"ID":"e8b3f1d0-56c3-4592-9346-4282c209c0b2","Type":"ContainerStarted","Data":"554f3d39ff6dbc02ab8a466d607f45629ba960a7e77796c5d0ac65a929bc02b9"} Oct 09 13:51:35 crc kubenswrapper[4902]: I1009 13:51:35.618705 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-n2nl8" event={"ID":"36774cc0-04ea-4cab-9385-e9dc295cf63f","Type":"ContainerStarted","Data":"dc0e52ae8d93d93f688b2ef7d671cc7264653aa516a4f8fbec2e82b6491751a7"} Oct 09 13:51:35 crc kubenswrapper[4902]: I1009 13:51:35.632024 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q76zz\" (UID: \"836dca77-634b-42e7-bf76-74b582e0969d\") " pod="openshift-image-registry/image-registry-697d97f7c8-q76zz" Oct 09 13:51:35 crc kubenswrapper[4902]: E1009 13:51:35.635509 4902 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 13:51:36.135491018 +0000 UTC m=+43.333350082 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q76zz" (UID: "836dca77-634b-42e7-bf76-74b582e0969d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 13:51:35 crc kubenswrapper[4902]: I1009 13:51:35.647746 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-52czq" event={"ID":"4ff922ea-aac7-4128-b619-658b4f44ce6e","Type":"ContainerStarted","Data":"444ccc04bfc187422cc57b1346606abf9974b4305317a25cd619e4e1c97b825c"} Oct 09 13:51:35 crc kubenswrapper[4902]: I1009 13:51:35.697824 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-5tnbn" event={"ID":"76bff3cb-cf9e-42cc-8b73-846ad6b38202","Type":"ContainerStarted","Data":"23d5abc8311a21e05b00c2d1514df29af199d9a19c98d5a702ba2dcf8fe23863"} Oct 09 13:51:35 crc kubenswrapper[4902]: I1009 13:51:35.698343 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-5tnbn" event={"ID":"76bff3cb-cf9e-42cc-8b73-846ad6b38202","Type":"ContainerStarted","Data":"654c00fec1c1a27effd0c0b99ead048a68f44b35cb0a563903edecb0a2a33c46"} Oct 09 13:51:35 crc kubenswrapper[4902]: I1009 13:51:35.729535 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-qb2lj" event={"ID":"3c2d5abc-8893-4dda-b7a9-1de1560e9381","Type":"ContainerStarted","Data":"24cc95adcf1d7d1603793e7213b4566b4a9aff8a87bee62a3a60ccdee2136e5b"} Oct 09 13:51:35 crc kubenswrapper[4902]: I1009 13:51:35.729604 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-qb2lj" event={"ID":"3c2d5abc-8893-4dda-b7a9-1de1560e9381","Type":"ContainerStarted","Data":"235bae83a558135754734406b530f6af7248b09c74d629a6235f7e9276f57d6b"} Oct 09 13:51:35 crc kubenswrapper[4902]: I1009 13:51:35.734259 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 13:51:35 crc kubenswrapper[4902]: E1009 13:51:35.736713 4902 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 13:51:36.236686923 +0000 UTC m=+43.434545987 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 13:51:35 crc kubenswrapper[4902]: I1009 13:51:35.750556 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-jcxjn" event={"ID":"4c13d916-e588-4488-af07-82d5990cba9e","Type":"ContainerStarted","Data":"04e934c34539d710d737ed9a99a55ab45869aff23fb8e67850d4300ebc9aa2dd"} Oct 09 13:51:35 crc kubenswrapper[4902]: I1009 13:51:35.791193 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2224d" event={"ID":"b5ae0ea3-4302-4125-aee7-b6ee8276a000","Type":"ContainerStarted","Data":"27ba2b27c8c10419d0aed14eed32e20e88e554b918c1b1f8c5f19e9ffb591e01"} Oct 09 13:51:35 crc kubenswrapper[4902]: I1009 13:51:35.791246 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2224d" event={"ID":"b5ae0ea3-4302-4125-aee7-b6ee8276a000","Type":"ContainerStarted","Data":"d840beaf6d2bdf8f33b41637da8be1011c5d0f6018cef3c26388f9d60429cb32"} Oct 09 13:51:35 crc kubenswrapper[4902]: I1009 13:51:35.798337 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2224d" Oct 09 13:51:35 crc kubenswrapper[4902]: I1009 13:51:35.798942 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-5tnbn" podStartSLOduration=22.798931067 podStartE2EDuration="22.798931067s" podCreationTimestamp="2025-10-09 13:51:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 13:51:35.778851908 +0000 UTC m=+42.976710982" watchObservedRunningTime="2025-10-09 13:51:35.798931067 +0000 UTC m=+42.996790131" Oct 09 13:51:35 crc kubenswrapper[4902]: I1009 13:51:35.799522 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-qb2lj" podStartSLOduration=21.799514474 podStartE2EDuration="21.799514474s" podCreationTimestamp="2025-10-09 13:51:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 13:51:35.799189634 +0000 UTC m=+42.997048698" watchObservedRunningTime="2025-10-09 13:51:35.799514474 +0000 UTC m=+42.997373538" Oct 09 13:51:35 crc kubenswrapper[4902]: I1009 13:51:35.836651 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q76zz\" (UID: \"836dca77-634b-42e7-bf76-74b582e0969d\") " pod="openshift-image-registry/image-registry-697d97f7c8-q76zz" Oct 09 13:51:35 crc kubenswrapper[4902]: E1009 13:51:35.838719 4902 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-10-09 13:51:36.338701252 +0000 UTC m=+43.536560396 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q76zz" (UID: "836dca77-634b-42e7-bf76-74b582e0969d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 13:51:35 crc kubenswrapper[4902]: I1009 13:51:35.856707 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-jgbgc" event={"ID":"07f980b9-45c2-48c2-a84e-225ca4b82a77","Type":"ContainerStarted","Data":"65294fc1f9bfaada87d219f60ab4cb31a9472ffd672f83c428a1c8404f41f7ee"} Oct 09 13:51:35 crc kubenswrapper[4902]: I1009 13:51:35.898884 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-jgbgc" podStartSLOduration=21.898856604 podStartE2EDuration="21.898856604s" podCreationTimestamp="2025-10-09 13:51:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 13:51:35.896850596 +0000 UTC m=+43.094709660" watchObservedRunningTime="2025-10-09 13:51:35.898856604 +0000 UTC m=+43.096715668" Oct 09 13:51:35 crc kubenswrapper[4902]: I1009 13:51:35.899940 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2224d" podStartSLOduration=21.899932596 podStartE2EDuration="21.899932596s" podCreationTimestamp="2025-10-09 13:51:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 13:51:35.85398245 +0000 UTC m=+43.051841514" watchObservedRunningTime="2025-10-09 13:51:35.899932596 +0000 UTC m=+43.097791680" Oct 09 13:51:35 crc kubenswrapper[4902]: I1009 13:51:35.912131 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4ffxb" event={"ID":"b7308dad-19a5-4675-9874-ee0a814d8aed","Type":"ContainerStarted","Data":"a65efe7d7075725ad49b7a2bd093400756e2c58826361e4252358dbac544cc9f"} Oct 09 13:51:35 crc kubenswrapper[4902]: I1009 13:51:35.942654 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-fthsz" event={"ID":"4251505b-3c9f-4b37-8805-8d984504a89e","Type":"ContainerStarted","Data":"da014069c30bb034b4dae73994879d6c8a0c9012f4f1baad71ce290a24f11a08"} Oct 09 13:51:35 crc kubenswrapper[4902]: I1009 13:51:35.942930 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 13:51:35 crc kubenswrapper[4902]: E1009 13:51:35.943348 4902 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-09 13:51:36.443326967 +0000 UTC m=+43.641186031 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 13:51:35 crc kubenswrapper[4902]: I1009 13:51:35.943517 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q76zz\" (UID: \"836dca77-634b-42e7-bf76-74b582e0969d\") " pod="openshift-image-registry/image-registry-697d97f7c8-q76zz" Oct 09 13:51:35 crc kubenswrapper[4902]: E1009 13:51:35.945005 4902 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 13:51:36.444988836 +0000 UTC m=+43.642847910 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q76zz" (UID: "836dca77-634b-42e7-bf76-74b582e0969d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 13:51:35 crc kubenswrapper[4902]: I1009 13:51:35.980654 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4rdw2" event={"ID":"70af937a-5c69-4893-a069-9d968f1b1b9c","Type":"ContainerStarted","Data":"36d10b21c3c79175070356bdfff87e3b0304f17ce73fc4a42d3ffd361a7af4e7"} Oct 09 13:51:36 crc kubenswrapper[4902]: I1009 13:51:36.012976 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-fsvms" event={"ID":"22155c67-aed1-4a35-989a-2777d69b3ee5","Type":"ContainerStarted","Data":"cfda3b165e7676e524ba90c2d5b2fb77085458a51ff15e321524a899e9aa9172"} Oct 09 13:51:36 crc kubenswrapper[4902]: I1009 13:51:36.022556 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4rdw2" Oct 09 13:51:36 crc kubenswrapper[4902]: I1009 13:51:36.046917 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 13:51:36 crc kubenswrapper[4902]: E1009 13:51:36.047027 4902 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 13:51:36.547009595 +0000 UTC m=+43.744868659 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 13:51:36 crc kubenswrapper[4902]: I1009 13:51:36.047452 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q76zz\" (UID: \"836dca77-634b-42e7-bf76-74b582e0969d\") " pod="openshift-image-registry/image-registry-697d97f7c8-q76zz" Oct 09 13:51:36 crc kubenswrapper[4902]: E1009 13:51:36.048369 4902 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 13:51:36.548345484 +0000 UTC m=+43.746204548 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q76zz" (UID: "836dca77-634b-42e7-bf76-74b582e0969d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 13:51:36 crc kubenswrapper[4902]: I1009 13:51:36.048765 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8nknx" event={"ID":"d6d59de9-a767-42f2-ae5d-e31667c52966","Type":"ContainerStarted","Data":"55eacf90a7ead8a255c894972a80ea7f57e999f23b336cf4eade200052fb0616"} Oct 09 13:51:36 crc kubenswrapper[4902]: I1009 13:51:36.059899 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-fthsz" podStartSLOduration=9.059876622 podStartE2EDuration="9.059876622s" podCreationTimestamp="2025-10-09 13:51:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 13:51:36.053006471 +0000 UTC m=+43.250865535" watchObservedRunningTime="2025-10-09 13:51:36.059876622 +0000 UTC m=+43.257735686" Oct 09 13:51:36 crc kubenswrapper[4902]: I1009 13:51:36.069250 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-nhhpn" event={"ID":"2425c7bf-9eeb-4255-bfd0-1d8ef07d835b","Type":"ContainerStarted","Data":"a9210a6cbd189141bb12bebc04a84c1816942c812d51c499b7f3d044b00aa574"} Oct 09 13:51:36 crc kubenswrapper[4902]: I1009 13:51:36.069321 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-nhhpn" Oct 09 13:51:36 crc kubenswrapper[4902]: I1009 13:51:36.091693 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-qvwhz" event={"ID":"a2933475-7af8-41e3-9389-114c1969b030","Type":"ContainerStarted","Data":"19ea7ce8a01477caf6d461c02d24b32e1db0481bec2b7ce8efe32b8a833166a1"} Oct 09 13:51:36 
crc kubenswrapper[4902]: I1009 13:51:36.092665 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-qvwhz" Oct 09 13:51:36 crc kubenswrapper[4902]: I1009 13:51:36.094386 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7wjtm" event={"ID":"84438d18-f42f-44cd-8129-7dfd9edfed87","Type":"ContainerStarted","Data":"2f50f57e0de02d61391d010ff9121bf4a6a4f434c68e5c6ce088f5bdf0a9e53f"} Oct 09 13:51:36 crc kubenswrapper[4902]: I1009 13:51:36.095817 4902 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-qvwhz container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.41:8080/healthz\": dial tcp 10.217.0.41:8080: connect: connection refused" start-of-body= Oct 09 13:51:36 crc kubenswrapper[4902]: I1009 13:51:36.095854 4902 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-qvwhz" podUID="a2933475-7af8-41e3-9389-114c1969b030" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.41:8080/healthz\": dial tcp 10.217.0.41:8080: connect: connection refused" Oct 09 13:51:36 crc kubenswrapper[4902]: I1009 13:51:36.097299 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-f47nt" event={"ID":"534001de-19e9-45bd-b2fe-42b9521447a0","Type":"ContainerStarted","Data":"4eb0a739d2633deae8faad57635c36ce5109e5717b5f5afe350bb2ec5a476546"} Oct 09 13:51:36 crc kubenswrapper[4902]: I1009 13:51:36.108556 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29333625-8hwfc" event={"ID":"472a3084-0b59-487f-b179-bfe5fa35f4a9","Type":"ContainerStarted","Data":"435a2b57957251a2cdf6ece37fdecfb6425022ee4dc2dd66661fd282f4db9451"} Oct 09 13:51:36 crc kubenswrapper[4902]: I1009 13:51:36.114691 4902 patch_prober.go:28] interesting pod/downloads-7954f5f757-bhzpg container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" start-of-body= Oct 09 13:51:36 crc kubenswrapper[4902]: I1009 13:51:36.114732 4902 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-bhzpg" podUID="327b6d28-9130-4476-b8f2-edaf08da45ae" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" Oct 09 13:51:36 crc kubenswrapper[4902]: I1009 13:51:36.129825 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-6rzkc" Oct 09 13:51:36 crc kubenswrapper[4902]: I1009 13:51:36.130210 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7wjtm" Oct 09 13:51:36 crc kubenswrapper[4902]: I1009 13:51:36.141579 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-th8gh" Oct 09 13:51:36 crc kubenswrapper[4902]: I1009 13:51:36.151586 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 13:51:36 crc kubenswrapper[4902]: E1009 13:51:36.154094 4902 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 13:51:36.654073822 +0000 UTC m=+43.851932886 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 13:51:36 crc kubenswrapper[4902]: I1009 13:51:36.210842 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-fsvms" podStartSLOduration=23.210817815 podStartE2EDuration="23.210817815s" podCreationTimestamp="2025-10-09 13:51:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 13:51:36.146085388 +0000 UTC m=+43.343944472" watchObservedRunningTime="2025-10-09 13:51:36.210817815 +0000 UTC m=+43.408676879" Oct 09 13:51:36 crc kubenswrapper[4902]: I1009 13:51:36.256345 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q76zz\" (UID: \"836dca77-634b-42e7-bf76-74b582e0969d\") " pod="openshift-image-registry/image-registry-697d97f7c8-q76zz" Oct 09 13:51:36 crc kubenswrapper[4902]: E1009 13:51:36.256687 4902 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 13:51:36.756675759 +0000 UTC m=+43.954534823 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q76zz" (UID: "836dca77-634b-42e7-bf76-74b582e0969d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 13:51:36 crc kubenswrapper[4902]: I1009 13:51:36.357056 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 13:51:36 crc kubenswrapper[4902]: E1009 13:51:36.357414 4902 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 13:51:36.85739226 +0000 UTC m=+44.055251324 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 13:51:36 crc kubenswrapper[4902]: I1009 13:51:36.392041 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29333625-8hwfc" podStartSLOduration=23.392024544 podStartE2EDuration="23.392024544s" podCreationTimestamp="2025-10-09 13:51:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 13:51:36.388541962 +0000 UTC m=+43.586401036" watchObservedRunningTime="2025-10-09 13:51:36.392024544 +0000 UTC m=+43.589883608" Oct 09 13:51:36 crc kubenswrapper[4902]: I1009 13:51:36.434923 4902 patch_prober.go:28] interesting pod/router-default-5444994796-qhd5t container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 09 13:51:36 crc kubenswrapper[4902]: [-]has-synced failed: reason withheld Oct 09 13:51:36 crc kubenswrapper[4902]: [+]process-running ok Oct 09 13:51:36 crc kubenswrapper[4902]: healthz check failed Oct 09 13:51:36 crc kubenswrapper[4902]: I1009 13:51:36.435015 4902 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-qhd5t" podUID="6b7c78b1-850e-44f5-b50e-2e2ed4c0bb9c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 09 13:51:36 crc kubenswrapper[4902]: I1009 13:51:36.478341 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q76zz\" (UID: \"836dca77-634b-42e7-bf76-74b582e0969d\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-q76zz" Oct 09 13:51:36 crc kubenswrapper[4902]: E1009 13:51:36.478721 4902 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 13:51:36.978709014 +0000 UTC m=+44.176568068 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q76zz" (UID: "836dca77-634b-42e7-bf76-74b582e0969d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 13:51:36 crc kubenswrapper[4902]: I1009 13:51:36.520976 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-d6wkv"] Oct 09 13:51:36 crc kubenswrapper[4902]: I1009 13:51:36.581203 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 13:51:36 crc kubenswrapper[4902]: E1009 13:51:36.581686 4902 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 13:51:37.081660581 +0000 UTC m=+44.279519645 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 13:51:36 crc kubenswrapper[4902]: I1009 13:51:36.675635 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-qvwhz" podStartSLOduration=22.675618604 podStartE2EDuration="22.675618604s" podCreationTimestamp="2025-10-09 13:51:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 13:51:36.657810922 +0000 UTC m=+43.855669986" watchObservedRunningTime="2025-10-09 13:51:36.675618604 +0000 UTC m=+43.873477668" Oct 09 13:51:36 crc kubenswrapper[4902]: I1009 13:51:36.685191 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q76zz\" (UID: \"836dca77-634b-42e7-bf76-74b582e0969d\") " pod="openshift-image-registry/image-registry-697d97f7c8-q76zz" Oct 09 13:51:36 crc kubenswrapper[4902]: E1009 13:51:36.685578 4902 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 13:51:37.185561095 +0000 UTC m=+44.383420159 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q76zz" (UID: "836dca77-634b-42e7-bf76-74b582e0969d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 13:51:36 crc kubenswrapper[4902]: I1009 13:51:36.786986 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 13:51:36 crc kubenswrapper[4902]: E1009 13:51:36.787438 4902 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 13:51:37.287363598 +0000 UTC m=+44.485222662 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 13:51:36 crc kubenswrapper[4902]: I1009 13:51:36.787753 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q76zz\" (UID: \"836dca77-634b-42e7-bf76-74b582e0969d\") " pod="openshift-image-registry/image-registry-697d97f7c8-q76zz" Oct 09 13:51:36 crc kubenswrapper[4902]: E1009 13:51:36.788158 4902 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 13:51:37.288141761 +0000 UTC m=+44.486000835 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q76zz" (UID: "836dca77-634b-42e7-bf76-74b582e0969d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 13:51:36 crc kubenswrapper[4902]: I1009 13:51:36.889543 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 13:51:36 crc kubenswrapper[4902]: E1009 13:51:36.889820 4902 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 13:51:37.389786769 +0000 UTC m=+44.587645833 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 13:51:36 crc kubenswrapper[4902]: I1009 13:51:36.991369 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q76zz\" (UID: \"836dca77-634b-42e7-bf76-74b582e0969d\") " pod="openshift-image-registry/image-registry-697d97f7c8-q76zz" Oct 09 13:51:36 crc kubenswrapper[4902]: E1009 13:51:36.991889 4902 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 13:51:37.49187321 +0000 UTC m=+44.689732274 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q76zz" (UID: "836dca77-634b-42e7-bf76-74b582e0969d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 13:51:37 crc kubenswrapper[4902]: I1009 13:51:37.092581 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 13:51:37 crc kubenswrapper[4902]: E1009 13:51:37.092831 4902 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 13:51:37.592803338 +0000 UTC m=+44.790662402 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 13:51:37 crc kubenswrapper[4902]: I1009 13:51:37.092899 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q76zz\" (UID: \"836dca77-634b-42e7-bf76-74b582e0969d\") " pod="openshift-image-registry/image-registry-697d97f7c8-q76zz" Oct 09 13:51:37 crc kubenswrapper[4902]: E1009 13:51:37.093270 4902 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 13:51:37.593256181 +0000 UTC m=+44.791115245 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q76zz" (UID: "836dca77-634b-42e7-bf76-74b582e0969d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 13:51:37 crc kubenswrapper[4902]: I1009 13:51:37.120097 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Oct 09 13:51:37 crc kubenswrapper[4902]: I1009 13:51:37.128031 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"63406deb9e617fe9fb7a284143b9a7bdcdef3747617bde1c7b98294acea7b0ff"} Oct 09 13:51:37 crc kubenswrapper[4902]: I1009 13:51:37.128699 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 09 13:51:37 crc kubenswrapper[4902]: I1009 13:51:37.137274 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-mzkkz" event={"ID":"e714667b-061d-4127-8dd3-47e403ebe079","Type":"ContainerStarted","Data":"fb436ffacb4585983a8a3953fc210f31ab123645dd059b87d9ed6dca0e09f7b8"} Oct 09 13:51:37 crc kubenswrapper[4902]: I1009 13:51:37.137322 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-mzkkz" event={"ID":"e714667b-061d-4127-8dd3-47e403ebe079","Type":"ContainerStarted","Data":"76c57e07b6d8fefffd3a8d123361a4a4cf58c112342c2a27d5d75fb6190bcb1e"} Oct 09 13:51:37 crc kubenswrapper[4902]: I1009 13:51:37.170668 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-snptm" event={"ID":"71406502-8050-4f28-bc5f-b5bbeeafd52f","Type":"ContainerStarted","Data":"7ba118cea89725f65f20b7aabefbf39674826dd4f1e305c0c8d53e0d3717fc06"} Oct 09 13:51:37 crc kubenswrapper[4902]: I1009 13:51:37.170717 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="hostpath-provisioner/csi-hostpathplugin-snptm" event={"ID":"71406502-8050-4f28-bc5f-b5bbeeafd52f","Type":"ContainerStarted","Data":"e5f24b5c3026c389678ab4a35f62888d520e04430b5816add4a304736ae1fedb"} Oct 09 13:51:37 crc kubenswrapper[4902]: I1009 13:51:37.171558 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=23.171539585 podStartE2EDuration="23.171539585s" podCreationTimestamp="2025-10-09 13:51:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 13:51:37.169965719 +0000 UTC m=+44.367824773" watchObservedRunningTime="2025-10-09 13:51:37.171539585 +0000 UTC m=+44.369398649" Oct 09 13:51:37 crc kubenswrapper[4902]: I1009 13:51:37.175238 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-nhhpn" podStartSLOduration=24.175229043 podStartE2EDuration="24.175229043s" podCreationTimestamp="2025-10-09 13:51:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 13:51:36.779721954 +0000 UTC m=+43.977581018" watchObservedRunningTime="2025-10-09 13:51:37.175229043 +0000 UTC m=+44.373088117" Oct 09 13:51:37 crc kubenswrapper[4902]: I1009 13:51:37.176845 4902 patch_prober.go:28] interesting pod/downloads-7954f5f757-bhzpg container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" start-of-body= Oct 09 13:51:37 crc kubenswrapper[4902]: I1009 13:51:37.176892 4902 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-bhzpg" podUID="327b6d28-9130-4476-b8f2-edaf08da45ae" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" Oct 09 13:51:37 crc kubenswrapper[4902]: I1009 13:51:37.177808 4902 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-qvwhz container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.41:8080/healthz\": dial tcp 10.217.0.41:8080: connect: connection refused" start-of-body= Oct 09 13:51:37 crc kubenswrapper[4902]: I1009 13:51:37.177844 4902 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-qvwhz" podUID="a2933475-7af8-41e3-9389-114c1969b030" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.41:8080/healthz\": dial tcp 10.217.0.41:8080: connect: connection refused" Oct 09 13:51:37 crc kubenswrapper[4902]: I1009 13:51:37.197822 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 13:51:37 crc kubenswrapper[4902]: E1009 13:51:37.197961 4902 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-09 13:51:37.697947268 +0000 UTC m=+44.895806332 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 13:51:37 crc kubenswrapper[4902]: I1009 13:51:37.198260 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q76zz\" (UID: \"836dca77-634b-42e7-bf76-74b582e0969d\") " pod="openshift-image-registry/image-registry-697d97f7c8-q76zz" Oct 09 13:51:37 crc kubenswrapper[4902]: E1009 13:51:37.200548 4902 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 13:51:37.700523974 +0000 UTC m=+44.898383038 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q76zz" (UID: "836dca77-634b-42e7-bf76-74b582e0969d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 13:51:37 crc kubenswrapper[4902]: I1009 13:51:37.225281 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-mzkkz" podStartSLOduration=24.225264709 podStartE2EDuration="24.225264709s" podCreationTimestamp="2025-10-09 13:51:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 13:51:37.22462471 +0000 UTC m=+44.422483804" watchObservedRunningTime="2025-10-09 13:51:37.225264709 +0000 UTC m=+44.423123773" Oct 09 13:51:37 crc kubenswrapper[4902]: I1009 13:51:37.246905 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-56j52"] Oct 09 13:51:37 crc kubenswrapper[4902]: I1009 13:51:37.248255 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-56j52" Oct 09 13:51:37 crc kubenswrapper[4902]: I1009 13:51:37.250117 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Oct 09 13:51:37 crc kubenswrapper[4902]: I1009 13:51:37.269522 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-56j52"] Oct 09 13:51:37 crc kubenswrapper[4902]: I1009 13:51:37.299182 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 13:51:37 crc kubenswrapper[4902]: I1009 13:51:37.299697 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwf6v\" (UniqueName: \"kubernetes.io/projected/3038206a-a63d-4cde-9d0e-9549cfb95ad7-kube-api-access-xwf6v\") pod \"certified-operators-56j52\" (UID: \"3038206a-a63d-4cde-9d0e-9549cfb95ad7\") " pod="openshift-marketplace/certified-operators-56j52" Oct 09 13:51:37 crc kubenswrapper[4902]: I1009 13:51:37.299987 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3038206a-a63d-4cde-9d0e-9549cfb95ad7-catalog-content\") pod \"certified-operators-56j52\" (UID: \"3038206a-a63d-4cde-9d0e-9549cfb95ad7\") " pod="openshift-marketplace/certified-operators-56j52" Oct 09 13:51:37 crc kubenswrapper[4902]: E1009 13:51:37.301235 4902 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 13:51:37.801210454 +0000 UTC m=+44.999069518 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 13:51:37 crc kubenswrapper[4902]: I1009 13:51:37.302816 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3038206a-a63d-4cde-9d0e-9549cfb95ad7-utilities\") pod \"certified-operators-56j52\" (UID: \"3038206a-a63d-4cde-9d0e-9549cfb95ad7\") " pod="openshift-marketplace/certified-operators-56j52" Oct 09 13:51:37 crc kubenswrapper[4902]: I1009 13:51:37.410828 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xwf6v\" (UniqueName: \"kubernetes.io/projected/3038206a-a63d-4cde-9d0e-9549cfb95ad7-kube-api-access-xwf6v\") pod \"certified-operators-56j52\" (UID: \"3038206a-a63d-4cde-9d0e-9549cfb95ad7\") " pod="openshift-marketplace/certified-operators-56j52" Oct 09 13:51:37 crc kubenswrapper[4902]: I1009 13:51:37.411483 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3038206a-a63d-4cde-9d0e-9549cfb95ad7-catalog-content\") pod \"certified-operators-56j52\" (UID: \"3038206a-a63d-4cde-9d0e-9549cfb95ad7\") " pod="openshift-marketplace/certified-operators-56j52" Oct 09 13:51:37 crc kubenswrapper[4902]: I1009 13:51:37.411555 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q76zz\" (UID: \"836dca77-634b-42e7-bf76-74b582e0969d\") " pod="openshift-image-registry/image-registry-697d97f7c8-q76zz" Oct 09 13:51:37 crc kubenswrapper[4902]: I1009 13:51:37.412464 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3038206a-a63d-4cde-9d0e-9549cfb95ad7-utilities\") pod \"certified-operators-56j52\" (UID: \"3038206a-a63d-4cde-9d0e-9549cfb95ad7\") " pod="openshift-marketplace/certified-operators-56j52" Oct 09 13:51:37 crc kubenswrapper[4902]: I1009 13:51:37.412786 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3038206a-a63d-4cde-9d0e-9549cfb95ad7-utilities\") pod \"certified-operators-56j52\" (UID: \"3038206a-a63d-4cde-9d0e-9549cfb95ad7\") " pod="openshift-marketplace/certified-operators-56j52" Oct 09 13:51:37 crc kubenswrapper[4902]: I1009 13:51:37.413109 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3038206a-a63d-4cde-9d0e-9549cfb95ad7-catalog-content\") pod \"certified-operators-56j52\" (UID: \"3038206a-a63d-4cde-9d0e-9549cfb95ad7\") " pod="openshift-marketplace/certified-operators-56j52" Oct 09 13:51:37 crc kubenswrapper[4902]: E1009 13:51:37.413308 4902 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-10-09 13:51:37.913289167 +0000 UTC m=+45.111148231 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q76zz" (UID: "836dca77-634b-42e7-bf76-74b582e0969d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 13:51:37 crc kubenswrapper[4902]: I1009 13:51:37.425324 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-wlbhl"] Oct 09 13:51:37 crc kubenswrapper[4902]: I1009 13:51:37.426355 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wlbhl" Oct 09 13:51:37 crc kubenswrapper[4902]: I1009 13:51:37.436487 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Oct 09 13:51:37 crc kubenswrapper[4902]: I1009 13:51:37.437001 4902 patch_prober.go:28] interesting pod/router-default-5444994796-qhd5t container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 09 13:51:37 crc kubenswrapper[4902]: [-]has-synced failed: reason withheld Oct 09 13:51:37 crc kubenswrapper[4902]: [+]process-running ok Oct 09 13:51:37 crc kubenswrapper[4902]: healthz check failed Oct 09 13:51:37 crc kubenswrapper[4902]: I1009 13:51:37.437062 4902 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-qhd5t" podUID="6b7c78b1-850e-44f5-b50e-2e2ed4c0bb9c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 09 13:51:37 crc kubenswrapper[4902]: I1009 13:51:37.446577 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wlbhl"] Oct 09 13:51:37 crc kubenswrapper[4902]: I1009 13:51:37.470143 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xwf6v\" (UniqueName: \"kubernetes.io/projected/3038206a-a63d-4cde-9d0e-9549cfb95ad7-kube-api-access-xwf6v\") pod \"certified-operators-56j52\" (UID: \"3038206a-a63d-4cde-9d0e-9549cfb95ad7\") " pod="openshift-marketplace/certified-operators-56j52" Oct 09 13:51:37 crc kubenswrapper[4902]: I1009 13:51:37.516142 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 13:51:37 crc kubenswrapper[4902]: I1009 13:51:37.516505 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc1a8762-c90b-4ff8-a836-47ca3d3ec932-catalog-content\") pod \"community-operators-wlbhl\" (UID: \"cc1a8762-c90b-4ff8-a836-47ca3d3ec932\") " pod="openshift-marketplace/community-operators-wlbhl" Oct 09 13:51:37 crc kubenswrapper[4902]: I1009 13:51:37.516550 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vbt82\" (UniqueName: 
\"kubernetes.io/projected/cc1a8762-c90b-4ff8-a836-47ca3d3ec932-kube-api-access-vbt82\") pod \"community-operators-wlbhl\" (UID: \"cc1a8762-c90b-4ff8-a836-47ca3d3ec932\") " pod="openshift-marketplace/community-operators-wlbhl" Oct 09 13:51:37 crc kubenswrapper[4902]: I1009 13:51:37.516584 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc1a8762-c90b-4ff8-a836-47ca3d3ec932-utilities\") pod \"community-operators-wlbhl\" (UID: \"cc1a8762-c90b-4ff8-a836-47ca3d3ec932\") " pod="openshift-marketplace/community-operators-wlbhl" Oct 09 13:51:37 crc kubenswrapper[4902]: E1009 13:51:37.516781 4902 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 13:51:38.016751128 +0000 UTC m=+45.214610202 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 13:51:37 crc kubenswrapper[4902]: I1009 13:51:37.570944 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-56j52" Oct 09 13:51:37 crc kubenswrapper[4902]: I1009 13:51:37.618728 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc1a8762-c90b-4ff8-a836-47ca3d3ec932-catalog-content\") pod \"community-operators-wlbhl\" (UID: \"cc1a8762-c90b-4ff8-a836-47ca3d3ec932\") " pod="openshift-marketplace/community-operators-wlbhl" Oct 09 13:51:37 crc kubenswrapper[4902]: I1009 13:51:37.618771 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vbt82\" (UniqueName: \"kubernetes.io/projected/cc1a8762-c90b-4ff8-a836-47ca3d3ec932-kube-api-access-vbt82\") pod \"community-operators-wlbhl\" (UID: \"cc1a8762-c90b-4ff8-a836-47ca3d3ec932\") " pod="openshift-marketplace/community-operators-wlbhl" Oct 09 13:51:37 crc kubenswrapper[4902]: I1009 13:51:37.618798 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc1a8762-c90b-4ff8-a836-47ca3d3ec932-utilities\") pod \"community-operators-wlbhl\" (UID: \"cc1a8762-c90b-4ff8-a836-47ca3d3ec932\") " pod="openshift-marketplace/community-operators-wlbhl" Oct 09 13:51:37 crc kubenswrapper[4902]: I1009 13:51:37.618827 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q76zz\" (UID: \"836dca77-634b-42e7-bf76-74b582e0969d\") " pod="openshift-image-registry/image-registry-697d97f7c8-q76zz" Oct 09 13:51:37 crc kubenswrapper[4902]: E1009 13:51:37.619118 4902 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" 
failed. No retries permitted until 2025-10-09 13:51:38.119106847 +0000 UTC m=+45.316965911 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q76zz" (UID: "836dca77-634b-42e7-bf76-74b582e0969d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 13:51:37 crc kubenswrapper[4902]: I1009 13:51:37.620121 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc1a8762-c90b-4ff8-a836-47ca3d3ec932-catalog-content\") pod \"community-operators-wlbhl\" (UID: \"cc1a8762-c90b-4ff8-a836-47ca3d3ec932\") " pod="openshift-marketplace/community-operators-wlbhl" Oct 09 13:51:37 crc kubenswrapper[4902]: I1009 13:51:37.620806 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc1a8762-c90b-4ff8-a836-47ca3d3ec932-utilities\") pod \"community-operators-wlbhl\" (UID: \"cc1a8762-c90b-4ff8-a836-47ca3d3ec932\") " pod="openshift-marketplace/community-operators-wlbhl" Oct 09 13:51:37 crc kubenswrapper[4902]: I1009 13:51:37.639852 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-p6zbr"] Oct 09 13:51:37 crc kubenswrapper[4902]: I1009 13:51:37.640834 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-p6zbr" Oct 09 13:51:37 crc kubenswrapper[4902]: I1009 13:51:37.692542 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vbt82\" (UniqueName: \"kubernetes.io/projected/cc1a8762-c90b-4ff8-a836-47ca3d3ec932-kube-api-access-vbt82\") pod \"community-operators-wlbhl\" (UID: \"cc1a8762-c90b-4ff8-a836-47ca3d3ec932\") " pod="openshift-marketplace/community-operators-wlbhl" Oct 09 13:51:37 crc kubenswrapper[4902]: I1009 13:51:37.719947 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 13:51:37 crc kubenswrapper[4902]: E1009 13:51:37.720210 4902 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 13:51:38.220169979 +0000 UTC m=+45.418029043 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 13:51:37 crc kubenswrapper[4902]: I1009 13:51:37.720675 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5fmh4\" (UniqueName: \"kubernetes.io/projected/07049d7d-6aec-4446-bca4-51a81e2e6a29-kube-api-access-5fmh4\") pod \"certified-operators-p6zbr\" (UID: \"07049d7d-6aec-4446-bca4-51a81e2e6a29\") " pod="openshift-marketplace/certified-operators-p6zbr" Oct 09 13:51:37 crc kubenswrapper[4902]: I1009 13:51:37.720738 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q76zz\" (UID: \"836dca77-634b-42e7-bf76-74b582e0969d\") " pod="openshift-image-registry/image-registry-697d97f7c8-q76zz" Oct 09 13:51:37 crc kubenswrapper[4902]: I1009 13:51:37.720768 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/07049d7d-6aec-4446-bca4-51a81e2e6a29-catalog-content\") pod \"certified-operators-p6zbr\" (UID: \"07049d7d-6aec-4446-bca4-51a81e2e6a29\") " pod="openshift-marketplace/certified-operators-p6zbr" Oct 09 13:51:37 crc kubenswrapper[4902]: I1009 13:51:37.720794 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/07049d7d-6aec-4446-bca4-51a81e2e6a29-utilities\") pod \"certified-operators-p6zbr\" (UID: \"07049d7d-6aec-4446-bca4-51a81e2e6a29\") " pod="openshift-marketplace/certified-operators-p6zbr" Oct 09 13:51:37 crc kubenswrapper[4902]: E1009 13:51:37.721221 4902 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 13:51:38.221196299 +0000 UTC m=+45.419055363 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q76zz" (UID: "836dca77-634b-42e7-bf76-74b582e0969d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 13:51:37 crc kubenswrapper[4902]: I1009 13:51:37.727852 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-p6zbr"] Oct 09 13:51:37 crc kubenswrapper[4902]: I1009 13:51:37.778277 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wlbhl" Oct 09 13:51:37 crc kubenswrapper[4902]: I1009 13:51:37.820120 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-jqjng"] Oct 09 13:51:37 crc kubenswrapper[4902]: I1009 13:51:37.821383 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jqjng" Oct 09 13:51:37 crc kubenswrapper[4902]: I1009 13:51:37.821936 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 13:51:37 crc kubenswrapper[4902]: E1009 13:51:37.822178 4902 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 13:51:38.322128776 +0000 UTC m=+45.519987850 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 13:51:37 crc kubenswrapper[4902]: I1009 13:51:37.822394 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5fmh4\" (UniqueName: \"kubernetes.io/projected/07049d7d-6aec-4446-bca4-51a81e2e6a29-kube-api-access-5fmh4\") pod \"certified-operators-p6zbr\" (UID: \"07049d7d-6aec-4446-bca4-51a81e2e6a29\") " pod="openshift-marketplace/certified-operators-p6zbr" Oct 09 13:51:37 crc kubenswrapper[4902]: I1009 13:51:37.822688 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q76zz\" (UID: \"836dca77-634b-42e7-bf76-74b582e0969d\") " pod="openshift-image-registry/image-registry-697d97f7c8-q76zz" Oct 09 13:51:37 crc kubenswrapper[4902]: I1009 13:51:37.822732 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/07049d7d-6aec-4446-bca4-51a81e2e6a29-catalog-content\") pod \"certified-operators-p6zbr\" (UID: \"07049d7d-6aec-4446-bca4-51a81e2e6a29\") " pod="openshift-marketplace/certified-operators-p6zbr" Oct 09 13:51:37 crc kubenswrapper[4902]: I1009 13:51:37.822781 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/07049d7d-6aec-4446-bca4-51a81e2e6a29-utilities\") pod \"certified-operators-p6zbr\" (UID: \"07049d7d-6aec-4446-bca4-51a81e2e6a29\") " pod="openshift-marketplace/certified-operators-p6zbr" Oct 09 13:51:37 crc kubenswrapper[4902]: I1009 13:51:37.823563 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/07049d7d-6aec-4446-bca4-51a81e2e6a29-utilities\") pod \"certified-operators-p6zbr\" (UID: \"07049d7d-6aec-4446-bca4-51a81e2e6a29\") " pod="openshift-marketplace/certified-operators-p6zbr" Oct 09 13:51:37 crc kubenswrapper[4902]: E1009 13:51:37.823569 4902 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 13:51:38.323549548 +0000 UTC m=+45.521408612 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q76zz" (UID: "836dca77-634b-42e7-bf76-74b582e0969d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 13:51:37 crc kubenswrapper[4902]: I1009 13:51:37.823903 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/07049d7d-6aec-4446-bca4-51a81e2e6a29-catalog-content\") pod \"certified-operators-p6zbr\" (UID: \"07049d7d-6aec-4446-bca4-51a81e2e6a29\") " pod="openshift-marketplace/certified-operators-p6zbr" Oct 09 13:51:37 crc kubenswrapper[4902]: I1009 13:51:37.852809 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5fmh4\" (UniqueName: \"kubernetes.io/projected/07049d7d-6aec-4446-bca4-51a81e2e6a29-kube-api-access-5fmh4\") pod \"certified-operators-p6zbr\" (UID: \"07049d7d-6aec-4446-bca4-51a81e2e6a29\") " pod="openshift-marketplace/certified-operators-p6zbr" Oct 09 13:51:37 crc kubenswrapper[4902]: I1009 13:51:37.878364 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-nhhpn" Oct 09 13:51:37 crc kubenswrapper[4902]: I1009 13:51:37.923791 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 13:51:37 crc kubenswrapper[4902]: I1009 13:51:37.924157 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c5f26005-88e7-4f48-917e-5b2696925564-utilities\") pod \"community-operators-jqjng\" (UID: \"c5f26005-88e7-4f48-917e-5b2696925564\") " pod="openshift-marketplace/community-operators-jqjng" Oct 09 13:51:37 crc kubenswrapper[4902]: I1009 13:51:37.924199 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tlq62\" (UniqueName: \"kubernetes.io/projected/c5f26005-88e7-4f48-917e-5b2696925564-kube-api-access-tlq62\") pod \"community-operators-jqjng\" (UID: \"c5f26005-88e7-4f48-917e-5b2696925564\") " pod="openshift-marketplace/community-operators-jqjng" Oct 09 13:51:37 crc kubenswrapper[4902]: I1009 13:51:37.924259 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c5f26005-88e7-4f48-917e-5b2696925564-catalog-content\") pod 
\"community-operators-jqjng\" (UID: \"c5f26005-88e7-4f48-917e-5b2696925564\") " pod="openshift-marketplace/community-operators-jqjng" Oct 09 13:51:37 crc kubenswrapper[4902]: E1009 13:51:37.924453 4902 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 13:51:38.424404953 +0000 UTC m=+45.622264037 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 13:51:37 crc kubenswrapper[4902]: I1009 13:51:37.964070 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-jqjng"] Oct 09 13:51:37 crc kubenswrapper[4902]: I1009 13:51:37.985580 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-p6zbr" Oct 09 13:51:38 crc kubenswrapper[4902]: I1009 13:51:38.026444 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c5f26005-88e7-4f48-917e-5b2696925564-utilities\") pod \"community-operators-jqjng\" (UID: \"c5f26005-88e7-4f48-917e-5b2696925564\") " pod="openshift-marketplace/community-operators-jqjng" Oct 09 13:51:38 crc kubenswrapper[4902]: I1009 13:51:38.026525 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tlq62\" (UniqueName: \"kubernetes.io/projected/c5f26005-88e7-4f48-917e-5b2696925564-kube-api-access-tlq62\") pod \"community-operators-jqjng\" (UID: \"c5f26005-88e7-4f48-917e-5b2696925564\") " pod="openshift-marketplace/community-operators-jqjng" Oct 09 13:51:38 crc kubenswrapper[4902]: I1009 13:51:38.026575 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q76zz\" (UID: \"836dca77-634b-42e7-bf76-74b582e0969d\") " pod="openshift-image-registry/image-registry-697d97f7c8-q76zz" Oct 09 13:51:38 crc kubenswrapper[4902]: I1009 13:51:38.026624 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c5f26005-88e7-4f48-917e-5b2696925564-catalog-content\") pod \"community-operators-jqjng\" (UID: \"c5f26005-88e7-4f48-917e-5b2696925564\") " pod="openshift-marketplace/community-operators-jqjng" Oct 09 13:51:38 crc kubenswrapper[4902]: I1009 13:51:38.027198 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c5f26005-88e7-4f48-917e-5b2696925564-catalog-content\") pod \"community-operators-jqjng\" (UID: \"c5f26005-88e7-4f48-917e-5b2696925564\") " pod="openshift-marketplace/community-operators-jqjng" Oct 09 13:51:38 crc kubenswrapper[4902]: E1009 13:51:38.027590 4902 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 13:51:38.527574296 +0000 UTC m=+45.725433360 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q76zz" (UID: "836dca77-634b-42e7-bf76-74b582e0969d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 13:51:38 crc kubenswrapper[4902]: I1009 13:51:38.027663 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c5f26005-88e7-4f48-917e-5b2696925564-utilities\") pod \"community-operators-jqjng\" (UID: \"c5f26005-88e7-4f48-917e-5b2696925564\") " pod="openshift-marketplace/community-operators-jqjng" Oct 09 13:51:38 crc kubenswrapper[4902]: I1009 13:51:38.064514 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tlq62\" (UniqueName: \"kubernetes.io/projected/c5f26005-88e7-4f48-917e-5b2696925564-kube-api-access-tlq62\") pod \"community-operators-jqjng\" (UID: \"c5f26005-88e7-4f48-917e-5b2696925564\") " pod="openshift-marketplace/community-operators-jqjng" Oct 09 13:51:38 crc kubenswrapper[4902]: I1009 13:51:38.128013 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 13:51:38 crc kubenswrapper[4902]: E1009 13:51:38.128273 4902 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 13:51:38.628231845 +0000 UTC m=+45.826090909 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 13:51:38 crc kubenswrapper[4902]: I1009 13:51:38.128620 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q76zz\" (UID: \"836dca77-634b-42e7-bf76-74b582e0969d\") " pod="openshift-image-registry/image-registry-697d97f7c8-q76zz" Oct 09 13:51:38 crc kubenswrapper[4902]: E1009 13:51:38.129228 4902 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 13:51:38.629219234 +0000 UTC m=+45.827078298 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q76zz" (UID: "836dca77-634b-42e7-bf76-74b582e0969d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 13:51:38 crc kubenswrapper[4902]: I1009 13:51:38.140997 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jqjng" Oct 09 13:51:38 crc kubenswrapper[4902]: I1009 13:51:38.236217 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 13:51:38 crc kubenswrapper[4902]: E1009 13:51:38.236387 4902 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 13:51:38.736355843 +0000 UTC m=+45.934214907 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 13:51:38 crc kubenswrapper[4902]: I1009 13:51:38.236516 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q76zz\" (UID: \"836dca77-634b-42e7-bf76-74b582e0969d\") " pod="openshift-image-registry/image-registry-697d97f7c8-q76zz" Oct 09 13:51:38 crc kubenswrapper[4902]: E1009 13:51:38.237093 4902 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 13:51:38.737069854 +0000 UTC m=+45.934928918 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q76zz" (UID: "836dca77-634b-42e7-bf76-74b582e0969d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 13:51:38 crc kubenswrapper[4902]: I1009 13:51:38.247232 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-snptm" event={"ID":"71406502-8050-4f28-bc5f-b5bbeeafd52f","Type":"ContainerStarted","Data":"0eaf692bab6bfe9f3361d31f13ba179757bce5af2394614774d7db0da4f60e0d"} Oct 09 13:51:38 crc kubenswrapper[4902]: I1009 13:51:38.247780 4902 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-multus/cni-sysctl-allowlist-ds-d6wkv" podUID="93785db6-c7f9-4d9e-9407-b3653a9aa360" containerName="kube-multus-additional-cni-plugins" containerID="cri-o://e9815120efe1d478b3ec0bd85d0bd2203ec391a217e75129a79aaa7221dbca07" gracePeriod=30 Oct 09 13:51:38 crc kubenswrapper[4902]: I1009 13:51:38.251056 4902 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-qvwhz container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.41:8080/healthz\": dial tcp 10.217.0.41:8080: connect: connection refused" start-of-body= Oct 09 13:51:38 crc kubenswrapper[4902]: I1009 13:51:38.251090 4902 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-qvwhz" podUID="a2933475-7af8-41e3-9389-114c1969b030" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.41:8080/healthz\": dial tcp 10.217.0.41:8080: connect: connection refused" Oct 09 13:51:38 crc kubenswrapper[4902]: I1009 13:51:38.340175 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 13:51:38 crc kubenswrapper[4902]: E1009 13:51:38.340482 4902 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 13:51:38.840437913 +0000 UTC m=+46.038296977 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 13:51:38 crc kubenswrapper[4902]: I1009 13:51:38.341151 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q76zz\" (UID: \"836dca77-634b-42e7-bf76-74b582e0969d\") " pod="openshift-image-registry/image-registry-697d97f7c8-q76zz" Oct 09 13:51:38 crc kubenswrapper[4902]: E1009 13:51:38.345109 4902 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 13:51:38.845090909 +0000 UTC m=+46.042949973 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q76zz" (UID: "836dca77-634b-42e7-bf76-74b582e0969d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 13:51:38 crc kubenswrapper[4902]: I1009 13:51:38.380674 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-56j52"] Oct 09 13:51:38 crc kubenswrapper[4902]: I1009 13:51:38.436560 4902 patch_prober.go:28] interesting pod/router-default-5444994796-qhd5t container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 09 13:51:38 crc kubenswrapper[4902]: [-]has-synced failed: reason withheld Oct 09 13:51:38 crc kubenswrapper[4902]: [+]process-running ok Oct 09 13:51:38 crc kubenswrapper[4902]: healthz check failed Oct 09 13:51:38 crc kubenswrapper[4902]: I1009 13:51:38.436638 4902 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-qhd5t" podUID="6b7c78b1-850e-44f5-b50e-2e2ed4c0bb9c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 09 13:51:38 crc kubenswrapper[4902]: I1009 13:51:38.442237 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 13:51:38 crc kubenswrapper[4902]: E1009 13:51:38.442683 4902 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 13:51:38.942665708 +0000 UTC m=+46.140524772 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 13:51:38 crc kubenswrapper[4902]: I1009 13:51:38.479945 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Oct 09 13:51:38 crc kubenswrapper[4902]: I1009 13:51:38.480759 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 09 13:51:38 crc kubenswrapper[4902]: I1009 13:51:38.517007 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Oct 09 13:51:38 crc kubenswrapper[4902]: I1009 13:51:38.517002 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Oct 09 13:51:38 crc kubenswrapper[4902]: I1009 13:51:38.517206 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Oct 09 13:51:38 crc kubenswrapper[4902]: I1009 13:51:38.545347 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a4f280b3-1679-425a-9ce3-c11469353c6a-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"a4f280b3-1679-425a-9ce3-c11469353c6a\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 09 13:51:38 crc kubenswrapper[4902]: I1009 13:51:38.545989 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a4f280b3-1679-425a-9ce3-c11469353c6a-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"a4f280b3-1679-425a-9ce3-c11469353c6a\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 09 13:51:38 crc kubenswrapper[4902]: I1009 13:51:38.546069 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q76zz\" (UID: \"836dca77-634b-42e7-bf76-74b582e0969d\") " pod="openshift-image-registry/image-registry-697d97f7c8-q76zz" Oct 09 13:51:38 crc kubenswrapper[4902]: E1009 13:51:38.547454 4902 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 13:51:39.047413147 +0000 UTC m=+46.245272211 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q76zz" (UID: "836dca77-634b-42e7-bf76-74b582e0969d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 13:51:38 crc kubenswrapper[4902]: I1009 13:51:38.647407 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 13:51:38 crc kubenswrapper[4902]: I1009 13:51:38.647861 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a4f280b3-1679-425a-9ce3-c11469353c6a-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"a4f280b3-1679-425a-9ce3-c11469353c6a\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 09 13:51:38 crc kubenswrapper[4902]: I1009 13:51:38.647932 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a4f280b3-1679-425a-9ce3-c11469353c6a-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"a4f280b3-1679-425a-9ce3-c11469353c6a\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 09 13:51:38 crc kubenswrapper[4902]: I1009 13:51:38.647998 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a4f280b3-1679-425a-9ce3-c11469353c6a-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"a4f280b3-1679-425a-9ce3-c11469353c6a\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 09 13:51:38 crc kubenswrapper[4902]: E1009 13:51:38.648072 4902 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 13:51:39.148055336 +0000 UTC m=+46.345914400 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 13:51:38 crc kubenswrapper[4902]: I1009 13:51:38.697015 4902 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Oct 09 13:51:38 crc kubenswrapper[4902]: I1009 13:51:38.709892 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a4f280b3-1679-425a-9ce3-c11469353c6a-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"a4f280b3-1679-425a-9ce3-c11469353c6a\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 09 13:51:38 crc kubenswrapper[4902]: I1009 13:51:38.742103 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wlbhl"] Oct 09 13:51:38 crc kubenswrapper[4902]: I1009 13:51:38.750281 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q76zz\" (UID: \"836dca77-634b-42e7-bf76-74b582e0969d\") " pod="openshift-image-registry/image-registry-697d97f7c8-q76zz" Oct 09 13:51:38 crc kubenswrapper[4902]: E1009 13:51:38.750723 4902 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 13:51:39.250706573 +0000 UTC m=+46.448565637 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q76zz" (UID: "836dca77-634b-42e7-bf76-74b582e0969d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 13:51:38 crc kubenswrapper[4902]: W1009 13:51:38.775915 4902 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcc1a8762_c90b_4ff8_a836_47ca3d3ec932.slice/crio-06958e871a66a24029bb6b806ac6af5c359c6052d0f06adad29dfafeca3931f3 WatchSource:0}: Error finding container 06958e871a66a24029bb6b806ac6af5c359c6052d0f06adad29dfafeca3931f3: Status 404 returned error can't find the container with id 06958e871a66a24029bb6b806ac6af5c359c6052d0f06adad29dfafeca3931f3 Oct 09 13:51:38 crc kubenswrapper[4902]: I1009 13:51:38.779266 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-jqjng"] Oct 09 13:51:38 crc kubenswrapper[4902]: W1009 13:51:38.784545 4902 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc5f26005_88e7_4f48_917e_5b2696925564.slice/crio-de7b2073840f35f700c074f6ebd40cee4dcd00238342975ab752869523b8c6eb WatchSource:0}: Error finding container de7b2073840f35f700c074f6ebd40cee4dcd00238342975ab752869523b8c6eb: Status 404 returned error can't find the container with id de7b2073840f35f700c074f6ebd40cee4dcd00238342975ab752869523b8c6eb Oct 09 13:51:38 crc kubenswrapper[4902]: I1009 13:51:38.848753 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 09 13:51:38 crc kubenswrapper[4902]: I1009 13:51:38.850958 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 13:51:38 crc kubenswrapper[4902]: E1009 13:51:38.851432 4902 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 13:51:39.351393064 +0000 UTC m=+46.549252128 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 13:51:38 crc kubenswrapper[4902]: I1009 13:51:38.869057 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-p6zbr"] Oct 09 13:51:38 crc kubenswrapper[4902]: I1009 13:51:38.952885 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q76zz\" (UID: \"836dca77-634b-42e7-bf76-74b582e0969d\") " pod="openshift-image-registry/image-registry-697d97f7c8-q76zz" Oct 09 13:51:38 crc kubenswrapper[4902]: E1009 13:51:38.953488 4902 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 13:51:39.453465975 +0000 UTC m=+46.651325039 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q76zz" (UID: "836dca77-634b-42e7-bf76-74b582e0969d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 13:51:39 crc kubenswrapper[4902]: I1009 13:51:39.055158 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 13:51:39 crc kubenswrapper[4902]: E1009 13:51:39.055305 4902 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 13:51:39.555275568 +0000 UTC m=+46.753134632 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 13:51:39 crc kubenswrapper[4902]: I1009 13:51:39.055615 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q76zz\" (UID: \"836dca77-634b-42e7-bf76-74b582e0969d\") " pod="openshift-image-registry/image-registry-697d97f7c8-q76zz" Oct 09 13:51:39 crc kubenswrapper[4902]: E1009 13:51:39.055956 4902 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 13:51:39.555941287 +0000 UTC m=+46.753800351 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q76zz" (UID: "836dca77-634b-42e7-bf76-74b582e0969d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 13:51:39 crc kubenswrapper[4902]: I1009 13:51:39.156975 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 13:51:39 crc kubenswrapper[4902]: E1009 13:51:39.157451 4902 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 13:51:39.657435901 +0000 UTC m=+46.855294965 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 13:51:39 crc kubenswrapper[4902]: I1009 13:51:39.165861 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Oct 09 13:51:39 crc kubenswrapper[4902]: W1009 13:51:39.223089 4902 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-poda4f280b3_1679_425a_9ce3_c11469353c6a.slice/crio-c3d5db4ceece10ed20df9cdcdfce6a83bd1c5d173dda63829cc35c77ffe6db1b WatchSource:0}: Error finding container c3d5db4ceece10ed20df9cdcdfce6a83bd1c5d173dda63829cc35c77ffe6db1b: Status 404 returned error can't find the container with id c3d5db4ceece10ed20df9cdcdfce6a83bd1c5d173dda63829cc35c77ffe6db1b Oct 09 13:51:39 crc kubenswrapper[4902]: I1009 13:51:39.234818 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-p6dtr"] Oct 09 13:51:39 crc kubenswrapper[4902]: I1009 13:51:39.236153 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-p6dtr" Oct 09 13:51:39 crc kubenswrapper[4902]: I1009 13:51:39.238313 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Oct 09 13:51:39 crc kubenswrapper[4902]: I1009 13:51:39.249180 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-p6dtr"] Oct 09 13:51:39 crc kubenswrapper[4902]: I1009 13:51:39.253867 4902 generic.go:334] "Generic (PLEG): container finished" podID="07049d7d-6aec-4446-bca4-51a81e2e6a29" containerID="c2e07fe31bbc2d7abb8ac0a116aa63be815ee140a8b7962641bf69a6fbf70480" exitCode=0 Oct 09 13:51:39 crc kubenswrapper[4902]: I1009 13:51:39.254008 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p6zbr" event={"ID":"07049d7d-6aec-4446-bca4-51a81e2e6a29","Type":"ContainerDied","Data":"c2e07fe31bbc2d7abb8ac0a116aa63be815ee140a8b7962641bf69a6fbf70480"} Oct 09 13:51:39 crc kubenswrapper[4902]: I1009 13:51:39.254305 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p6zbr" event={"ID":"07049d7d-6aec-4446-bca4-51a81e2e6a29","Type":"ContainerStarted","Data":"0724c820c870ce65d0bba5984c5ece2bda35c25e407cfe81d32b988e0180ae97"} Oct 09 13:51:39 crc kubenswrapper[4902]: I1009 13:51:39.255677 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"a4f280b3-1679-425a-9ce3-c11469353c6a","Type":"ContainerStarted","Data":"c3d5db4ceece10ed20df9cdcdfce6a83bd1c5d173dda63829cc35c77ffe6db1b"} Oct 09 13:51:39 crc kubenswrapper[4902]: I1009 13:51:39.260739 4902 generic.go:334] "Generic (PLEG): container finished" podID="c5f26005-88e7-4f48-917e-5b2696925564" containerID="4a53fd2633d3e59b7a645fc802ba7bbcb73ed35591d475b0c17f34bf725b9ad8" exitCode=0 Oct 09 13:51:39 crc kubenswrapper[4902]: I1009 13:51:39.260819 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jqjng" 
event={"ID":"c5f26005-88e7-4f48-917e-5b2696925564","Type":"ContainerDied","Data":"4a53fd2633d3e59b7a645fc802ba7bbcb73ed35591d475b0c17f34bf725b9ad8"} Oct 09 13:51:39 crc kubenswrapper[4902]: I1009 13:51:39.260845 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jqjng" event={"ID":"c5f26005-88e7-4f48-917e-5b2696925564","Type":"ContainerStarted","Data":"de7b2073840f35f700c074f6ebd40cee4dcd00238342975ab752869523b8c6eb"} Oct 09 13:51:39 crc kubenswrapper[4902]: I1009 13:51:39.260856 4902 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 09 13:51:39 crc kubenswrapper[4902]: I1009 13:51:39.262223 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a1edf05-690e-463e-8086-e4ba20653475-catalog-content\") pod \"redhat-marketplace-p6dtr\" (UID: \"2a1edf05-690e-463e-8086-e4ba20653475\") " pod="openshift-marketplace/redhat-marketplace-p6dtr" Oct 09 13:51:39 crc kubenswrapper[4902]: I1009 13:51:39.262288 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a1edf05-690e-463e-8086-e4ba20653475-utilities\") pod \"redhat-marketplace-p6dtr\" (UID: \"2a1edf05-690e-463e-8086-e4ba20653475\") " pod="openshift-marketplace/redhat-marketplace-p6dtr" Oct 09 13:51:39 crc kubenswrapper[4902]: I1009 13:51:39.262314 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2wrdt\" (UniqueName: \"kubernetes.io/projected/2a1edf05-690e-463e-8086-e4ba20653475-kube-api-access-2wrdt\") pod \"redhat-marketplace-p6dtr\" (UID: \"2a1edf05-690e-463e-8086-e4ba20653475\") " pod="openshift-marketplace/redhat-marketplace-p6dtr" Oct 09 13:51:39 crc kubenswrapper[4902]: I1009 13:51:39.262397 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q76zz\" (UID: \"836dca77-634b-42e7-bf76-74b582e0969d\") " pod="openshift-image-registry/image-registry-697d97f7c8-q76zz" Oct 09 13:51:39 crc kubenswrapper[4902]: E1009 13:51:39.262728 4902 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 13:51:39.762712235 +0000 UTC m=+46.960571319 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q76zz" (UID: "836dca77-634b-42e7-bf76-74b582e0969d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 13:51:39 crc kubenswrapper[4902]: I1009 13:51:39.274993 4902 generic.go:334] "Generic (PLEG): container finished" podID="cc1a8762-c90b-4ff8-a836-47ca3d3ec932" containerID="9ce9706c136a6bf757e56cb351962ccd980cf39964dba51108431712d06dce42" exitCode=0 Oct 09 13:51:39 crc kubenswrapper[4902]: I1009 13:51:39.275135 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wlbhl" event={"ID":"cc1a8762-c90b-4ff8-a836-47ca3d3ec932","Type":"ContainerDied","Data":"9ce9706c136a6bf757e56cb351962ccd980cf39964dba51108431712d06dce42"} Oct 09 13:51:39 crc kubenswrapper[4902]: I1009 13:51:39.275178 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wlbhl" event={"ID":"cc1a8762-c90b-4ff8-a836-47ca3d3ec932","Type":"ContainerStarted","Data":"06958e871a66a24029bb6b806ac6af5c359c6052d0f06adad29dfafeca3931f3"} Oct 09 13:51:39 crc kubenswrapper[4902]: I1009 13:51:39.295797 4902 generic.go:334] "Generic (PLEG): container finished" podID="3038206a-a63d-4cde-9d0e-9549cfb95ad7" containerID="fbaccf1b19c90e0dc532b1b7c3a552e732f86f1e8bc7f806517d9d4103c816b1" exitCode=0 Oct 09 13:51:39 crc kubenswrapper[4902]: I1009 13:51:39.296734 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-56j52" event={"ID":"3038206a-a63d-4cde-9d0e-9549cfb95ad7","Type":"ContainerDied","Data":"fbaccf1b19c90e0dc532b1b7c3a552e732f86f1e8bc7f806517d9d4103c816b1"} Oct 09 13:51:39 crc kubenswrapper[4902]: I1009 13:51:39.296770 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-56j52" event={"ID":"3038206a-a63d-4cde-9d0e-9549cfb95ad7","Type":"ContainerStarted","Data":"182b69bccfdd00eeda9c4d2f64dfc1c87cbe8d1a649942ef1ecdab1e15ab88b4"} Oct 09 13:51:39 crc kubenswrapper[4902]: I1009 13:51:39.301638 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-snptm" event={"ID":"71406502-8050-4f28-bc5f-b5bbeeafd52f","Type":"ContainerStarted","Data":"00f42dc4a9f4e2bb3da74dfe1eed7db761bcdf1a9c42bcd6165a460f13099d62"} Oct 09 13:51:39 crc kubenswrapper[4902]: I1009 13:51:39.363226 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 13:51:39 crc kubenswrapper[4902]: E1009 13:51:39.363524 4902 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 13:51:39.863452457 +0000 UTC m=+47.061311521 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 13:51:39 crc kubenswrapper[4902]: I1009 13:51:39.363779 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a1edf05-690e-463e-8086-e4ba20653475-catalog-content\") pod \"redhat-marketplace-p6dtr\" (UID: \"2a1edf05-690e-463e-8086-e4ba20653475\") " pod="openshift-marketplace/redhat-marketplace-p6dtr" Oct 09 13:51:39 crc kubenswrapper[4902]: I1009 13:51:39.363939 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a1edf05-690e-463e-8086-e4ba20653475-utilities\") pod \"redhat-marketplace-p6dtr\" (UID: \"2a1edf05-690e-463e-8086-e4ba20653475\") " pod="openshift-marketplace/redhat-marketplace-p6dtr" Oct 09 13:51:39 crc kubenswrapper[4902]: I1009 13:51:39.363973 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2wrdt\" (UniqueName: \"kubernetes.io/projected/2a1edf05-690e-463e-8086-e4ba20653475-kube-api-access-2wrdt\") pod \"redhat-marketplace-p6dtr\" (UID: \"2a1edf05-690e-463e-8086-e4ba20653475\") " pod="openshift-marketplace/redhat-marketplace-p6dtr" Oct 09 13:51:39 crc kubenswrapper[4902]: I1009 13:51:39.364182 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q76zz\" (UID: \"836dca77-634b-42e7-bf76-74b582e0969d\") " pod="openshift-image-registry/image-registry-697d97f7c8-q76zz" Oct 09 13:51:39 crc kubenswrapper[4902]: I1009 13:51:39.364822 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a1edf05-690e-463e-8086-e4ba20653475-catalog-content\") pod \"redhat-marketplace-p6dtr\" (UID: \"2a1edf05-690e-463e-8086-e4ba20653475\") " pod="openshift-marketplace/redhat-marketplace-p6dtr" Oct 09 13:51:39 crc kubenswrapper[4902]: I1009 13:51:39.364904 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a1edf05-690e-463e-8086-e4ba20653475-utilities\") pod \"redhat-marketplace-p6dtr\" (UID: \"2a1edf05-690e-463e-8086-e4ba20653475\") " pod="openshift-marketplace/redhat-marketplace-p6dtr" Oct 09 13:51:39 crc kubenswrapper[4902]: E1009 13:51:39.365959 4902 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 13:51:39.86594715 +0000 UTC m=+47.063806204 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q76zz" (UID: "836dca77-634b-42e7-bf76-74b582e0969d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 13:51:39 crc kubenswrapper[4902]: I1009 13:51:39.407554 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-snptm" podStartSLOduration=12.407511458 podStartE2EDuration="12.407511458s" podCreationTimestamp="2025-10-09 13:51:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 13:51:39.401814201 +0000 UTC m=+46.599673265" watchObservedRunningTime="2025-10-09 13:51:39.407511458 +0000 UTC m=+46.605370522" Oct 09 13:51:39 crc kubenswrapper[4902]: I1009 13:51:39.418938 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2wrdt\" (UniqueName: \"kubernetes.io/projected/2a1edf05-690e-463e-8086-e4ba20653475-kube-api-access-2wrdt\") pod \"redhat-marketplace-p6dtr\" (UID: \"2a1edf05-690e-463e-8086-e4ba20653475\") " pod="openshift-marketplace/redhat-marketplace-p6dtr" Oct 09 13:51:39 crc kubenswrapper[4902]: I1009 13:51:39.433722 4902 patch_prober.go:28] interesting pod/router-default-5444994796-qhd5t container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 09 13:51:39 crc kubenswrapper[4902]: [-]has-synced failed: reason withheld Oct 09 13:51:39 crc kubenswrapper[4902]: [+]process-running ok Oct 09 13:51:39 crc kubenswrapper[4902]: healthz check failed Oct 09 13:51:39 crc kubenswrapper[4902]: I1009 13:51:39.433799 4902 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-qhd5t" podUID="6b7c78b1-850e-44f5-b50e-2e2ed4c0bb9c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 09 13:51:39 crc kubenswrapper[4902]: I1009 13:51:39.466845 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 13:51:39 crc kubenswrapper[4902]: E1009 13:51:39.467459 4902 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 13:51:39.967436044 +0000 UTC m=+47.165295108 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 13:51:39 crc kubenswrapper[4902]: I1009 13:51:39.561921 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-p6dtr" Oct 09 13:51:39 crc kubenswrapper[4902]: I1009 13:51:39.568785 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q76zz\" (UID: \"836dca77-634b-42e7-bf76-74b582e0969d\") " pod="openshift-image-registry/image-registry-697d97f7c8-q76zz" Oct 09 13:51:39 crc kubenswrapper[4902]: E1009 13:51:39.569174 4902 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 13:51:40.069158624 +0000 UTC m=+47.267017688 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q76zz" (UID: "836dca77-634b-42e7-bf76-74b582e0969d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 13:51:39 crc kubenswrapper[4902]: I1009 13:51:39.614195 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-p72x4"] Oct 09 13:51:39 crc kubenswrapper[4902]: I1009 13:51:39.615116 4902 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2025-10-09T13:51:38.697044551Z","Handler":null,"Name":""} Oct 09 13:51:39 crc kubenswrapper[4902]: I1009 13:51:39.615513 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-p72x4" Oct 09 13:51:39 crc kubenswrapper[4902]: I1009 13:51:39.641663 4902 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Oct 09 13:51:39 crc kubenswrapper[4902]: I1009 13:51:39.641729 4902 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Oct 09 13:51:39 crc kubenswrapper[4902]: I1009 13:51:39.671671 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 13:51:39 crc kubenswrapper[4902]: I1009 13:51:39.672533 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/583d6403-9f97-4d51-9a43-f0d2fedf80f2-catalog-content\") pod \"redhat-marketplace-p72x4\" (UID: \"583d6403-9f97-4d51-9a43-f0d2fedf80f2\") " pod="openshift-marketplace/redhat-marketplace-p72x4" Oct 09 13:51:39 crc kubenswrapper[4902]: I1009 13:51:39.672644 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xrmm2\" (UniqueName: \"kubernetes.io/projected/583d6403-9f97-4d51-9a43-f0d2fedf80f2-kube-api-access-xrmm2\") pod \"redhat-marketplace-p72x4\" (UID: \"583d6403-9f97-4d51-9a43-f0d2fedf80f2\") " pod="openshift-marketplace/redhat-marketplace-p72x4" Oct 09 13:51:39 crc kubenswrapper[4902]: I1009 13:51:39.678746 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/583d6403-9f97-4d51-9a43-f0d2fedf80f2-utilities\") pod \"redhat-marketplace-p72x4\" (UID: \"583d6403-9f97-4d51-9a43-f0d2fedf80f2\") " pod="openshift-marketplace/redhat-marketplace-p72x4" Oct 09 13:51:39 crc kubenswrapper[4902]: I1009 13:51:39.679649 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-p72x4"] Oct 09 13:51:39 crc kubenswrapper[4902]: I1009 13:51:39.687399 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Oct 09 13:51:39 crc kubenswrapper[4902]: I1009 13:51:39.757355 4902 patch_prober.go:28] interesting pod/downloads-7954f5f757-bhzpg container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" start-of-body= Oct 09 13:51:39 crc kubenswrapper[4902]: I1009 13:51:39.757635 4902 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-bhzpg" podUID="327b6d28-9130-4476-b8f2-edaf08da45ae" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" Oct 09 13:51:39 crc kubenswrapper[4902]: I1009 13:51:39.757412 4902 patch_prober.go:28] interesting pod/downloads-7954f5f757-bhzpg container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" start-of-body= Oct 09 13:51:39 crc kubenswrapper[4902]: I1009 13:51:39.757850 4902 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-bhzpg" podUID="327b6d28-9130-4476-b8f2-edaf08da45ae" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" Oct 09 13:51:39 crc kubenswrapper[4902]: I1009 13:51:39.761024 4902 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4cxxv" Oct 09 13:51:39 crc kubenswrapper[4902]: I1009 13:51:39.765231 4902 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-d5zks" Oct 09 13:51:39 crc kubenswrapper[4902]: I1009 13:51:39.765264 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-d5zks" Oct 09 13:51:39 crc kubenswrapper[4902]: I1009 13:51:39.773901 4902 patch_prober.go:28] interesting pod/console-f9d7485db-d5zks container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.11:8443/health\": dial tcp 10.217.0.11:8443: connect: connection refused" start-of-body= Oct 09 13:51:39 crc kubenswrapper[4902]: I1009 13:51:39.773964 4902 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-d5zks" podUID="51ad1076-0ca9-4765-bd88-98f4cba434b6" containerName="console" probeResult="failure" output="Get \"https://10.217.0.11:8443/health\": dial tcp 10.217.0.11:8443: connect: connection refused" Oct 09 13:51:39 crc kubenswrapper[4902]: I1009 13:51:39.774209 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4cxxv" Oct 09 13:51:39 crc kubenswrapper[4902]: I1009 13:51:39.782338 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/583d6403-9f97-4d51-9a43-f0d2fedf80f2-utilities\") pod \"redhat-marketplace-p72x4\" (UID: \"583d6403-9f97-4d51-9a43-f0d2fedf80f2\") " pod="openshift-marketplace/redhat-marketplace-p72x4" Oct 09 13:51:39 crc kubenswrapper[4902]: I1009 13:51:39.782440 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q76zz\" (UID: \"836dca77-634b-42e7-bf76-74b582e0969d\") " pod="openshift-image-registry/image-registry-697d97f7c8-q76zz" Oct 09 13:51:39 crc kubenswrapper[4902]: I1009 13:51:39.782475 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/583d6403-9f97-4d51-9a43-f0d2fedf80f2-catalog-content\") pod \"redhat-marketplace-p72x4\" (UID: \"583d6403-9f97-4d51-9a43-f0d2fedf80f2\") " pod="openshift-marketplace/redhat-marketplace-p72x4" Oct 09 13:51:39 crc kubenswrapper[4902]: I1009 13:51:39.782516 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xrmm2\" (UniqueName: \"kubernetes.io/projected/583d6403-9f97-4d51-9a43-f0d2fedf80f2-kube-api-access-xrmm2\") pod \"redhat-marketplace-p72x4\" (UID: \"583d6403-9f97-4d51-9a43-f0d2fedf80f2\") " pod="openshift-marketplace/redhat-marketplace-p72x4" Oct 09 13:51:39 crc kubenswrapper[4902]: I1009 13:51:39.783294 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/583d6403-9f97-4d51-9a43-f0d2fedf80f2-utilities\") pod \"redhat-marketplace-p72x4\" (UID: \"583d6403-9f97-4d51-9a43-f0d2fedf80f2\") " pod="openshift-marketplace/redhat-marketplace-p72x4" Oct 09 13:51:39 crc kubenswrapper[4902]: I1009 13:51:39.783596 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/583d6403-9f97-4d51-9a43-f0d2fedf80f2-catalog-content\") pod \"redhat-marketplace-p72x4\" (UID: \"583d6403-9f97-4d51-9a43-f0d2fedf80f2\") " pod="openshift-marketplace/redhat-marketplace-p72x4" Oct 09 13:51:39 crc kubenswrapper[4902]: I1009 13:51:39.789911 4902 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-mzkkz" Oct 09 13:51:39 crc kubenswrapper[4902]: I1009 13:51:39.789973 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-mzkkz" Oct 09 13:51:39 crc kubenswrapper[4902]: I1009 13:51:39.790517 4902 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Oct 09 13:51:39 crc kubenswrapper[4902]: I1009 13:51:39.790569 4902 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q76zz\" (UID: \"836dca77-634b-42e7-bf76-74b582e0969d\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-q76zz" Oct 09 13:51:39 crc kubenswrapper[4902]: I1009 13:51:39.800845 4902 patch_prober.go:28] interesting pod/apiserver-76f77b778f-mzkkz container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Oct 09 13:51:39 crc kubenswrapper[4902]: [+]log ok Oct 09 13:51:39 crc kubenswrapper[4902]: [+]etcd ok Oct 09 13:51:39 crc kubenswrapper[4902]: [+]poststarthook/start-apiserver-admission-initializer ok Oct 09 13:51:39 crc kubenswrapper[4902]: [+]poststarthook/generic-apiserver-start-informers ok Oct 09 13:51:39 crc kubenswrapper[4902]: [+]poststarthook/max-in-flight-filter ok Oct 09 13:51:39 crc kubenswrapper[4902]: [+]poststarthook/storage-object-count-tracker-hook ok Oct 09 13:51:39 crc kubenswrapper[4902]: [+]poststarthook/image.openshift.io-apiserver-caches ok Oct 09 13:51:39 crc kubenswrapper[4902]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Oct 09 13:51:39 crc kubenswrapper[4902]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld Oct 09 13:51:39 crc kubenswrapper[4902]: [+]poststarthook/project.openshift.io-projectcache ok Oct 09 13:51:39 crc kubenswrapper[4902]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Oct 09 13:51:39 crc kubenswrapper[4902]: [+]poststarthook/openshift.io-startinformers ok Oct 09 13:51:39 crc kubenswrapper[4902]: [+]poststarthook/openshift.io-restmapperupdater ok Oct 09 13:51:39 crc kubenswrapper[4902]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Oct 09 13:51:39 crc kubenswrapper[4902]: livez check failed Oct 09 13:51:39 crc kubenswrapper[4902]: I1009 13:51:39.800916 4902 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-mzkkz" podUID="e714667b-061d-4127-8dd3-47e403ebe079" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 09 13:51:39 crc kubenswrapper[4902]: I1009 13:51:39.804022 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xrmm2\" (UniqueName: \"kubernetes.io/projected/583d6403-9f97-4d51-9a43-f0d2fedf80f2-kube-api-access-xrmm2\") pod \"redhat-marketplace-p72x4\" (UID: \"583d6403-9f97-4d51-9a43-f0d2fedf80f2\") " pod="openshift-marketplace/redhat-marketplace-p72x4" Oct 09 13:51:39 crc kubenswrapper[4902]: I1009 13:51:39.839944 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q76zz\" (UID: \"836dca77-634b-42e7-bf76-74b582e0969d\") " pod="openshift-image-registry/image-registry-697d97f7c8-q76zz" Oct 09 13:51:39 crc kubenswrapper[4902]: I1009 13:51:39.928144 4902 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-p6dtr"] Oct 09 13:51:39 crc kubenswrapper[4902]: W1009 13:51:39.968119 4902 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2a1edf05_690e_463e_8086_e4ba20653475.slice/crio-cd01274f326fb7c79cc596a79b498221ec833e99ea9720ae16a6ca3c558063e9 WatchSource:0}: Error finding container cd01274f326fb7c79cc596a79b498221ec833e99ea9720ae16a6ca3c558063e9: Status 404 returned error can't find the container with id cd01274f326fb7c79cc596a79b498221ec833e99ea9720ae16a6ca3c558063e9 Oct 09 13:51:39 crc kubenswrapper[4902]: I1009 13:51:39.981953 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-p72x4" Oct 09 13:51:39 crc kubenswrapper[4902]: I1009 13:51:39.999953 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-q76zz" Oct 09 13:51:40 crc kubenswrapper[4902]: I1009 13:51:40.249484 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-p72x4"] Oct 09 13:51:40 crc kubenswrapper[4902]: W1009 13:51:40.274258 4902 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod583d6403_9f97_4d51_9a43_f0d2fedf80f2.slice/crio-3c16c60cc3aa247ac868ab81c900bee02f27fa14f4e8781719b7045b41742306 WatchSource:0}: Error finding container 3c16c60cc3aa247ac868ab81c900bee02f27fa14f4e8781719b7045b41742306: Status 404 returned error can't find the container with id 3c16c60cc3aa247ac868ab81c900bee02f27fa14f4e8781719b7045b41742306 Oct 09 13:51:40 crc kubenswrapper[4902]: I1009 13:51:40.304222 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-q76zz"] Oct 09 13:51:40 crc kubenswrapper[4902]: I1009 13:51:40.314911 4902 generic.go:334] "Generic (PLEG): container finished" podID="a4f280b3-1679-425a-9ce3-c11469353c6a" containerID="816c3421edb2cd8dee557fa1db13041f34c81dadf8d148f6046e2442f6d074e2" exitCode=0 Oct 09 13:51:40 crc kubenswrapper[4902]: I1009 13:51:40.315006 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"a4f280b3-1679-425a-9ce3-c11469353c6a","Type":"ContainerDied","Data":"816c3421edb2cd8dee557fa1db13041f34c81dadf8d148f6046e2442f6d074e2"} Oct 09 13:51:40 crc kubenswrapper[4902]: I1009 13:51:40.319061 4902 generic.go:334] "Generic (PLEG): container finished" podID="2a1edf05-690e-463e-8086-e4ba20653475" containerID="e7f7ac8c90626093b32cf0478e3e46601110f49a78d5a1ff0c4bc5d59c5b62dd" exitCode=0 Oct 09 13:51:40 crc kubenswrapper[4902]: I1009 13:51:40.319213 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p6dtr" event={"ID":"2a1edf05-690e-463e-8086-e4ba20653475","Type":"ContainerDied","Data":"e7f7ac8c90626093b32cf0478e3e46601110f49a78d5a1ff0c4bc5d59c5b62dd"} Oct 09 13:51:40 crc kubenswrapper[4902]: I1009 13:51:40.319260 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p6dtr" event={"ID":"2a1edf05-690e-463e-8086-e4ba20653475","Type":"ContainerStarted","Data":"cd01274f326fb7c79cc596a79b498221ec833e99ea9720ae16a6ca3c558063e9"} Oct 09 13:51:40 crc kubenswrapper[4902]: I1009 13:51:40.323096 4902 generic.go:334] "Generic (PLEG): container finished" 
podID="472a3084-0b59-487f-b179-bfe5fa35f4a9" containerID="435a2b57957251a2cdf6ece37fdecfb6425022ee4dc2dd66661fd282f4db9451" exitCode=0 Oct 09 13:51:40 crc kubenswrapper[4902]: I1009 13:51:40.323837 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29333625-8hwfc" event={"ID":"472a3084-0b59-487f-b179-bfe5fa35f4a9","Type":"ContainerDied","Data":"435a2b57957251a2cdf6ece37fdecfb6425022ee4dc2dd66661fd282f4db9451"} Oct 09 13:51:40 crc kubenswrapper[4902]: I1009 13:51:40.328991 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p72x4" event={"ID":"583d6403-9f97-4d51-9a43-f0d2fedf80f2","Type":"ContainerStarted","Data":"3c16c60cc3aa247ac868ab81c900bee02f27fa14f4e8781719b7045b41742306"} Oct 09 13:51:40 crc kubenswrapper[4902]: I1009 13:51:40.416022 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-8r47d"] Oct 09 13:51:40 crc kubenswrapper[4902]: I1009 13:51:40.418071 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8r47d" Oct 09 13:51:40 crc kubenswrapper[4902]: I1009 13:51:40.420981 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Oct 09 13:51:40 crc kubenswrapper[4902]: I1009 13:51:40.426966 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-qhd5t" Oct 09 13:51:40 crc kubenswrapper[4902]: I1009 13:51:40.432393 4902 patch_prober.go:28] interesting pod/router-default-5444994796-qhd5t container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 09 13:51:40 crc kubenswrapper[4902]: [-]has-synced failed: reason withheld Oct 09 13:51:40 crc kubenswrapper[4902]: [+]process-running ok Oct 09 13:51:40 crc kubenswrapper[4902]: healthz check failed Oct 09 13:51:40 crc kubenswrapper[4902]: I1009 13:51:40.433325 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8r47d"] Oct 09 13:51:40 crc kubenswrapper[4902]: I1009 13:51:40.434056 4902 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-qhd5t" podUID="6b7c78b1-850e-44f5-b50e-2e2ed4c0bb9c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 09 13:51:40 crc kubenswrapper[4902]: I1009 13:51:40.495263 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-krqkn\" (UniqueName: \"kubernetes.io/projected/45688fd0-799f-477e-ae28-ef494a1abdc5-kube-api-access-krqkn\") pod \"redhat-operators-8r47d\" (UID: \"45688fd0-799f-477e-ae28-ef494a1abdc5\") " pod="openshift-marketplace/redhat-operators-8r47d" Oct 09 13:51:40 crc kubenswrapper[4902]: I1009 13:51:40.495399 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/45688fd0-799f-477e-ae28-ef494a1abdc5-catalog-content\") pod \"redhat-operators-8r47d\" (UID: \"45688fd0-799f-477e-ae28-ef494a1abdc5\") " pod="openshift-marketplace/redhat-operators-8r47d" Oct 09 13:51:40 crc kubenswrapper[4902]: I1009 13:51:40.495443 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" 
(UniqueName: \"kubernetes.io/empty-dir/45688fd0-799f-477e-ae28-ef494a1abdc5-utilities\") pod \"redhat-operators-8r47d\" (UID: \"45688fd0-799f-477e-ae28-ef494a1abdc5\") " pod="openshift-marketplace/redhat-operators-8r47d" Oct 09 13:51:40 crc kubenswrapper[4902]: I1009 13:51:40.596666 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-krqkn\" (UniqueName: \"kubernetes.io/projected/45688fd0-799f-477e-ae28-ef494a1abdc5-kube-api-access-krqkn\") pod \"redhat-operators-8r47d\" (UID: \"45688fd0-799f-477e-ae28-ef494a1abdc5\") " pod="openshift-marketplace/redhat-operators-8r47d" Oct 09 13:51:40 crc kubenswrapper[4902]: I1009 13:51:40.596778 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/45688fd0-799f-477e-ae28-ef494a1abdc5-catalog-content\") pod \"redhat-operators-8r47d\" (UID: \"45688fd0-799f-477e-ae28-ef494a1abdc5\") " pod="openshift-marketplace/redhat-operators-8r47d" Oct 09 13:51:40 crc kubenswrapper[4902]: I1009 13:51:40.596806 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/45688fd0-799f-477e-ae28-ef494a1abdc5-utilities\") pod \"redhat-operators-8r47d\" (UID: \"45688fd0-799f-477e-ae28-ef494a1abdc5\") " pod="openshift-marketplace/redhat-operators-8r47d" Oct 09 13:51:40 crc kubenswrapper[4902]: I1009 13:51:40.597360 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/45688fd0-799f-477e-ae28-ef494a1abdc5-utilities\") pod \"redhat-operators-8r47d\" (UID: \"45688fd0-799f-477e-ae28-ef494a1abdc5\") " pod="openshift-marketplace/redhat-operators-8r47d" Oct 09 13:51:40 crc kubenswrapper[4902]: I1009 13:51:40.597988 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/45688fd0-799f-477e-ae28-ef494a1abdc5-catalog-content\") pod \"redhat-operators-8r47d\" (UID: \"45688fd0-799f-477e-ae28-ef494a1abdc5\") " pod="openshift-marketplace/redhat-operators-8r47d" Oct 09 13:51:40 crc kubenswrapper[4902]: I1009 13:51:40.614055 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-qvwhz" Oct 09 13:51:40 crc kubenswrapper[4902]: I1009 13:51:40.623764 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-krqkn\" (UniqueName: \"kubernetes.io/projected/45688fd0-799f-477e-ae28-ef494a1abdc5-kube-api-access-krqkn\") pod \"redhat-operators-8r47d\" (UID: \"45688fd0-799f-477e-ae28-ef494a1abdc5\") " pod="openshift-marketplace/redhat-operators-8r47d" Oct 09 13:51:40 crc kubenswrapper[4902]: E1009 13:51:40.629309 4902 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e9815120efe1d478b3ec0bd85d0bd2203ec391a217e75129a79aaa7221dbca07" cmd=["/bin/bash","-c","test -f /ready/ready"] Oct 09 13:51:40 crc kubenswrapper[4902]: E1009 13:51:40.632001 4902 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e9815120efe1d478b3ec0bd85d0bd2203ec391a217e75129a79aaa7221dbca07" cmd=["/bin/bash","-c","test -f /ready/ready"] Oct 09 13:51:40 crc 
kubenswrapper[4902]: E1009 13:51:40.639549 4902 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e9815120efe1d478b3ec0bd85d0bd2203ec391a217e75129a79aaa7221dbca07" cmd=["/bin/bash","-c","test -f /ready/ready"] Oct 09 13:51:40 crc kubenswrapper[4902]: E1009 13:51:40.639642 4902 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-d6wkv" podUID="93785db6-c7f9-4d9e-9407-b3653a9aa360" containerName="kube-multus-additional-cni-plugins" Oct 09 13:51:40 crc kubenswrapper[4902]: I1009 13:51:40.755513 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8r47d" Oct 09 13:51:40 crc kubenswrapper[4902]: I1009 13:51:40.818244 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-rsvbq"] Oct 09 13:51:40 crc kubenswrapper[4902]: I1009 13:51:40.823017 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rsvbq" Oct 09 13:51:40 crc kubenswrapper[4902]: I1009 13:51:40.828408 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-rsvbq"] Oct 09 13:51:40 crc kubenswrapper[4902]: I1009 13:51:40.902614 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/779ce800-3df4-4e67-8096-c2097269e1ad-utilities\") pod \"redhat-operators-rsvbq\" (UID: \"779ce800-3df4-4e67-8096-c2097269e1ad\") " pod="openshift-marketplace/redhat-operators-rsvbq" Oct 09 13:51:40 crc kubenswrapper[4902]: I1009 13:51:40.902679 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r9f92\" (UniqueName: \"kubernetes.io/projected/779ce800-3df4-4e67-8096-c2097269e1ad-kube-api-access-r9f92\") pod \"redhat-operators-rsvbq\" (UID: \"779ce800-3df4-4e67-8096-c2097269e1ad\") " pod="openshift-marketplace/redhat-operators-rsvbq" Oct 09 13:51:40 crc kubenswrapper[4902]: I1009 13:51:40.902937 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/779ce800-3df4-4e67-8096-c2097269e1ad-catalog-content\") pod \"redhat-operators-rsvbq\" (UID: \"779ce800-3df4-4e67-8096-c2097269e1ad\") " pod="openshift-marketplace/redhat-operators-rsvbq" Oct 09 13:51:40 crc kubenswrapper[4902]: I1009 13:51:40.999054 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8r47d"] Oct 09 13:51:41 crc kubenswrapper[4902]: I1009 13:51:41.004720 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/779ce800-3df4-4e67-8096-c2097269e1ad-catalog-content\") pod \"redhat-operators-rsvbq\" (UID: \"779ce800-3df4-4e67-8096-c2097269e1ad\") " pod="openshift-marketplace/redhat-operators-rsvbq" Oct 09 13:51:41 crc kubenswrapper[4902]: I1009 13:51:41.004808 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/779ce800-3df4-4e67-8096-c2097269e1ad-utilities\") pod 
\"redhat-operators-rsvbq\" (UID: \"779ce800-3df4-4e67-8096-c2097269e1ad\") " pod="openshift-marketplace/redhat-operators-rsvbq" Oct 09 13:51:41 crc kubenswrapper[4902]: I1009 13:51:41.004841 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r9f92\" (UniqueName: \"kubernetes.io/projected/779ce800-3df4-4e67-8096-c2097269e1ad-kube-api-access-r9f92\") pod \"redhat-operators-rsvbq\" (UID: \"779ce800-3df4-4e67-8096-c2097269e1ad\") " pod="openshift-marketplace/redhat-operators-rsvbq" Oct 09 13:51:41 crc kubenswrapper[4902]: I1009 13:51:41.005764 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/779ce800-3df4-4e67-8096-c2097269e1ad-catalog-content\") pod \"redhat-operators-rsvbq\" (UID: \"779ce800-3df4-4e67-8096-c2097269e1ad\") " pod="openshift-marketplace/redhat-operators-rsvbq" Oct 09 13:51:41 crc kubenswrapper[4902]: I1009 13:51:41.005796 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/779ce800-3df4-4e67-8096-c2097269e1ad-utilities\") pod \"redhat-operators-rsvbq\" (UID: \"779ce800-3df4-4e67-8096-c2097269e1ad\") " pod="openshift-marketplace/redhat-operators-rsvbq" Oct 09 13:51:41 crc kubenswrapper[4902]: W1009 13:51:41.011243 4902 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod45688fd0_799f_477e_ae28_ef494a1abdc5.slice/crio-41b9ec62880db93380e6974c63a1467e461d40806c7772f644399ae8c659aad5 WatchSource:0}: Error finding container 41b9ec62880db93380e6974c63a1467e461d40806c7772f644399ae8c659aad5: Status 404 returned error can't find the container with id 41b9ec62880db93380e6974c63a1467e461d40806c7772f644399ae8c659aad5 Oct 09 13:51:41 crc kubenswrapper[4902]: I1009 13:51:41.024180 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r9f92\" (UniqueName: \"kubernetes.io/projected/779ce800-3df4-4e67-8096-c2097269e1ad-kube-api-access-r9f92\") pod \"redhat-operators-rsvbq\" (UID: \"779ce800-3df4-4e67-8096-c2097269e1ad\") " pod="openshift-marketplace/redhat-operators-rsvbq" Oct 09 13:51:41 crc kubenswrapper[4902]: I1009 13:51:41.146779 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-rsvbq" Oct 09 13:51:41 crc kubenswrapper[4902]: I1009 13:51:41.337900 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-q76zz" event={"ID":"836dca77-634b-42e7-bf76-74b582e0969d","Type":"ContainerStarted","Data":"259c4bbb6f53b94e8b44f3fa5b3006b85b15a368ee9a05e69c683f524a41c5fd"} Oct 09 13:51:41 crc kubenswrapper[4902]: I1009 13:51:41.338469 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-q76zz" Oct 09 13:51:41 crc kubenswrapper[4902]: I1009 13:51:41.338487 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-q76zz" event={"ID":"836dca77-634b-42e7-bf76-74b582e0969d","Type":"ContainerStarted","Data":"dfe079df19624112dd38632188d2c8c87fbf6965ecd217b2e16192e5f2c5777a"} Oct 09 13:51:41 crc kubenswrapper[4902]: I1009 13:51:41.342458 4902 generic.go:334] "Generic (PLEG): container finished" podID="45688fd0-799f-477e-ae28-ef494a1abdc5" containerID="d669232d4275d19556a2dca2810b022f4f10f6af157feebfac38bc0051c3c8f4" exitCode=0 Oct 09 13:51:41 crc kubenswrapper[4902]: I1009 13:51:41.342556 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8r47d" event={"ID":"45688fd0-799f-477e-ae28-ef494a1abdc5","Type":"ContainerDied","Data":"d669232d4275d19556a2dca2810b022f4f10f6af157feebfac38bc0051c3c8f4"} Oct 09 13:51:41 crc kubenswrapper[4902]: I1009 13:51:41.342609 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8r47d" event={"ID":"45688fd0-799f-477e-ae28-ef494a1abdc5","Type":"ContainerStarted","Data":"41b9ec62880db93380e6974c63a1467e461d40806c7772f644399ae8c659aad5"} Oct 09 13:51:41 crc kubenswrapper[4902]: I1009 13:51:41.346871 4902 generic.go:334] "Generic (PLEG): container finished" podID="583d6403-9f97-4d51-9a43-f0d2fedf80f2" containerID="27a0ec78f6db1f93dd5f6d029bab752ecdc679ecbdb37a0bf048d3751a5c6b76" exitCode=0 Oct 09 13:51:41 crc kubenswrapper[4902]: I1009 13:51:41.346998 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p72x4" event={"ID":"583d6403-9f97-4d51-9a43-f0d2fedf80f2","Type":"ContainerDied","Data":"27a0ec78f6db1f93dd5f6d029bab752ecdc679ecbdb37a0bf048d3751a5c6b76"} Oct 09 13:51:41 crc kubenswrapper[4902]: I1009 13:51:41.404717 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-q76zz" podStartSLOduration=28.404695705 podStartE2EDuration="28.404695705s" podCreationTimestamp="2025-10-09 13:51:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 13:51:41.387033198 +0000 UTC m=+48.584892282" watchObservedRunningTime="2025-10-09 13:51:41.404695705 +0000 UTC m=+48.602554779" Oct 09 13:51:41 crc kubenswrapper[4902]: I1009 13:51:41.434016 4902 patch_prober.go:28] interesting pod/router-default-5444994796-qhd5t container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 09 13:51:41 crc kubenswrapper[4902]: [-]has-synced failed: reason withheld Oct 09 13:51:41 crc kubenswrapper[4902]: [+]process-running ok Oct 09 13:51:41 crc kubenswrapper[4902]: healthz check failed Oct 09 13:51:41 crc kubenswrapper[4902]: 
I1009 13:51:41.434093 4902 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-qhd5t" podUID="6b7c78b1-850e-44f5-b50e-2e2ed4c0bb9c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 09 13:51:41 crc kubenswrapper[4902]: I1009 13:51:41.535808 4902 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Oct 09 13:51:41 crc kubenswrapper[4902]: I1009 13:51:41.588379 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-rsvbq"] Oct 09 13:51:41 crc kubenswrapper[4902]: W1009 13:51:41.626240 4902 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod779ce800_3df4_4e67_8096_c2097269e1ad.slice/crio-041c018e89fb371a7c7d66b77b3468a11d58f79c2e2058493d8d4c4e957395ec WatchSource:0}: Error finding container 041c018e89fb371a7c7d66b77b3468a11d58f79c2e2058493d8d4c4e957395ec: Status 404 returned error can't find the container with id 041c018e89fb371a7c7d66b77b3468a11d58f79c2e2058493d8d4c4e957395ec Oct 09 13:51:41 crc kubenswrapper[4902]: I1009 13:51:41.704703 4902 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 09 13:51:41 crc kubenswrapper[4902]: I1009 13:51:41.718388 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a4f280b3-1679-425a-9ce3-c11469353c6a-kubelet-dir\") pod \"a4f280b3-1679-425a-9ce3-c11469353c6a\" (UID: \"a4f280b3-1679-425a-9ce3-c11469353c6a\") " Oct 09 13:51:41 crc kubenswrapper[4902]: I1009 13:51:41.718547 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a4f280b3-1679-425a-9ce3-c11469353c6a-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "a4f280b3-1679-425a-9ce3-c11469353c6a" (UID: "a4f280b3-1679-425a-9ce3-c11469353c6a"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 09 13:51:41 crc kubenswrapper[4902]: I1009 13:51:41.718845 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a4f280b3-1679-425a-9ce3-c11469353c6a-kube-api-access\") pod \"a4f280b3-1679-425a-9ce3-c11469353c6a\" (UID: \"a4f280b3-1679-425a-9ce3-c11469353c6a\") " Oct 09 13:51:41 crc kubenswrapper[4902]: I1009 13:51:41.719134 4902 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a4f280b3-1679-425a-9ce3-c11469353c6a-kubelet-dir\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:41 crc kubenswrapper[4902]: I1009 13:51:41.732559 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4f280b3-1679-425a-9ce3-c11469353c6a-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "a4f280b3-1679-425a-9ce3-c11469353c6a" (UID: "a4f280b3-1679-425a-9ce3-c11469353c6a"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 13:51:41 crc kubenswrapper[4902]: I1009 13:51:41.756089 4902 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29333625-8hwfc" Oct 09 13:51:41 crc kubenswrapper[4902]: I1009 13:51:41.819556 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/472a3084-0b59-487f-b179-bfe5fa35f4a9-config-volume\") pod \"472a3084-0b59-487f-b179-bfe5fa35f4a9\" (UID: \"472a3084-0b59-487f-b179-bfe5fa35f4a9\") " Oct 09 13:51:41 crc kubenswrapper[4902]: I1009 13:51:41.819614 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p78bs\" (UniqueName: \"kubernetes.io/projected/472a3084-0b59-487f-b179-bfe5fa35f4a9-kube-api-access-p78bs\") pod \"472a3084-0b59-487f-b179-bfe5fa35f4a9\" (UID: \"472a3084-0b59-487f-b179-bfe5fa35f4a9\") " Oct 09 13:51:41 crc kubenswrapper[4902]: I1009 13:51:41.819681 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/472a3084-0b59-487f-b179-bfe5fa35f4a9-secret-volume\") pod \"472a3084-0b59-487f-b179-bfe5fa35f4a9\" (UID: \"472a3084-0b59-487f-b179-bfe5fa35f4a9\") " Oct 09 13:51:41 crc kubenswrapper[4902]: I1009 13:51:41.821070 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/472a3084-0b59-487f-b179-bfe5fa35f4a9-config-volume" (OuterVolumeSpecName: "config-volume") pod "472a3084-0b59-487f-b179-bfe5fa35f4a9" (UID: "472a3084-0b59-487f-b179-bfe5fa35f4a9"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 13:51:41 crc kubenswrapper[4902]: I1009 13:51:41.821300 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a4f280b3-1679-425a-9ce3-c11469353c6a-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:41 crc kubenswrapper[4902]: I1009 13:51:41.821335 4902 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/472a3084-0b59-487f-b179-bfe5fa35f4a9-config-volume\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:41 crc kubenswrapper[4902]: I1009 13:51:41.826671 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/472a3084-0b59-487f-b179-bfe5fa35f4a9-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "472a3084-0b59-487f-b179-bfe5fa35f4a9" (UID: "472a3084-0b59-487f-b179-bfe5fa35f4a9"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 13:51:41 crc kubenswrapper[4902]: I1009 13:51:41.827015 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/472a3084-0b59-487f-b179-bfe5fa35f4a9-kube-api-access-p78bs" (OuterVolumeSpecName: "kube-api-access-p78bs") pod "472a3084-0b59-487f-b179-bfe5fa35f4a9" (UID: "472a3084-0b59-487f-b179-bfe5fa35f4a9"). InnerVolumeSpecName "kube-api-access-p78bs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 13:51:41 crc kubenswrapper[4902]: I1009 13:51:41.923566 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p78bs\" (UniqueName: \"kubernetes.io/projected/472a3084-0b59-487f-b179-bfe5fa35f4a9-kube-api-access-p78bs\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:41 crc kubenswrapper[4902]: I1009 13:51:41.923608 4902 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/472a3084-0b59-487f-b179-bfe5fa35f4a9-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:42 crc kubenswrapper[4902]: I1009 13:51:42.379475 4902 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29333625-8hwfc" Oct 09 13:51:42 crc kubenswrapper[4902]: I1009 13:51:42.388043 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29333625-8hwfc" event={"ID":"472a3084-0b59-487f-b179-bfe5fa35f4a9","Type":"ContainerDied","Data":"c74bdfcade9917f2a6dbe6e35e4542a8d559f019741e8db460ee0b1432772bbf"} Oct 09 13:51:42 crc kubenswrapper[4902]: I1009 13:51:42.388095 4902 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c74bdfcade9917f2a6dbe6e35e4542a8d559f019741e8db460ee0b1432772bbf" Oct 09 13:51:42 crc kubenswrapper[4902]: I1009 13:51:42.396184 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"a4f280b3-1679-425a-9ce3-c11469353c6a","Type":"ContainerDied","Data":"c3d5db4ceece10ed20df9cdcdfce6a83bd1c5d173dda63829cc35c77ffe6db1b"} Oct 09 13:51:42 crc kubenswrapper[4902]: I1009 13:51:42.396211 4902 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c3d5db4ceece10ed20df9cdcdfce6a83bd1c5d173dda63829cc35c77ffe6db1b" Oct 09 13:51:42 crc kubenswrapper[4902]: I1009 13:51:42.396228 4902 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 09 13:51:42 crc kubenswrapper[4902]: I1009 13:51:42.401252 4902 generic.go:334] "Generic (PLEG): container finished" podID="779ce800-3df4-4e67-8096-c2097269e1ad" containerID="b8c1036bcfc4ca3b0ac71c4e92628e443c9084a4232ec90228ab8fa2ee9b478f" exitCode=0 Oct 09 13:51:42 crc kubenswrapper[4902]: I1009 13:51:42.402196 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rsvbq" event={"ID":"779ce800-3df4-4e67-8096-c2097269e1ad","Type":"ContainerDied","Data":"b8c1036bcfc4ca3b0ac71c4e92628e443c9084a4232ec90228ab8fa2ee9b478f"} Oct 09 13:51:42 crc kubenswrapper[4902]: I1009 13:51:42.402221 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rsvbq" event={"ID":"779ce800-3df4-4e67-8096-c2097269e1ad","Type":"ContainerStarted","Data":"041c018e89fb371a7c7d66b77b3468a11d58f79c2e2058493d8d4c4e957395ec"} Oct 09 13:51:42 crc kubenswrapper[4902]: I1009 13:51:42.441744 4902 patch_prober.go:28] interesting pod/router-default-5444994796-qhd5t container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 09 13:51:42 crc kubenswrapper[4902]: [-]has-synced failed: reason withheld Oct 09 13:51:42 crc kubenswrapper[4902]: [+]process-running ok Oct 09 13:51:42 crc kubenswrapper[4902]: healthz check failed Oct 09 13:51:42 crc kubenswrapper[4902]: I1009 13:51:42.441826 4902 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-qhd5t" podUID="6b7c78b1-850e-44f5-b50e-2e2ed4c0bb9c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 09 13:51:42 crc kubenswrapper[4902]: I1009 13:51:42.479186 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Oct 09 13:51:42 crc kubenswrapper[4902]: E1009 13:51:42.481701 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="472a3084-0b59-487f-b179-bfe5fa35f4a9" containerName="collect-profiles" Oct 09 13:51:42 crc kubenswrapper[4902]: I1009 13:51:42.481716 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="472a3084-0b59-487f-b179-bfe5fa35f4a9" containerName="collect-profiles" Oct 09 13:51:42 crc kubenswrapper[4902]: E1009 13:51:42.481733 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4f280b3-1679-425a-9ce3-c11469353c6a" containerName="pruner" Oct 09 13:51:42 crc kubenswrapper[4902]: I1009 13:51:42.481739 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4f280b3-1679-425a-9ce3-c11469353c6a" containerName="pruner" Oct 09 13:51:42 crc kubenswrapper[4902]: I1009 13:51:42.481852 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4f280b3-1679-425a-9ce3-c11469353c6a" containerName="pruner" Oct 09 13:51:42 crc kubenswrapper[4902]: I1009 13:51:42.481866 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="472a3084-0b59-487f-b179-bfe5fa35f4a9" containerName="collect-profiles" Oct 09 13:51:42 crc kubenswrapper[4902]: I1009 13:51:42.482260 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 09 13:51:42 crc kubenswrapper[4902]: I1009 13:51:42.486995 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Oct 09 13:51:42 crc kubenswrapper[4902]: I1009 13:51:42.487272 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Oct 09 13:51:42 crc kubenswrapper[4902]: I1009 13:51:42.487481 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Oct 09 13:51:42 crc kubenswrapper[4902]: I1009 13:51:42.522297 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 09 13:51:42 crc kubenswrapper[4902]: I1009 13:51:42.553713 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Oct 09 13:51:42 crc kubenswrapper[4902]: I1009 13:51:42.638955 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5a89302a-15e3-470f-b938-cf4c1e97a44e-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"5a89302a-15e3-470f-b938-cf4c1e97a44e\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 09 13:51:42 crc kubenswrapper[4902]: I1009 13:51:42.639066 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5a89302a-15e3-470f-b938-cf4c1e97a44e-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"5a89302a-15e3-470f-b938-cf4c1e97a44e\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 09 13:51:42 crc kubenswrapper[4902]: I1009 13:51:42.741204 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5a89302a-15e3-470f-b938-cf4c1e97a44e-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"5a89302a-15e3-470f-b938-cf4c1e97a44e\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 09 13:51:42 crc kubenswrapper[4902]: I1009 13:51:42.741334 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5a89302a-15e3-470f-b938-cf4c1e97a44e-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"5a89302a-15e3-470f-b938-cf4c1e97a44e\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 09 13:51:42 crc kubenswrapper[4902]: I1009 13:51:42.741356 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5a89302a-15e3-470f-b938-cf4c1e97a44e-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"5a89302a-15e3-470f-b938-cf4c1e97a44e\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 09 13:51:42 crc kubenswrapper[4902]: I1009 13:51:42.773899 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5a89302a-15e3-470f-b938-cf4c1e97a44e-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"5a89302a-15e3-470f-b938-cf4c1e97a44e\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 09 13:51:42 crc kubenswrapper[4902]: I1009 13:51:42.825161 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 09 13:51:43 crc kubenswrapper[4902]: I1009 13:51:43.433034 4902 patch_prober.go:28] interesting pod/router-default-5444994796-qhd5t container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 09 13:51:43 crc kubenswrapper[4902]: [-]has-synced failed: reason withheld Oct 09 13:51:43 crc kubenswrapper[4902]: [+]process-running ok Oct 09 13:51:43 crc kubenswrapper[4902]: healthz check failed Oct 09 13:51:43 crc kubenswrapper[4902]: I1009 13:51:43.433653 4902 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-qhd5t" podUID="6b7c78b1-850e-44f5-b50e-2e2ed4c0bb9c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 09 13:51:43 crc kubenswrapper[4902]: I1009 13:51:43.461534 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Oct 09 13:51:43 crc kubenswrapper[4902]: W1009 13:51:43.486008 4902 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod5a89302a_15e3_470f_b938_cf4c1e97a44e.slice/crio-79cbd948bdc2d072aacf172e3a556c106250b6e03efaf0ac3670527db06f819e WatchSource:0}: Error finding container 79cbd948bdc2d072aacf172e3a556c106250b6e03efaf0ac3670527db06f819e: Status 404 returned error can't find the container with id 79cbd948bdc2d072aacf172e3a556c106250b6e03efaf0ac3670527db06f819e Oct 09 13:51:43 crc kubenswrapper[4902]: I1009 13:51:43.555108 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=1.5550901320000001 podStartE2EDuration="1.555090132s" podCreationTimestamp="2025-10-09 13:51:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 13:51:43.552835036 +0000 UTC m=+50.750694110" watchObservedRunningTime="2025-10-09 13:51:43.555090132 +0000 UTC m=+50.752949197" Oct 09 13:51:44 crc kubenswrapper[4902]: I1009 13:51:44.430616 4902 patch_prober.go:28] interesting pod/router-default-5444994796-qhd5t container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 09 13:51:44 crc kubenswrapper[4902]: [-]has-synced failed: reason withheld Oct 09 13:51:44 crc kubenswrapper[4902]: [+]process-running ok Oct 09 13:51:44 crc kubenswrapper[4902]: healthz check failed Oct 09 13:51:44 crc kubenswrapper[4902]: I1009 13:51:44.431224 4902 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-qhd5t" podUID="6b7c78b1-850e-44f5-b50e-2e2ed4c0bb9c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 09 13:51:44 crc kubenswrapper[4902]: I1009 13:51:44.432333 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"5a89302a-15e3-470f-b938-cf4c1e97a44e","Type":"ContainerStarted","Data":"79cbd948bdc2d072aacf172e3a556c106250b6e03efaf0ac3670527db06f819e"} Oct 09 13:51:44 crc kubenswrapper[4902]: I1009 13:51:44.795929 4902 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-mzkkz" Oct 09 13:51:44 crc 
kubenswrapper[4902]: I1009 13:51:44.802157 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-mzkkz" Oct 09 13:51:45 crc kubenswrapper[4902]: I1009 13:51:45.434389 4902 patch_prober.go:28] interesting pod/router-default-5444994796-qhd5t container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 09 13:51:45 crc kubenswrapper[4902]: [-]has-synced failed: reason withheld Oct 09 13:51:45 crc kubenswrapper[4902]: [+]process-running ok Oct 09 13:51:45 crc kubenswrapper[4902]: healthz check failed Oct 09 13:51:45 crc kubenswrapper[4902]: I1009 13:51:45.434472 4902 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-qhd5t" podUID="6b7c78b1-850e-44f5-b50e-2e2ed4c0bb9c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 09 13:51:45 crc kubenswrapper[4902]: I1009 13:51:45.447230 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"5a89302a-15e3-470f-b938-cf4c1e97a44e","Type":"ContainerStarted","Data":"e636f6801198692c11e68f981e56778418bf0617c3755c50f19deaf2ac35331f"} Oct 09 13:51:45 crc kubenswrapper[4902]: I1009 13:51:45.461400 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 09 13:51:45 crc kubenswrapper[4902]: I1009 13:51:45.638485 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-jcxjn" Oct 09 13:51:46 crc kubenswrapper[4902]: I1009 13:51:46.430140 4902 patch_prober.go:28] interesting pod/router-default-5444994796-qhd5t container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 09 13:51:46 crc kubenswrapper[4902]: [-]has-synced failed: reason withheld Oct 09 13:51:46 crc kubenswrapper[4902]: [+]process-running ok Oct 09 13:51:46 crc kubenswrapper[4902]: healthz check failed Oct 09 13:51:46 crc kubenswrapper[4902]: I1009 13:51:46.430320 4902 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-qhd5t" podUID="6b7c78b1-850e-44f5-b50e-2e2ed4c0bb9c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 09 13:51:46 crc kubenswrapper[4902]: I1009 13:51:46.469626 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=4.469593608 podStartE2EDuration="4.469593608s" podCreationTimestamp="2025-10-09 13:51:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 13:51:46.468457184 +0000 UTC m=+53.666316258" watchObservedRunningTime="2025-10-09 13:51:46.469593608 +0000 UTC m=+53.667452672" Oct 09 13:51:47 crc kubenswrapper[4902]: I1009 13:51:47.430173 4902 patch_prober.go:28] interesting pod/router-default-5444994796-qhd5t container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 09 13:51:47 crc kubenswrapper[4902]: [-]has-synced failed: reason withheld Oct 09 13:51:47 crc kubenswrapper[4902]: [+]process-running ok Oct 09 
13:51:47 crc kubenswrapper[4902]: healthz check failed Oct 09 13:51:47 crc kubenswrapper[4902]: I1009 13:51:47.430255 4902 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-qhd5t" podUID="6b7c78b1-850e-44f5-b50e-2e2ed4c0bb9c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 09 13:51:47 crc kubenswrapper[4902]: I1009 13:51:47.461824 4902 generic.go:334] "Generic (PLEG): container finished" podID="5a89302a-15e3-470f-b938-cf4c1e97a44e" containerID="e636f6801198692c11e68f981e56778418bf0617c3755c50f19deaf2ac35331f" exitCode=0 Oct 09 13:51:47 crc kubenswrapper[4902]: I1009 13:51:47.461883 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"5a89302a-15e3-470f-b938-cf4c1e97a44e","Type":"ContainerDied","Data":"e636f6801198692c11e68f981e56778418bf0617c3755c50f19deaf2ac35331f"} Oct 09 13:51:48 crc kubenswrapper[4902]: I1009 13:51:48.431095 4902 patch_prober.go:28] interesting pod/router-default-5444994796-qhd5t container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 09 13:51:48 crc kubenswrapper[4902]: [-]has-synced failed: reason withheld Oct 09 13:51:48 crc kubenswrapper[4902]: [+]process-running ok Oct 09 13:51:48 crc kubenswrapper[4902]: healthz check failed Oct 09 13:51:48 crc kubenswrapper[4902]: I1009 13:51:48.431198 4902 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-qhd5t" podUID="6b7c78b1-850e-44f5-b50e-2e2ed4c0bb9c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 09 13:51:49 crc kubenswrapper[4902]: I1009 13:51:49.431229 4902 patch_prober.go:28] interesting pod/router-default-5444994796-qhd5t container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 09 13:51:49 crc kubenswrapper[4902]: [+]has-synced ok Oct 09 13:51:49 crc kubenswrapper[4902]: [+]process-running ok Oct 09 13:51:49 crc kubenswrapper[4902]: healthz check failed Oct 09 13:51:49 crc kubenswrapper[4902]: I1009 13:51:49.431877 4902 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-qhd5t" podUID="6b7c78b1-850e-44f5-b50e-2e2ed4c0bb9c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 09 13:51:49 crc kubenswrapper[4902]: I1009 13:51:49.765620 4902 patch_prober.go:28] interesting pod/downloads-7954f5f757-bhzpg container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" start-of-body= Oct 09 13:51:49 crc kubenswrapper[4902]: I1009 13:51:49.765682 4902 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-bhzpg" podUID="327b6d28-9130-4476-b8f2-edaf08da45ae" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" Oct 09 13:51:49 crc kubenswrapper[4902]: I1009 13:51:49.765731 4902 patch_prober.go:28] interesting pod/downloads-7954f5f757-bhzpg container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.10:8080/\": dial tcp 
10.217.0.10:8080: connect: connection refused" start-of-body= Oct 09 13:51:49 crc kubenswrapper[4902]: I1009 13:51:49.765817 4902 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-bhzpg" podUID="327b6d28-9130-4476-b8f2-edaf08da45ae" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" Oct 09 13:51:49 crc kubenswrapper[4902]: I1009 13:51:49.766747 4902 patch_prober.go:28] interesting pod/console-f9d7485db-d5zks container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.11:8443/health\": dial tcp 10.217.0.11:8443: connect: connection refused" start-of-body= Oct 09 13:51:49 crc kubenswrapper[4902]: I1009 13:51:49.766792 4902 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-d5zks" podUID="51ad1076-0ca9-4765-bd88-98f4cba434b6" containerName="console" probeResult="failure" output="Get \"https://10.217.0.11:8443/health\": dial tcp 10.217.0.11:8443: connect: connection refused" Oct 09 13:51:50 crc kubenswrapper[4902]: I1009 13:51:50.433455 4902 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-qhd5t" Oct 09 13:51:50 crc kubenswrapper[4902]: I1009 13:51:50.436083 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-qhd5t" Oct 09 13:51:50 crc kubenswrapper[4902]: E1009 13:51:50.632536 4902 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e9815120efe1d478b3ec0bd85d0bd2203ec391a217e75129a79aaa7221dbca07" cmd=["/bin/bash","-c","test -f /ready/ready"] Oct 09 13:51:50 crc kubenswrapper[4902]: E1009 13:51:50.634956 4902 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e9815120efe1d478b3ec0bd85d0bd2203ec391a217e75129a79aaa7221dbca07" cmd=["/bin/bash","-c","test -f /ready/ready"] Oct 09 13:51:50 crc kubenswrapper[4902]: E1009 13:51:50.637330 4902 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e9815120efe1d478b3ec0bd85d0bd2203ec391a217e75129a79aaa7221dbca07" cmd=["/bin/bash","-c","test -f /ready/ready"] Oct 09 13:51:50 crc kubenswrapper[4902]: E1009 13:51:50.637454 4902 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-d6wkv" podUID="93785db6-c7f9-4d9e-9407-b3653a9aa360" containerName="kube-multus-additional-cni-plugins" Oct 09 13:51:55 crc kubenswrapper[4902]: I1009 13:51:55.748201 4902 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 09 13:51:55 crc kubenswrapper[4902]: I1009 13:51:55.900785 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5a89302a-15e3-470f-b938-cf4c1e97a44e-kubelet-dir\") pod \"5a89302a-15e3-470f-b938-cf4c1e97a44e\" (UID: \"5a89302a-15e3-470f-b938-cf4c1e97a44e\") " Oct 09 13:51:55 crc kubenswrapper[4902]: I1009 13:51:55.900919 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5a89302a-15e3-470f-b938-cf4c1e97a44e-kube-api-access\") pod \"5a89302a-15e3-470f-b938-cf4c1e97a44e\" (UID: \"5a89302a-15e3-470f-b938-cf4c1e97a44e\") " Oct 09 13:51:55 crc kubenswrapper[4902]: I1009 13:51:55.901127 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5a89302a-15e3-470f-b938-cf4c1e97a44e-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "5a89302a-15e3-470f-b938-cf4c1e97a44e" (UID: "5a89302a-15e3-470f-b938-cf4c1e97a44e"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 09 13:51:55 crc kubenswrapper[4902]: I1009 13:51:55.901327 4902 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5a89302a-15e3-470f-b938-cf4c1e97a44e-kubelet-dir\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:55 crc kubenswrapper[4902]: I1009 13:51:55.909143 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a89302a-15e3-470f-b938-cf4c1e97a44e-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "5a89302a-15e3-470f-b938-cf4c1e97a44e" (UID: "5a89302a-15e3-470f-b938-cf4c1e97a44e"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 13:51:56 crc kubenswrapper[4902]: I1009 13:51:56.002157 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5a89302a-15e3-470f-b938-cf4c1e97a44e-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 09 13:51:56 crc kubenswrapper[4902]: I1009 13:51:56.559147 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"5a89302a-15e3-470f-b938-cf4c1e97a44e","Type":"ContainerDied","Data":"79cbd948bdc2d072aacf172e3a556c106250b6e03efaf0ac3670527db06f819e"} Oct 09 13:51:56 crc kubenswrapper[4902]: I1009 13:51:56.559235 4902 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="79cbd948bdc2d072aacf172e3a556c106250b6e03efaf0ac3670527db06f819e" Oct 09 13:51:56 crc kubenswrapper[4902]: I1009 13:51:56.559192 4902 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 09 13:51:59 crc kubenswrapper[4902]: I1009 13:51:59.440137 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 13:51:59 crc kubenswrapper[4902]: I1009 13:51:59.757348 4902 patch_prober.go:28] interesting pod/downloads-7954f5f757-bhzpg container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" start-of-body= Oct 09 13:51:59 crc kubenswrapper[4902]: I1009 13:51:59.757429 4902 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-bhzpg" podUID="327b6d28-9130-4476-b8f2-edaf08da45ae" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" Oct 09 13:51:59 crc kubenswrapper[4902]: I1009 13:51:59.757353 4902 patch_prober.go:28] interesting pod/downloads-7954f5f757-bhzpg container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" start-of-body= Oct 09 13:51:59 crc kubenswrapper[4902]: I1009 13:51:59.757944 4902 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-bhzpg" podUID="327b6d28-9130-4476-b8f2-edaf08da45ae" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" Oct 09 13:51:59 crc kubenswrapper[4902]: I1009 13:51:59.758072 4902 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console/downloads-7954f5f757-bhzpg" Oct 09 13:51:59 crc kubenswrapper[4902]: I1009 13:51:59.758738 4902 patch_prober.go:28] interesting pod/downloads-7954f5f757-bhzpg container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" start-of-body= Oct 09 13:51:59 crc kubenswrapper[4902]: I1009 13:51:59.758840 4902 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-bhzpg" podUID="327b6d28-9130-4476-b8f2-edaf08da45ae" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" Oct 09 13:51:59 crc kubenswrapper[4902]: I1009 13:51:59.758982 4902 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="download-server" containerStatusID={"Type":"cri-o","ID":"3116d5a50ae6fbb7e9239a6d88a477e8b5658c0b7391ee9bc532bb77c050814a"} pod="openshift-console/downloads-7954f5f757-bhzpg" containerMessage="Container download-server failed liveness probe, will be restarted" Oct 09 13:51:59 crc kubenswrapper[4902]: I1009 13:51:59.759147 4902 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/downloads-7954f5f757-bhzpg" podUID="327b6d28-9130-4476-b8f2-edaf08da45ae" containerName="download-server" containerID="cri-o://3116d5a50ae6fbb7e9239a6d88a477e8b5658c0b7391ee9bc532bb77c050814a" gracePeriod=2 Oct 09 13:51:59 crc kubenswrapper[4902]: I1009 13:51:59.803740 4902 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-d5zks" Oct 09 13:51:59 crc 
kubenswrapper[4902]: I1009 13:51:59.808326 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-d5zks" Oct 09 13:52:00 crc kubenswrapper[4902]: I1009 13:52:00.005737 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-q76zz" Oct 09 13:52:00 crc kubenswrapper[4902]: I1009 13:52:00.590002 4902 generic.go:334] "Generic (PLEG): container finished" podID="327b6d28-9130-4476-b8f2-edaf08da45ae" containerID="3116d5a50ae6fbb7e9239a6d88a477e8b5658c0b7391ee9bc532bb77c050814a" exitCode=0 Oct 09 13:52:00 crc kubenswrapper[4902]: I1009 13:52:00.590102 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-bhzpg" event={"ID":"327b6d28-9130-4476-b8f2-edaf08da45ae","Type":"ContainerDied","Data":"3116d5a50ae6fbb7e9239a6d88a477e8b5658c0b7391ee9bc532bb77c050814a"} Oct 09 13:52:00 crc kubenswrapper[4902]: E1009 13:52:00.628731 4902 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e9815120efe1d478b3ec0bd85d0bd2203ec391a217e75129a79aaa7221dbca07" cmd=["/bin/bash","-c","test -f /ready/ready"] Oct 09 13:52:00 crc kubenswrapper[4902]: E1009 13:52:00.630136 4902 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e9815120efe1d478b3ec0bd85d0bd2203ec391a217e75129a79aaa7221dbca07" cmd=["/bin/bash","-c","test -f /ready/ready"] Oct 09 13:52:00 crc kubenswrapper[4902]: E1009 13:52:00.631293 4902 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e9815120efe1d478b3ec0bd85d0bd2203ec391a217e75129a79aaa7221dbca07" cmd=["/bin/bash","-c","test -f /ready/ready"] Oct 09 13:52:00 crc kubenswrapper[4902]: E1009 13:52:00.631494 4902 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-d6wkv" podUID="93785db6-c7f9-4d9e-9407-b3653a9aa360" containerName="kube-multus-additional-cni-plugins" Oct 09 13:52:08 crc kubenswrapper[4902]: I1009 13:52:08.531675 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Oct 09 13:52:09 crc kubenswrapper[4902]: I1009 13:52:09.646689 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-d6wkv_93785db6-c7f9-4d9e-9407-b3653a9aa360/kube-multus-additional-cni-plugins/0.log" Oct 09 13:52:09 crc kubenswrapper[4902]: I1009 13:52:09.646986 4902 generic.go:334] "Generic (PLEG): container finished" podID="93785db6-c7f9-4d9e-9407-b3653a9aa360" containerID="e9815120efe1d478b3ec0bd85d0bd2203ec391a217e75129a79aaa7221dbca07" exitCode=137 Oct 09 13:52:09 crc kubenswrapper[4902]: I1009 13:52:09.647023 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-d6wkv" event={"ID":"93785db6-c7f9-4d9e-9407-b3653a9aa360","Type":"ContainerDied","Data":"e9815120efe1d478b3ec0bd85d0bd2203ec391a217e75129a79aaa7221dbca07"} Oct 09 13:52:09 crc 
kubenswrapper[4902]: I1009 13:52:09.757154 4902 patch_prober.go:28] interesting pod/downloads-7954f5f757-bhzpg container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" start-of-body= Oct 09 13:52:09 crc kubenswrapper[4902]: I1009 13:52:09.757226 4902 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-bhzpg" podUID="327b6d28-9130-4476-b8f2-edaf08da45ae" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" Oct 09 13:52:10 crc kubenswrapper[4902]: E1009 13:52:10.627852 4902 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e9815120efe1d478b3ec0bd85d0bd2203ec391a217e75129a79aaa7221dbca07 is running failed: container process not found" containerID="e9815120efe1d478b3ec0bd85d0bd2203ec391a217e75129a79aaa7221dbca07" cmd=["/bin/bash","-c","test -f /ready/ready"] Oct 09 13:52:10 crc kubenswrapper[4902]: E1009 13:52:10.628292 4902 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e9815120efe1d478b3ec0bd85d0bd2203ec391a217e75129a79aaa7221dbca07 is running failed: container process not found" containerID="e9815120efe1d478b3ec0bd85d0bd2203ec391a217e75129a79aaa7221dbca07" cmd=["/bin/bash","-c","test -f /ready/ready"] Oct 09 13:52:10 crc kubenswrapper[4902]: E1009 13:52:10.628638 4902 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e9815120efe1d478b3ec0bd85d0bd2203ec391a217e75129a79aaa7221dbca07 is running failed: container process not found" containerID="e9815120efe1d478b3ec0bd85d0bd2203ec391a217e75129a79aaa7221dbca07" cmd=["/bin/bash","-c","test -f /ready/ready"] Oct 09 13:52:10 crc kubenswrapper[4902]: E1009 13:52:10.628711 4902 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e9815120efe1d478b3ec0bd85d0bd2203ec391a217e75129a79aaa7221dbca07 is running failed: container process not found" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-d6wkv" podUID="93785db6-c7f9-4d9e-9407-b3653a9aa360" containerName="kube-multus-additional-cni-plugins" Oct 09 13:52:10 crc kubenswrapper[4902]: I1009 13:52:10.854194 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2224d" Oct 09 13:52:10 crc kubenswrapper[4902]: I1009 13:52:10.885264 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=2.885239163 podStartE2EDuration="2.885239163s" podCreationTimestamp="2025-10-09 13:52:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 13:52:10.884510922 +0000 UTC m=+78.082369986" watchObservedRunningTime="2025-10-09 13:52:10.885239163 +0000 UTC m=+78.083098227" Oct 09 13:52:11 crc kubenswrapper[4902]: E1009 13:52:11.641613 4902 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context 
canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Oct 09 13:52:11 crc kubenswrapper[4902]: E1009 13:52:11.641884 4902 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xrmm2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-p72x4_openshift-marketplace(583d6403-9f97-4d51-9a43-f0d2fedf80f2): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 09 13:52:11 crc kubenswrapper[4902]: E1009 13:52:11.643549 4902 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-p72x4" podUID="583d6403-9f97-4d51-9a43-f0d2fedf80f2" Oct 09 13:52:14 crc kubenswrapper[4902]: E1009 13:52:14.390034 4902 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-p72x4" podUID="583d6403-9f97-4d51-9a43-f0d2fedf80f2" Oct 09 13:52:14 crc kubenswrapper[4902]: E1009 13:52:14.477670 4902 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Oct 09 13:52:14 crc kubenswrapper[4902]: E1009 13:52:14.478140 4902 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5fmh4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-p6zbr_openshift-marketplace(07049d7d-6aec-4446-bca4-51a81e2e6a29): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 09 13:52:14 crc kubenswrapper[4902]: E1009 13:52:14.481177 4902 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-p6zbr" podUID="07049d7d-6aec-4446-bca4-51a81e2e6a29" Oct 09 13:52:14 crc kubenswrapper[4902]: E1009 13:52:14.492647 4902 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Oct 09 13:52:14 crc kubenswrapper[4902]: E1009 13:52:14.492808 4902 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vbt82,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-wlbhl_openshift-marketplace(cc1a8762-c90b-4ff8-a836-47ca3d3ec932): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 09 13:52:14 crc kubenswrapper[4902]: E1009 13:52:14.495387 4902 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-wlbhl" podUID="cc1a8762-c90b-4ff8-a836-47ca3d3ec932" Oct 09 13:52:14 crc kubenswrapper[4902]: E1009 13:52:14.507536 4902 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Oct 09 13:52:14 crc kubenswrapper[4902]: E1009 13:52:14.507674 4902 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2wrdt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-p6dtr_openshift-marketplace(2a1edf05-690e-463e-8086-e4ba20653475): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 09 13:52:14 crc kubenswrapper[4902]: E1009 13:52:14.510222 4902 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-p6dtr" podUID="2a1edf05-690e-463e-8086-e4ba20653475" Oct 09 13:52:17 crc kubenswrapper[4902]: E1009 13:52:17.867932 4902 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-wlbhl" podUID="cc1a8762-c90b-4ff8-a836-47ca3d3ec932" Oct 09 13:52:17 crc kubenswrapper[4902]: E1009 13:52:17.868647 4902 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-p6dtr" podUID="2a1edf05-690e-463e-8086-e4ba20653475" Oct 09 13:52:17 crc kubenswrapper[4902]: E1009 13:52:17.868858 4902 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-p6zbr" podUID="07049d7d-6aec-4446-bca4-51a81e2e6a29" Oct 09 13:52:17 crc kubenswrapper[4902]: E1009 13:52:17.900419 4902 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Oct 09 13:52:17 crc kubenswrapper[4902]: E1009 13:52:17.900621 4902 kuberuntime_manager.go:1274] "Unhandled 
Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-r9f92,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-rsvbq_openshift-marketplace(779ce800-3df4-4e67-8096-c2097269e1ad): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 09 13:52:17 crc kubenswrapper[4902]: E1009 13:52:17.902050 4902 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-rsvbq" podUID="779ce800-3df4-4e67-8096-c2097269e1ad" Oct 09 13:52:17 crc kubenswrapper[4902]: I1009 13:52:17.965633 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-d6wkv_93785db6-c7f9-4d9e-9407-b3653a9aa360/kube-multus-additional-cni-plugins/0.log" Oct 09 13:52:17 crc kubenswrapper[4902]: I1009 13:52:17.965712 4902 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-d6wkv" Oct 09 13:52:17 crc kubenswrapper[4902]: E1009 13:52:17.978672 4902 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Oct 09 13:52:17 crc kubenswrapper[4902]: E1009 13:52:17.978954 4902 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-krqkn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-8r47d_openshift-marketplace(45688fd0-799f-477e-ae28-ef494a1abdc5): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 09 13:52:17 crc kubenswrapper[4902]: E1009 13:52:17.980173 4902 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-8r47d" podUID="45688fd0-799f-477e-ae28-ef494a1abdc5" Oct 09 13:52:18 crc kubenswrapper[4902]: I1009 13:52:18.008251 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/93785db6-c7f9-4d9e-9407-b3653a9aa360-cni-sysctl-allowlist\") pod \"93785db6-c7f9-4d9e-9407-b3653a9aa360\" (UID: \"93785db6-c7f9-4d9e-9407-b3653a9aa360\") " Oct 09 13:52:18 crc kubenswrapper[4902]: I1009 13:52:18.008295 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g72b\" (UniqueName: \"kubernetes.io/projected/93785db6-c7f9-4d9e-9407-b3653a9aa360-kube-api-access-6g72b\") pod \"93785db6-c7f9-4d9e-9407-b3653a9aa360\" (UID: \"93785db6-c7f9-4d9e-9407-b3653a9aa360\") " Oct 09 13:52:18 crc kubenswrapper[4902]: I1009 13:52:18.008355 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/93785db6-c7f9-4d9e-9407-b3653a9aa360-ready\") pod \"93785db6-c7f9-4d9e-9407-b3653a9aa360\" (UID: \"93785db6-c7f9-4d9e-9407-b3653a9aa360\") " Oct 09 13:52:18 crc kubenswrapper[4902]: I1009 13:52:18.008398 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/93785db6-c7f9-4d9e-9407-b3653a9aa360-tuning-conf-dir\") pod \"93785db6-c7f9-4d9e-9407-b3653a9aa360\" (UID: \"93785db6-c7f9-4d9e-9407-b3653a9aa360\") " Oct 09 13:52:18 crc kubenswrapper[4902]: I1009 13:52:18.008742 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/93785db6-c7f9-4d9e-9407-b3653a9aa360-tuning-conf-dir" (OuterVolumeSpecName: "tuning-conf-dir") pod "93785db6-c7f9-4d9e-9407-b3653a9aa360" (UID: "93785db6-c7f9-4d9e-9407-b3653a9aa360"). InnerVolumeSpecName "tuning-conf-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 09 13:52:18 crc kubenswrapper[4902]: I1009 13:52:18.009004 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/93785db6-c7f9-4d9e-9407-b3653a9aa360-ready" (OuterVolumeSpecName: "ready") pod "93785db6-c7f9-4d9e-9407-b3653a9aa360" (UID: "93785db6-c7f9-4d9e-9407-b3653a9aa360"). InnerVolumeSpecName "ready". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 13:52:18 crc kubenswrapper[4902]: I1009 13:52:18.009363 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/93785db6-c7f9-4d9e-9407-b3653a9aa360-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "93785db6-c7f9-4d9e-9407-b3653a9aa360" (UID: "93785db6-c7f9-4d9e-9407-b3653a9aa360"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 13:52:18 crc kubenswrapper[4902]: I1009 13:52:18.020370 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93785db6-c7f9-4d9e-9407-b3653a9aa360-kube-api-access-6g72b" (OuterVolumeSpecName: "kube-api-access-6g72b") pod "93785db6-c7f9-4d9e-9407-b3653a9aa360" (UID: "93785db6-c7f9-4d9e-9407-b3653a9aa360"). InnerVolumeSpecName "kube-api-access-6g72b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 13:52:18 crc kubenswrapper[4902]: I1009 13:52:18.110373 4902 reconciler_common.go:293] "Volume detached for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/93785db6-c7f9-4d9e-9407-b3653a9aa360-ready\") on node \"crc\" DevicePath \"\"" Oct 09 13:52:18 crc kubenswrapper[4902]: I1009 13:52:18.110752 4902 reconciler_common.go:293] "Volume detached for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/93785db6-c7f9-4d9e-9407-b3653a9aa360-tuning-conf-dir\") on node \"crc\" DevicePath \"\"" Oct 09 13:52:18 crc kubenswrapper[4902]: I1009 13:52:18.110775 4902 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/93785db6-c7f9-4d9e-9407-b3653a9aa360-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Oct 09 13:52:18 crc kubenswrapper[4902]: I1009 13:52:18.110790 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g72b\" (UniqueName: \"kubernetes.io/projected/93785db6-c7f9-4d9e-9407-b3653a9aa360-kube-api-access-6g72b\") on node \"crc\" DevicePath \"\"" Oct 09 13:52:18 crc kubenswrapper[4902]: I1009 13:52:18.704842 4902 generic.go:334] "Generic (PLEG): container finished" podID="c5f26005-88e7-4f48-917e-5b2696925564" containerID="98271019d20d7927ef7e4729b8153afc5a0f4e768b33cf0384e82f58fd8555a2" exitCode=0 Oct 09 13:52:18 crc kubenswrapper[4902]: I1009 13:52:18.704915 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jqjng" event={"ID":"c5f26005-88e7-4f48-917e-5b2696925564","Type":"ContainerDied","Data":"98271019d20d7927ef7e4729b8153afc5a0f4e768b33cf0384e82f58fd8555a2"} Oct 09 13:52:18 crc kubenswrapper[4902]: I1009 13:52:18.707621 4902 generic.go:334] "Generic (PLEG): container finished" podID="3038206a-a63d-4cde-9d0e-9549cfb95ad7" containerID="a4c911f894e25315a759ec91ed0eb0ab08b307b54154a2967952d35afa0fcbe9" exitCode=0 Oct 09 13:52:18 crc kubenswrapper[4902]: I1009 13:52:18.707680 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-56j52" event={"ID":"3038206a-a63d-4cde-9d0e-9549cfb95ad7","Type":"ContainerDied","Data":"a4c911f894e25315a759ec91ed0eb0ab08b307b54154a2967952d35afa0fcbe9"} Oct 09 13:52:18 crc kubenswrapper[4902]: I1009 13:52:18.716659 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-bhzpg" event={"ID":"327b6d28-9130-4476-b8f2-edaf08da45ae","Type":"ContainerStarted","Data":"1ab61101ad6758cea811201ab29a4ebcd60b747aa7bd613a71de4b14da08db06"} Oct 09 13:52:18 crc kubenswrapper[4902]: I1009 13:52:18.717154 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-bhzpg" Oct 09 13:52:18 crc kubenswrapper[4902]: I1009 13:52:18.718210 4902 patch_prober.go:28] interesting pod/downloads-7954f5f757-bhzpg container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" start-of-body= Oct 09 13:52:18 crc kubenswrapper[4902]: I1009 13:52:18.718274 4902 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-bhzpg" podUID="327b6d28-9130-4476-b8f2-edaf08da45ae" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" Oct 09 13:52:18 crc kubenswrapper[4902]: I1009 
13:52:18.719856 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-d6wkv_93785db6-c7f9-4d9e-9407-b3653a9aa360/kube-multus-additional-cni-plugins/0.log" Oct 09 13:52:18 crc kubenswrapper[4902]: I1009 13:52:18.720366 4902 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-d6wkv" Oct 09 13:52:18 crc kubenswrapper[4902]: I1009 13:52:18.721859 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-d6wkv" event={"ID":"93785db6-c7f9-4d9e-9407-b3653a9aa360","Type":"ContainerDied","Data":"511e953cb4ca50142681d4f1852d0ddeab44c811a6e29a1e90561d5d0f6250e3"} Oct 09 13:52:18 crc kubenswrapper[4902]: I1009 13:52:18.721949 4902 scope.go:117] "RemoveContainer" containerID="e9815120efe1d478b3ec0bd85d0bd2203ec391a217e75129a79aaa7221dbca07" Oct 09 13:52:18 crc kubenswrapper[4902]: E1009 13:52:18.723638 4902 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-rsvbq" podUID="779ce800-3df4-4e67-8096-c2097269e1ad" Oct 09 13:52:18 crc kubenswrapper[4902]: E1009 13:52:18.723498 4902 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-8r47d" podUID="45688fd0-799f-477e-ae28-ef494a1abdc5" Oct 09 13:52:18 crc kubenswrapper[4902]: I1009 13:52:18.813559 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-d6wkv"] Oct 09 13:52:18 crc kubenswrapper[4902]: I1009 13:52:18.823334 4902 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-d6wkv"] Oct 09 13:52:19 crc kubenswrapper[4902]: I1009 13:52:19.521857 4902 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="93785db6-c7f9-4d9e-9407-b3653a9aa360" path="/var/lib/kubelet/pods/93785db6-c7f9-4d9e-9407-b3653a9aa360/volumes" Oct 09 13:52:19 crc kubenswrapper[4902]: I1009 13:52:19.734037 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-56j52" event={"ID":"3038206a-a63d-4cde-9d0e-9549cfb95ad7","Type":"ContainerStarted","Data":"e3a4a4633be1c4640b8111acd8147d1ca4dc75ed2ff6e7c9ecbec5f56fc1c48f"} Oct 09 13:52:19 crc kubenswrapper[4902]: I1009 13:52:19.740016 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jqjng" event={"ID":"c5f26005-88e7-4f48-917e-5b2696925564","Type":"ContainerStarted","Data":"beb4d9af4a5f2f471e206e6ad5608e404f1b6b9635566f4306bed9c54bfec52a"} Oct 09 13:52:19 crc kubenswrapper[4902]: I1009 13:52:19.740526 4902 patch_prober.go:28] interesting pod/downloads-7954f5f757-bhzpg container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" start-of-body= Oct 09 13:52:19 crc kubenswrapper[4902]: I1009 13:52:19.740565 4902 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-bhzpg" podUID="327b6d28-9130-4476-b8f2-edaf08da45ae" containerName="download-server" probeResult="failure" output="Get 
\"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" Oct 09 13:52:19 crc kubenswrapper[4902]: I1009 13:52:19.755325 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-56j52" podStartSLOduration=2.895266789 podStartE2EDuration="42.755303846s" podCreationTimestamp="2025-10-09 13:51:37 +0000 UTC" firstStartedPulling="2025-10-09 13:51:39.297486494 +0000 UTC m=+46.495345558" lastFinishedPulling="2025-10-09 13:52:19.157523501 +0000 UTC m=+86.355382615" observedRunningTime="2025-10-09 13:52:19.753189484 +0000 UTC m=+86.951048558" watchObservedRunningTime="2025-10-09 13:52:19.755303846 +0000 UTC m=+86.953162930" Oct 09 13:52:19 crc kubenswrapper[4902]: I1009 13:52:19.756744 4902 patch_prober.go:28] interesting pod/downloads-7954f5f757-bhzpg container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" start-of-body= Oct 09 13:52:19 crc kubenswrapper[4902]: I1009 13:52:19.756813 4902 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-bhzpg" podUID="327b6d28-9130-4476-b8f2-edaf08da45ae" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" Oct 09 13:52:19 crc kubenswrapper[4902]: I1009 13:52:19.758501 4902 patch_prober.go:28] interesting pod/downloads-7954f5f757-bhzpg container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" start-of-body= Oct 09 13:52:19 crc kubenswrapper[4902]: I1009 13:52:19.758574 4902 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-bhzpg" podUID="327b6d28-9130-4476-b8f2-edaf08da45ae" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" Oct 09 13:52:27 crc kubenswrapper[4902]: I1009 13:52:27.571970 4902 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-56j52" Oct 09 13:52:27 crc kubenswrapper[4902]: I1009 13:52:27.572495 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-56j52" Oct 09 13:52:27 crc kubenswrapper[4902]: I1009 13:52:27.727548 4902 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-56j52" Oct 09 13:52:27 crc kubenswrapper[4902]: I1009 13:52:27.751205 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-jqjng" podStartSLOduration=10.666602101 podStartE2EDuration="50.751153477s" podCreationTimestamp="2025-10-09 13:51:37 +0000 UTC" firstStartedPulling="2025-10-09 13:51:39.265562139 +0000 UTC m=+46.463421203" lastFinishedPulling="2025-10-09 13:52:19.350113505 +0000 UTC m=+86.547972579" observedRunningTime="2025-10-09 13:52:19.774239157 +0000 UTC m=+86.972098231" watchObservedRunningTime="2025-10-09 13:52:27.751153477 +0000 UTC m=+94.949012571" Oct 09 13:52:27 crc kubenswrapper[4902]: I1009 13:52:27.847261 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-56j52" Oct 09 13:52:28 crc kubenswrapper[4902]: I1009 
13:52:28.142608 4902 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-jqjng" Oct 09 13:52:28 crc kubenswrapper[4902]: I1009 13:52:28.142695 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-jqjng" Oct 09 13:52:28 crc kubenswrapper[4902]: I1009 13:52:28.201347 4902 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-jqjng" Oct 09 13:52:28 crc kubenswrapper[4902]: I1009 13:52:28.851193 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-jqjng" Oct 09 13:52:29 crc kubenswrapper[4902]: I1009 13:52:29.371146 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-jqjng"] Oct 09 13:52:29 crc kubenswrapper[4902]: I1009 13:52:29.763575 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-bhzpg" Oct 09 13:52:29 crc kubenswrapper[4902]: I1009 13:52:29.816867 4902 generic.go:334] "Generic (PLEG): container finished" podID="583d6403-9f97-4d51-9a43-f0d2fedf80f2" containerID="a462a785a6d4b31a273e133ac761141a994cc879be8754fd6221a2df0ebddcef" exitCode=0 Oct 09 13:52:29 crc kubenswrapper[4902]: I1009 13:52:29.817032 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p72x4" event={"ID":"583d6403-9f97-4d51-9a43-f0d2fedf80f2","Type":"ContainerDied","Data":"a462a785a6d4b31a273e133ac761141a994cc879be8754fd6221a2df0ebddcef"} Oct 09 13:52:30 crc kubenswrapper[4902]: I1009 13:52:30.823694 4902 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-jqjng" podUID="c5f26005-88e7-4f48-917e-5b2696925564" containerName="registry-server" containerID="cri-o://beb4d9af4a5f2f471e206e6ad5608e404f1b6b9635566f4306bed9c54bfec52a" gracePeriod=2 Oct 09 13:52:31 crc kubenswrapper[4902]: I1009 13:52:31.834605 4902 generic.go:334] "Generic (PLEG): container finished" podID="c5f26005-88e7-4f48-917e-5b2696925564" containerID="beb4d9af4a5f2f471e206e6ad5608e404f1b6b9635566f4306bed9c54bfec52a" exitCode=0 Oct 09 13:52:31 crc kubenswrapper[4902]: I1009 13:52:31.834685 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jqjng" event={"ID":"c5f26005-88e7-4f48-917e-5b2696925564","Type":"ContainerDied","Data":"beb4d9af4a5f2f471e206e6ad5608e404f1b6b9635566f4306bed9c54bfec52a"} Oct 09 13:52:36 crc kubenswrapper[4902]: I1009 13:52:36.376792 4902 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-jqjng" Oct 09 13:52:36 crc kubenswrapper[4902]: I1009 13:52:36.474996 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c5f26005-88e7-4f48-917e-5b2696925564-utilities\") pod \"c5f26005-88e7-4f48-917e-5b2696925564\" (UID: \"c5f26005-88e7-4f48-917e-5b2696925564\") " Oct 09 13:52:36 crc kubenswrapper[4902]: I1009 13:52:36.475052 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c5f26005-88e7-4f48-917e-5b2696925564-catalog-content\") pod \"c5f26005-88e7-4f48-917e-5b2696925564\" (UID: \"c5f26005-88e7-4f48-917e-5b2696925564\") " Oct 09 13:52:36 crc kubenswrapper[4902]: I1009 13:52:36.475088 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tlq62\" (UniqueName: \"kubernetes.io/projected/c5f26005-88e7-4f48-917e-5b2696925564-kube-api-access-tlq62\") pod \"c5f26005-88e7-4f48-917e-5b2696925564\" (UID: \"c5f26005-88e7-4f48-917e-5b2696925564\") " Oct 09 13:52:36 crc kubenswrapper[4902]: I1009 13:52:36.475951 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c5f26005-88e7-4f48-917e-5b2696925564-utilities" (OuterVolumeSpecName: "utilities") pod "c5f26005-88e7-4f48-917e-5b2696925564" (UID: "c5f26005-88e7-4f48-917e-5b2696925564"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 13:52:36 crc kubenswrapper[4902]: I1009 13:52:36.497230 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5f26005-88e7-4f48-917e-5b2696925564-kube-api-access-tlq62" (OuterVolumeSpecName: "kube-api-access-tlq62") pod "c5f26005-88e7-4f48-917e-5b2696925564" (UID: "c5f26005-88e7-4f48-917e-5b2696925564"). InnerVolumeSpecName "kube-api-access-tlq62". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 13:52:36 crc kubenswrapper[4902]: I1009 13:52:36.531949 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c5f26005-88e7-4f48-917e-5b2696925564-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c5f26005-88e7-4f48-917e-5b2696925564" (UID: "c5f26005-88e7-4f48-917e-5b2696925564"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 13:52:36 crc kubenswrapper[4902]: I1009 13:52:36.576595 4902 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c5f26005-88e7-4f48-917e-5b2696925564-utilities\") on node \"crc\" DevicePath \"\"" Oct 09 13:52:36 crc kubenswrapper[4902]: I1009 13:52:36.576638 4902 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c5f26005-88e7-4f48-917e-5b2696925564-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 09 13:52:36 crc kubenswrapper[4902]: I1009 13:52:36.576656 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tlq62\" (UniqueName: \"kubernetes.io/projected/c5f26005-88e7-4f48-917e-5b2696925564-kube-api-access-tlq62\") on node \"crc\" DevicePath \"\"" Oct 09 13:52:36 crc kubenswrapper[4902]: I1009 13:52:36.867461 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jqjng" event={"ID":"c5f26005-88e7-4f48-917e-5b2696925564","Type":"ContainerDied","Data":"de7b2073840f35f700c074f6ebd40cee4dcd00238342975ab752869523b8c6eb"} Oct 09 13:52:36 crc kubenswrapper[4902]: I1009 13:52:36.867516 4902 scope.go:117] "RemoveContainer" containerID="beb4d9af4a5f2f471e206e6ad5608e404f1b6b9635566f4306bed9c54bfec52a" Oct 09 13:52:36 crc kubenswrapper[4902]: I1009 13:52:36.867561 4902 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jqjng" Oct 09 13:52:36 crc kubenswrapper[4902]: I1009 13:52:36.901534 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-jqjng"] Oct 09 13:52:36 crc kubenswrapper[4902]: I1009 13:52:36.901584 4902 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-jqjng"] Oct 09 13:52:37 crc kubenswrapper[4902]: I1009 13:52:37.241231 4902 scope.go:117] "RemoveContainer" containerID="98271019d20d7927ef7e4729b8153afc5a0f4e768b33cf0384e82f58fd8555a2" Oct 09 13:52:37 crc kubenswrapper[4902]: I1009 13:52:37.274854 4902 scope.go:117] "RemoveContainer" containerID="4a53fd2633d3e59b7a645fc802ba7bbcb73ed35591d475b0c17f34bf725b9ad8" Oct 09 13:52:37 crc kubenswrapper[4902]: I1009 13:52:37.520457 4902 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c5f26005-88e7-4f48-917e-5b2696925564" path="/var/lib/kubelet/pods/c5f26005-88e7-4f48-917e-5b2696925564/volumes" Oct 09 13:52:37 crc kubenswrapper[4902]: I1009 13:52:37.874066 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wlbhl" event={"ID":"cc1a8762-c90b-4ff8-a836-47ca3d3ec932","Type":"ContainerStarted","Data":"bfcc06dc33ffb3e56b92a3ee24787960cc51a159657668b944f2309f5617962f"} Oct 09 13:52:37 crc kubenswrapper[4902]: I1009 13:52:37.876532 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p72x4" event={"ID":"583d6403-9f97-4d51-9a43-f0d2fedf80f2","Type":"ContainerStarted","Data":"7288abee2650e0b02facf3e853632ac1f713159a86eb3f2726d66deafea24336"} Oct 09 13:52:37 crc kubenswrapper[4902]: I1009 13:52:37.878154 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rsvbq" event={"ID":"779ce800-3df4-4e67-8096-c2097269e1ad","Type":"ContainerStarted","Data":"a9a2298c9c3178cc4c750a94b6c40932f39b9edae3080882662396ed14fa7b1b"} Oct 09 13:52:37 crc kubenswrapper[4902]: I1009 13:52:37.881695 4902 
generic.go:334] "Generic (PLEG): container finished" podID="2a1edf05-690e-463e-8086-e4ba20653475" containerID="ce18f2474ec9cc9a5117f1f84f09a69747c92868e2c805bf26f636d6df93852f" exitCode=0 Oct 09 13:52:37 crc kubenswrapper[4902]: I1009 13:52:37.881786 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p6dtr" event={"ID":"2a1edf05-690e-463e-8086-e4ba20653475","Type":"ContainerDied","Data":"ce18f2474ec9cc9a5117f1f84f09a69747c92868e2c805bf26f636d6df93852f"} Oct 09 13:52:37 crc kubenswrapper[4902]: I1009 13:52:37.883878 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8r47d" event={"ID":"45688fd0-799f-477e-ae28-ef494a1abdc5","Type":"ContainerStarted","Data":"398ccb668b391ae1688821308e4ff4b2bd0d8e2eede3e49a7e51b45870c2c287"} Oct 09 13:52:37 crc kubenswrapper[4902]: I1009 13:52:37.967048 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-p72x4" podStartSLOduration=3.02954252 podStartE2EDuration="58.967028502s" podCreationTimestamp="2025-10-09 13:51:39 +0000 UTC" firstStartedPulling="2025-10-09 13:51:41.350859088 +0000 UTC m=+48.548718152" lastFinishedPulling="2025-10-09 13:52:37.28834507 +0000 UTC m=+104.486204134" observedRunningTime="2025-10-09 13:52:37.964125106 +0000 UTC m=+105.161984170" watchObservedRunningTime="2025-10-09 13:52:37.967028502 +0000 UTC m=+105.164887566" Oct 09 13:52:38 crc kubenswrapper[4902]: I1009 13:52:38.892153 4902 generic.go:334] "Generic (PLEG): container finished" podID="07049d7d-6aec-4446-bca4-51a81e2e6a29" containerID="b3ce1e0c07bbf7a24a53f9e5a10efaa9e6becc321b6ecbb24402365aa99b0e6d" exitCode=0 Oct 09 13:52:38 crc kubenswrapper[4902]: I1009 13:52:38.892371 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p6zbr" event={"ID":"07049d7d-6aec-4446-bca4-51a81e2e6a29","Type":"ContainerDied","Data":"b3ce1e0c07bbf7a24a53f9e5a10efaa9e6becc321b6ecbb24402365aa99b0e6d"} Oct 09 13:52:38 crc kubenswrapper[4902]: I1009 13:52:38.895047 4902 generic.go:334] "Generic (PLEG): container finished" podID="779ce800-3df4-4e67-8096-c2097269e1ad" containerID="a9a2298c9c3178cc4c750a94b6c40932f39b9edae3080882662396ed14fa7b1b" exitCode=0 Oct 09 13:52:38 crc kubenswrapper[4902]: I1009 13:52:38.895113 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rsvbq" event={"ID":"779ce800-3df4-4e67-8096-c2097269e1ad","Type":"ContainerDied","Data":"a9a2298c9c3178cc4c750a94b6c40932f39b9edae3080882662396ed14fa7b1b"} Oct 09 13:52:38 crc kubenswrapper[4902]: I1009 13:52:38.898021 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p6dtr" event={"ID":"2a1edf05-690e-463e-8086-e4ba20653475","Type":"ContainerStarted","Data":"434c1a1ea034644b9c54242fb1255c4ad4405a084670625a801b2acf377d1676"} Oct 09 13:52:38 crc kubenswrapper[4902]: I1009 13:52:38.905205 4902 generic.go:334] "Generic (PLEG): container finished" podID="45688fd0-799f-477e-ae28-ef494a1abdc5" containerID="398ccb668b391ae1688821308e4ff4b2bd0d8e2eede3e49a7e51b45870c2c287" exitCode=0 Oct 09 13:52:38 crc kubenswrapper[4902]: I1009 13:52:38.905285 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8r47d" event={"ID":"45688fd0-799f-477e-ae28-ef494a1abdc5","Type":"ContainerDied","Data":"398ccb668b391ae1688821308e4ff4b2bd0d8e2eede3e49a7e51b45870c2c287"} Oct 09 13:52:38 crc kubenswrapper[4902]: I1009 
13:52:38.911678 4902 generic.go:334] "Generic (PLEG): container finished" podID="cc1a8762-c90b-4ff8-a836-47ca3d3ec932" containerID="bfcc06dc33ffb3e56b92a3ee24787960cc51a159657668b944f2309f5617962f" exitCode=0 Oct 09 13:52:38 crc kubenswrapper[4902]: I1009 13:52:38.911716 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wlbhl" event={"ID":"cc1a8762-c90b-4ff8-a836-47ca3d3ec932","Type":"ContainerDied","Data":"bfcc06dc33ffb3e56b92a3ee24787960cc51a159657668b944f2309f5617962f"} Oct 09 13:52:38 crc kubenswrapper[4902]: I1009 13:52:38.997223 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-p6dtr" podStartSLOduration=2.005716444 podStartE2EDuration="59.997206333s" podCreationTimestamp="2025-10-09 13:51:39 +0000 UTC" firstStartedPulling="2025-10-09 13:51:40.321257631 +0000 UTC m=+47.519116695" lastFinishedPulling="2025-10-09 13:52:38.31274752 +0000 UTC m=+105.510606584" observedRunningTime="2025-10-09 13:52:38.995868033 +0000 UTC m=+106.193727097" watchObservedRunningTime="2025-10-09 13:52:38.997206333 +0000 UTC m=+106.195065387" Oct 09 13:52:39 crc kubenswrapper[4902]: I1009 13:52:39.563094 4902 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-p6dtr" Oct 09 13:52:39 crc kubenswrapper[4902]: I1009 13:52:39.563388 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-p6dtr" Oct 09 13:52:39 crc kubenswrapper[4902]: I1009 13:52:39.611402 4902 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-p6dtr" Oct 09 13:52:39 crc kubenswrapper[4902]: I1009 13:52:39.922102 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p6zbr" event={"ID":"07049d7d-6aec-4446-bca4-51a81e2e6a29","Type":"ContainerStarted","Data":"336ba39a79ec4eca31100cd4b20170eb5b552605e4ef366738511c57fe564950"} Oct 09 13:52:39 crc kubenswrapper[4902]: I1009 13:52:39.925374 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rsvbq" event={"ID":"779ce800-3df4-4e67-8096-c2097269e1ad","Type":"ContainerStarted","Data":"0591b7b7c43f46bfbd0de90d232782ba2d33e0e9c2d94e17e13658eb20221c00"} Oct 09 13:52:39 crc kubenswrapper[4902]: I1009 13:52:39.930391 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8r47d" event={"ID":"45688fd0-799f-477e-ae28-ef494a1abdc5","Type":"ContainerStarted","Data":"f9fbbf38d27d0ac740520065c7546a4608bb4822b3895f01847b6503d86504c0"} Oct 09 13:52:39 crc kubenswrapper[4902]: I1009 13:52:39.933575 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wlbhl" event={"ID":"cc1a8762-c90b-4ff8-a836-47ca3d3ec932","Type":"ContainerStarted","Data":"e4e63a739f8ab775d2f779931cfc1b9e0b7d07c0004ba8779df13229c1fa7809"} Oct 09 13:52:39 crc kubenswrapper[4902]: I1009 13:52:39.946887 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-p6zbr" podStartSLOduration=2.829318678 podStartE2EDuration="1m2.94686881s" podCreationTimestamp="2025-10-09 13:51:37 +0000 UTC" firstStartedPulling="2025-10-09 13:51:39.260569243 +0000 UTC m=+46.458428307" lastFinishedPulling="2025-10-09 13:52:39.378119375 +0000 UTC m=+106.575978439" observedRunningTime="2025-10-09 13:52:39.944434148 +0000 UTC 
m=+107.142293222" watchObservedRunningTime="2025-10-09 13:52:39.94686881 +0000 UTC m=+107.144727874" Oct 09 13:52:39 crc kubenswrapper[4902]: I1009 13:52:39.960327 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-8r47d" podStartSLOduration=1.732687389 podStartE2EDuration="59.960311478s" podCreationTimestamp="2025-10-09 13:51:40 +0000 UTC" firstStartedPulling="2025-10-09 13:51:41.344002677 +0000 UTC m=+48.541861741" lastFinishedPulling="2025-10-09 13:52:39.571626766 +0000 UTC m=+106.769485830" observedRunningTime="2025-10-09 13:52:39.960120963 +0000 UTC m=+107.157980027" watchObservedRunningTime="2025-10-09 13:52:39.960311478 +0000 UTC m=+107.158170542" Oct 09 13:52:39 crc kubenswrapper[4902]: I1009 13:52:39.982846 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-p72x4" Oct 09 13:52:39 crc kubenswrapper[4902]: I1009 13:52:39.982902 4902 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-p72x4" Oct 09 13:52:39 crc kubenswrapper[4902]: I1009 13:52:39.994684 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-rsvbq" podStartSLOduration=3.135697707 podStartE2EDuration="59.994666476s" podCreationTimestamp="2025-10-09 13:51:40 +0000 UTC" firstStartedPulling="2025-10-09 13:51:42.403334706 +0000 UTC m=+49.601193770" lastFinishedPulling="2025-10-09 13:52:39.262303465 +0000 UTC m=+106.460162539" observedRunningTime="2025-10-09 13:52:39.978159717 +0000 UTC m=+107.176018781" watchObservedRunningTime="2025-10-09 13:52:39.994666476 +0000 UTC m=+107.192525540" Oct 09 13:52:39 crc kubenswrapper[4902]: I1009 13:52:39.996145 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-wlbhl" podStartSLOduration=2.796657683 podStartE2EDuration="1m2.99613921s" podCreationTimestamp="2025-10-09 13:51:37 +0000 UTC" firstStartedPulling="2025-10-09 13:51:39.288582904 +0000 UTC m=+46.486441968" lastFinishedPulling="2025-10-09 13:52:39.488064431 +0000 UTC m=+106.685923495" observedRunningTime="2025-10-09 13:52:39.995642115 +0000 UTC m=+107.193501179" watchObservedRunningTime="2025-10-09 13:52:39.99613921 +0000 UTC m=+107.193998274" Oct 09 13:52:40 crc kubenswrapper[4902]: I1009 13:52:40.036569 4902 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-p72x4" Oct 09 13:52:40 crc kubenswrapper[4902]: I1009 13:52:40.756073 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-8r47d" Oct 09 13:52:40 crc kubenswrapper[4902]: I1009 13:52:40.756454 4902 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-8r47d" Oct 09 13:52:41 crc kubenswrapper[4902]: I1009 13:52:41.147541 4902 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-rsvbq" Oct 09 13:52:41 crc kubenswrapper[4902]: I1009 13:52:41.147793 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-rsvbq" Oct 09 13:52:41 crc kubenswrapper[4902]: I1009 13:52:41.794289 4902 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-8r47d" podUID="45688fd0-799f-477e-ae28-ef494a1abdc5" containerName="registry-server" 
probeResult="failure" output=< Oct 09 13:52:41 crc kubenswrapper[4902]: timeout: failed to connect service ":50051" within 1s Oct 09 13:52:41 crc kubenswrapper[4902]: > Oct 09 13:52:42 crc kubenswrapper[4902]: I1009 13:52:42.187192 4902 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-rsvbq" podUID="779ce800-3df4-4e67-8096-c2097269e1ad" containerName="registry-server" probeResult="failure" output=< Oct 09 13:52:42 crc kubenswrapper[4902]: timeout: failed to connect service ":50051" within 1s Oct 09 13:52:42 crc kubenswrapper[4902]: > Oct 09 13:52:47 crc kubenswrapper[4902]: I1009 13:52:47.778783 4902 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-wlbhl" Oct 09 13:52:47 crc kubenswrapper[4902]: I1009 13:52:47.779100 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-wlbhl" Oct 09 13:52:47 crc kubenswrapper[4902]: I1009 13:52:47.815338 4902 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-wlbhl" Oct 09 13:52:47 crc kubenswrapper[4902]: I1009 13:52:47.987765 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-p6zbr" Oct 09 13:52:47 crc kubenswrapper[4902]: I1009 13:52:47.988199 4902 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-p6zbr" Oct 09 13:52:48 crc kubenswrapper[4902]: I1009 13:52:48.007278 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-wlbhl" Oct 09 13:52:48 crc kubenswrapper[4902]: I1009 13:52:48.037687 4902 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-p6zbr" Oct 09 13:52:49 crc kubenswrapper[4902]: I1009 13:52:49.043439 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-p6zbr" Oct 09 13:52:49 crc kubenswrapper[4902]: I1009 13:52:49.602536 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-p6dtr" Oct 09 13:52:50 crc kubenswrapper[4902]: I1009 13:52:50.027816 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-p72x4" Oct 09 13:52:50 crc kubenswrapper[4902]: I1009 13:52:50.444109 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-p6zbr"] Oct 09 13:52:50 crc kubenswrapper[4902]: I1009 13:52:50.823879 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-jw7l8"] Oct 09 13:52:50 crc kubenswrapper[4902]: I1009 13:52:50.873265 4902 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-8r47d" Oct 09 13:52:50 crc kubenswrapper[4902]: I1009 13:52:50.924360 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-8r47d" Oct 09 13:52:51 crc kubenswrapper[4902]: I1009 13:52:51.193625 4902 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-rsvbq" Oct 09 13:52:51 crc kubenswrapper[4902]: I1009 13:52:51.236128 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/redhat-operators-rsvbq" Oct 09 13:52:51 crc kubenswrapper[4902]: I1009 13:52:51.845610 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-p72x4"] Oct 09 13:52:51 crc kubenswrapper[4902]: I1009 13:52:51.846206 4902 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-p72x4" podUID="583d6403-9f97-4d51-9a43-f0d2fedf80f2" containerName="registry-server" containerID="cri-o://7288abee2650e0b02facf3e853632ac1f713159a86eb3f2726d66deafea24336" gracePeriod=2 Oct 09 13:52:52 crc kubenswrapper[4902]: I1009 13:52:52.005779 4902 generic.go:334] "Generic (PLEG): container finished" podID="583d6403-9f97-4d51-9a43-f0d2fedf80f2" containerID="7288abee2650e0b02facf3e853632ac1f713159a86eb3f2726d66deafea24336" exitCode=0 Oct 09 13:52:52 crc kubenswrapper[4902]: I1009 13:52:52.006014 4902 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-p6zbr" podUID="07049d7d-6aec-4446-bca4-51a81e2e6a29" containerName="registry-server" containerID="cri-o://336ba39a79ec4eca31100cd4b20170eb5b552605e4ef366738511c57fe564950" gracePeriod=2 Oct 09 13:52:52 crc kubenswrapper[4902]: I1009 13:52:52.006321 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p72x4" event={"ID":"583d6403-9f97-4d51-9a43-f0d2fedf80f2","Type":"ContainerDied","Data":"7288abee2650e0b02facf3e853632ac1f713159a86eb3f2726d66deafea24336"} Oct 09 13:52:52 crc kubenswrapper[4902]: I1009 13:52:52.221962 4902 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-p72x4" Oct 09 13:52:52 crc kubenswrapper[4902]: I1009 13:52:52.308445 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/583d6403-9f97-4d51-9a43-f0d2fedf80f2-utilities\") pod \"583d6403-9f97-4d51-9a43-f0d2fedf80f2\" (UID: \"583d6403-9f97-4d51-9a43-f0d2fedf80f2\") " Oct 09 13:52:52 crc kubenswrapper[4902]: I1009 13:52:52.308540 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/583d6403-9f97-4d51-9a43-f0d2fedf80f2-catalog-content\") pod \"583d6403-9f97-4d51-9a43-f0d2fedf80f2\" (UID: \"583d6403-9f97-4d51-9a43-f0d2fedf80f2\") " Oct 09 13:52:52 crc kubenswrapper[4902]: I1009 13:52:52.308584 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xrmm2\" (UniqueName: \"kubernetes.io/projected/583d6403-9f97-4d51-9a43-f0d2fedf80f2-kube-api-access-xrmm2\") pod \"583d6403-9f97-4d51-9a43-f0d2fedf80f2\" (UID: \"583d6403-9f97-4d51-9a43-f0d2fedf80f2\") " Oct 09 13:52:52 crc kubenswrapper[4902]: I1009 13:52:52.309370 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/583d6403-9f97-4d51-9a43-f0d2fedf80f2-utilities" (OuterVolumeSpecName: "utilities") pod "583d6403-9f97-4d51-9a43-f0d2fedf80f2" (UID: "583d6403-9f97-4d51-9a43-f0d2fedf80f2"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 13:52:52 crc kubenswrapper[4902]: I1009 13:52:52.314015 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/583d6403-9f97-4d51-9a43-f0d2fedf80f2-kube-api-access-xrmm2" (OuterVolumeSpecName: "kube-api-access-xrmm2") pod "583d6403-9f97-4d51-9a43-f0d2fedf80f2" (UID: "583d6403-9f97-4d51-9a43-f0d2fedf80f2"). InnerVolumeSpecName "kube-api-access-xrmm2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 13:52:52 crc kubenswrapper[4902]: I1009 13:52:52.321885 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/583d6403-9f97-4d51-9a43-f0d2fedf80f2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "583d6403-9f97-4d51-9a43-f0d2fedf80f2" (UID: "583d6403-9f97-4d51-9a43-f0d2fedf80f2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 13:52:52 crc kubenswrapper[4902]: I1009 13:52:52.348608 4902 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-p6zbr" Oct 09 13:52:52 crc kubenswrapper[4902]: I1009 13:52:52.409705 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/07049d7d-6aec-4446-bca4-51a81e2e6a29-utilities\") pod \"07049d7d-6aec-4446-bca4-51a81e2e6a29\" (UID: \"07049d7d-6aec-4446-bca4-51a81e2e6a29\") " Oct 09 13:52:52 crc kubenswrapper[4902]: I1009 13:52:52.409829 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5fmh4\" (UniqueName: \"kubernetes.io/projected/07049d7d-6aec-4446-bca4-51a81e2e6a29-kube-api-access-5fmh4\") pod \"07049d7d-6aec-4446-bca4-51a81e2e6a29\" (UID: \"07049d7d-6aec-4446-bca4-51a81e2e6a29\") " Oct 09 13:52:52 crc kubenswrapper[4902]: I1009 13:52:52.409900 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/07049d7d-6aec-4446-bca4-51a81e2e6a29-catalog-content\") pod \"07049d7d-6aec-4446-bca4-51a81e2e6a29\" (UID: \"07049d7d-6aec-4446-bca4-51a81e2e6a29\") " Oct 09 13:52:52 crc kubenswrapper[4902]: I1009 13:52:52.410191 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xrmm2\" (UniqueName: \"kubernetes.io/projected/583d6403-9f97-4d51-9a43-f0d2fedf80f2-kube-api-access-xrmm2\") on node \"crc\" DevicePath \"\"" Oct 09 13:52:52 crc kubenswrapper[4902]: I1009 13:52:52.410216 4902 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/583d6403-9f97-4d51-9a43-f0d2fedf80f2-utilities\") on node \"crc\" DevicePath \"\"" Oct 09 13:52:52 crc kubenswrapper[4902]: I1009 13:52:52.410227 4902 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/583d6403-9f97-4d51-9a43-f0d2fedf80f2-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 09 13:52:52 crc kubenswrapper[4902]: I1009 13:52:52.410860 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/07049d7d-6aec-4446-bca4-51a81e2e6a29-utilities" (OuterVolumeSpecName: "utilities") pod "07049d7d-6aec-4446-bca4-51a81e2e6a29" (UID: "07049d7d-6aec-4446-bca4-51a81e2e6a29"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 13:52:52 crc kubenswrapper[4902]: I1009 13:52:52.413627 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/07049d7d-6aec-4446-bca4-51a81e2e6a29-kube-api-access-5fmh4" (OuterVolumeSpecName: "kube-api-access-5fmh4") pod "07049d7d-6aec-4446-bca4-51a81e2e6a29" (UID: "07049d7d-6aec-4446-bca4-51a81e2e6a29"). InnerVolumeSpecName "kube-api-access-5fmh4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 13:52:52 crc kubenswrapper[4902]: I1009 13:52:52.461471 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/07049d7d-6aec-4446-bca4-51a81e2e6a29-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "07049d7d-6aec-4446-bca4-51a81e2e6a29" (UID: "07049d7d-6aec-4446-bca4-51a81e2e6a29"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 13:52:52 crc kubenswrapper[4902]: I1009 13:52:52.511949 4902 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/07049d7d-6aec-4446-bca4-51a81e2e6a29-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 09 13:52:52 crc kubenswrapper[4902]: I1009 13:52:52.511995 4902 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/07049d7d-6aec-4446-bca4-51a81e2e6a29-utilities\") on node \"crc\" DevicePath \"\"" Oct 09 13:52:52 crc kubenswrapper[4902]: I1009 13:52:52.512005 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5fmh4\" (UniqueName: \"kubernetes.io/projected/07049d7d-6aec-4446-bca4-51a81e2e6a29-kube-api-access-5fmh4\") on node \"crc\" DevicePath \"\"" Oct 09 13:52:53 crc kubenswrapper[4902]: I1009 13:52:53.012709 4902 generic.go:334] "Generic (PLEG): container finished" podID="07049d7d-6aec-4446-bca4-51a81e2e6a29" containerID="336ba39a79ec4eca31100cd4b20170eb5b552605e4ef366738511c57fe564950" exitCode=0 Oct 09 13:52:53 crc kubenswrapper[4902]: I1009 13:52:53.012789 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p6zbr" event={"ID":"07049d7d-6aec-4446-bca4-51a81e2e6a29","Type":"ContainerDied","Data":"336ba39a79ec4eca31100cd4b20170eb5b552605e4ef366738511c57fe564950"} Oct 09 13:52:53 crc kubenswrapper[4902]: I1009 13:52:53.012822 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p6zbr" event={"ID":"07049d7d-6aec-4446-bca4-51a81e2e6a29","Type":"ContainerDied","Data":"0724c820c870ce65d0bba5984c5ece2bda35c25e407cfe81d32b988e0180ae97"} Oct 09 13:52:53 crc kubenswrapper[4902]: I1009 13:52:53.012845 4902 scope.go:117] "RemoveContainer" containerID="336ba39a79ec4eca31100cd4b20170eb5b552605e4ef366738511c57fe564950" Oct 09 13:52:53 crc kubenswrapper[4902]: I1009 13:52:53.012992 4902 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-p6zbr" Oct 09 13:52:53 crc kubenswrapper[4902]: I1009 13:52:53.016109 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p72x4" event={"ID":"583d6403-9f97-4d51-9a43-f0d2fedf80f2","Type":"ContainerDied","Data":"3c16c60cc3aa247ac868ab81c900bee02f27fa14f4e8781719b7045b41742306"} Oct 09 13:52:53 crc kubenswrapper[4902]: I1009 13:52:53.016228 4902 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-p72x4" Oct 09 13:52:53 crc kubenswrapper[4902]: I1009 13:52:53.030554 4902 scope.go:117] "RemoveContainer" containerID="b3ce1e0c07bbf7a24a53f9e5a10efaa9e6becc321b6ecbb24402365aa99b0e6d" Oct 09 13:52:53 crc kubenswrapper[4902]: I1009 13:52:53.051321 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-p72x4"] Oct 09 13:52:53 crc kubenswrapper[4902]: I1009 13:52:53.051946 4902 scope.go:117] "RemoveContainer" containerID="c2e07fe31bbc2d7abb8ac0a116aa63be815ee140a8b7962641bf69a6fbf70480" Oct 09 13:52:53 crc kubenswrapper[4902]: I1009 13:52:53.053622 4902 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-p72x4"] Oct 09 13:52:53 crc kubenswrapper[4902]: I1009 13:52:53.071590 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-p6zbr"] Oct 09 13:52:53 crc kubenswrapper[4902]: I1009 13:52:53.071731 4902 scope.go:117] "RemoveContainer" containerID="336ba39a79ec4eca31100cd4b20170eb5b552605e4ef366738511c57fe564950" Oct 09 13:52:53 crc kubenswrapper[4902]: E1009 13:52:53.072263 4902 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"336ba39a79ec4eca31100cd4b20170eb5b552605e4ef366738511c57fe564950\": container with ID starting with 336ba39a79ec4eca31100cd4b20170eb5b552605e4ef366738511c57fe564950 not found: ID does not exist" containerID="336ba39a79ec4eca31100cd4b20170eb5b552605e4ef366738511c57fe564950" Oct 09 13:52:53 crc kubenswrapper[4902]: I1009 13:52:53.072324 4902 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"336ba39a79ec4eca31100cd4b20170eb5b552605e4ef366738511c57fe564950"} err="failed to get container status \"336ba39a79ec4eca31100cd4b20170eb5b552605e4ef366738511c57fe564950\": rpc error: code = NotFound desc = could not find container \"336ba39a79ec4eca31100cd4b20170eb5b552605e4ef366738511c57fe564950\": container with ID starting with 336ba39a79ec4eca31100cd4b20170eb5b552605e4ef366738511c57fe564950 not found: ID does not exist" Oct 09 13:52:53 crc kubenswrapper[4902]: I1009 13:52:53.072355 4902 scope.go:117] "RemoveContainer" containerID="b3ce1e0c07bbf7a24a53f9e5a10efaa9e6becc321b6ecbb24402365aa99b0e6d" Oct 09 13:52:53 crc kubenswrapper[4902]: E1009 13:52:53.072768 4902 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b3ce1e0c07bbf7a24a53f9e5a10efaa9e6becc321b6ecbb24402365aa99b0e6d\": container with ID starting with b3ce1e0c07bbf7a24a53f9e5a10efaa9e6becc321b6ecbb24402365aa99b0e6d not found: ID does not exist" containerID="b3ce1e0c07bbf7a24a53f9e5a10efaa9e6becc321b6ecbb24402365aa99b0e6d" Oct 09 13:52:53 crc kubenswrapper[4902]: I1009 13:52:53.072838 4902 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b3ce1e0c07bbf7a24a53f9e5a10efaa9e6becc321b6ecbb24402365aa99b0e6d"} err="failed to get container status \"b3ce1e0c07bbf7a24a53f9e5a10efaa9e6becc321b6ecbb24402365aa99b0e6d\": rpc error: code = NotFound desc = could not find container \"b3ce1e0c07bbf7a24a53f9e5a10efaa9e6becc321b6ecbb24402365aa99b0e6d\": container with ID starting with b3ce1e0c07bbf7a24a53f9e5a10efaa9e6becc321b6ecbb24402365aa99b0e6d not found: ID does not exist" Oct 09 13:52:53 crc kubenswrapper[4902]: I1009 13:52:53.072871 4902 scope.go:117] "RemoveContainer" 
containerID="c2e07fe31bbc2d7abb8ac0a116aa63be815ee140a8b7962641bf69a6fbf70480" Oct 09 13:52:53 crc kubenswrapper[4902]: E1009 13:52:53.073290 4902 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c2e07fe31bbc2d7abb8ac0a116aa63be815ee140a8b7962641bf69a6fbf70480\": container with ID starting with c2e07fe31bbc2d7abb8ac0a116aa63be815ee140a8b7962641bf69a6fbf70480 not found: ID does not exist" containerID="c2e07fe31bbc2d7abb8ac0a116aa63be815ee140a8b7962641bf69a6fbf70480" Oct 09 13:52:53 crc kubenswrapper[4902]: I1009 13:52:53.073327 4902 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c2e07fe31bbc2d7abb8ac0a116aa63be815ee140a8b7962641bf69a6fbf70480"} err="failed to get container status \"c2e07fe31bbc2d7abb8ac0a116aa63be815ee140a8b7962641bf69a6fbf70480\": rpc error: code = NotFound desc = could not find container \"c2e07fe31bbc2d7abb8ac0a116aa63be815ee140a8b7962641bf69a6fbf70480\": container with ID starting with c2e07fe31bbc2d7abb8ac0a116aa63be815ee140a8b7962641bf69a6fbf70480 not found: ID does not exist" Oct 09 13:52:53 crc kubenswrapper[4902]: I1009 13:52:53.073347 4902 scope.go:117] "RemoveContainer" containerID="7288abee2650e0b02facf3e853632ac1f713159a86eb3f2726d66deafea24336" Oct 09 13:52:53 crc kubenswrapper[4902]: I1009 13:52:53.075158 4902 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-p6zbr"] Oct 09 13:52:53 crc kubenswrapper[4902]: I1009 13:52:53.092837 4902 scope.go:117] "RemoveContainer" containerID="a462a785a6d4b31a273e133ac761141a994cc879be8754fd6221a2df0ebddcef" Oct 09 13:52:53 crc kubenswrapper[4902]: I1009 13:52:53.104790 4902 scope.go:117] "RemoveContainer" containerID="27a0ec78f6db1f93dd5f6d029bab752ecdc679ecbdb37a0bf048d3751a5c6b76" Oct 09 13:52:53 crc kubenswrapper[4902]: I1009 13:52:53.538689 4902 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="07049d7d-6aec-4446-bca4-51a81e2e6a29" path="/var/lib/kubelet/pods/07049d7d-6aec-4446-bca4-51a81e2e6a29/volumes" Oct 09 13:52:53 crc kubenswrapper[4902]: I1009 13:52:53.539506 4902 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="583d6403-9f97-4d51-9a43-f0d2fedf80f2" path="/var/lib/kubelet/pods/583d6403-9f97-4d51-9a43-f0d2fedf80f2/volumes" Oct 09 13:52:54 crc kubenswrapper[4902]: I1009 13:52:54.843747 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-rsvbq"] Oct 09 13:52:54 crc kubenswrapper[4902]: I1009 13:52:54.844250 4902 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-rsvbq" podUID="779ce800-3df4-4e67-8096-c2097269e1ad" containerName="registry-server" containerID="cri-o://0591b7b7c43f46bfbd0de90d232782ba2d33e0e9c2d94e17e13658eb20221c00" gracePeriod=2 Oct 09 13:52:55 crc kubenswrapper[4902]: I1009 13:52:55.029142 4902 generic.go:334] "Generic (PLEG): container finished" podID="779ce800-3df4-4e67-8096-c2097269e1ad" containerID="0591b7b7c43f46bfbd0de90d232782ba2d33e0e9c2d94e17e13658eb20221c00" exitCode=0 Oct 09 13:52:55 crc kubenswrapper[4902]: I1009 13:52:55.029194 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rsvbq" event={"ID":"779ce800-3df4-4e67-8096-c2097269e1ad","Type":"ContainerDied","Data":"0591b7b7c43f46bfbd0de90d232782ba2d33e0e9c2d94e17e13658eb20221c00"} Oct 09 13:52:55 crc kubenswrapper[4902]: I1009 13:52:55.211024 4902 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rsvbq" Oct 09 13:52:55 crc kubenswrapper[4902]: I1009 13:52:55.358528 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/779ce800-3df4-4e67-8096-c2097269e1ad-utilities\") pod \"779ce800-3df4-4e67-8096-c2097269e1ad\" (UID: \"779ce800-3df4-4e67-8096-c2097269e1ad\") " Oct 09 13:52:55 crc kubenswrapper[4902]: I1009 13:52:55.358588 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r9f92\" (UniqueName: \"kubernetes.io/projected/779ce800-3df4-4e67-8096-c2097269e1ad-kube-api-access-r9f92\") pod \"779ce800-3df4-4e67-8096-c2097269e1ad\" (UID: \"779ce800-3df4-4e67-8096-c2097269e1ad\") " Oct 09 13:52:55 crc kubenswrapper[4902]: I1009 13:52:55.358658 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/779ce800-3df4-4e67-8096-c2097269e1ad-catalog-content\") pod \"779ce800-3df4-4e67-8096-c2097269e1ad\" (UID: \"779ce800-3df4-4e67-8096-c2097269e1ad\") " Oct 09 13:52:55 crc kubenswrapper[4902]: I1009 13:52:55.359525 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/779ce800-3df4-4e67-8096-c2097269e1ad-utilities" (OuterVolumeSpecName: "utilities") pod "779ce800-3df4-4e67-8096-c2097269e1ad" (UID: "779ce800-3df4-4e67-8096-c2097269e1ad"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 13:52:55 crc kubenswrapper[4902]: I1009 13:52:55.374398 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/779ce800-3df4-4e67-8096-c2097269e1ad-kube-api-access-r9f92" (OuterVolumeSpecName: "kube-api-access-r9f92") pod "779ce800-3df4-4e67-8096-c2097269e1ad" (UID: "779ce800-3df4-4e67-8096-c2097269e1ad"). InnerVolumeSpecName "kube-api-access-r9f92". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 13:52:55 crc kubenswrapper[4902]: I1009 13:52:55.437057 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/779ce800-3df4-4e67-8096-c2097269e1ad-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "779ce800-3df4-4e67-8096-c2097269e1ad" (UID: "779ce800-3df4-4e67-8096-c2097269e1ad"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 13:52:55 crc kubenswrapper[4902]: I1009 13:52:55.460129 4902 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/779ce800-3df4-4e67-8096-c2097269e1ad-utilities\") on node \"crc\" DevicePath \"\"" Oct 09 13:52:55 crc kubenswrapper[4902]: I1009 13:52:55.460186 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r9f92\" (UniqueName: \"kubernetes.io/projected/779ce800-3df4-4e67-8096-c2097269e1ad-kube-api-access-r9f92\") on node \"crc\" DevicePath \"\"" Oct 09 13:52:55 crc kubenswrapper[4902]: I1009 13:52:55.460205 4902 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/779ce800-3df4-4e67-8096-c2097269e1ad-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 09 13:52:56 crc kubenswrapper[4902]: I1009 13:52:56.035713 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rsvbq" event={"ID":"779ce800-3df4-4e67-8096-c2097269e1ad","Type":"ContainerDied","Data":"041c018e89fb371a7c7d66b77b3468a11d58f79c2e2058493d8d4c4e957395ec"} Oct 09 13:52:56 crc kubenswrapper[4902]: I1009 13:52:56.035815 4902 scope.go:117] "RemoveContainer" containerID="0591b7b7c43f46bfbd0de90d232782ba2d33e0e9c2d94e17e13658eb20221c00" Oct 09 13:52:56 crc kubenswrapper[4902]: I1009 13:52:56.035993 4902 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rsvbq" Oct 09 13:52:56 crc kubenswrapper[4902]: I1009 13:52:56.054249 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-rsvbq"] Oct 09 13:52:56 crc kubenswrapper[4902]: I1009 13:52:56.055165 4902 scope.go:117] "RemoveContainer" containerID="a9a2298c9c3178cc4c750a94b6c40932f39b9edae3080882662396ed14fa7b1b" Oct 09 13:52:56 crc kubenswrapper[4902]: I1009 13:52:56.056771 4902 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-rsvbq"] Oct 09 13:52:56 crc kubenswrapper[4902]: I1009 13:52:56.071040 4902 scope.go:117] "RemoveContainer" containerID="b8c1036bcfc4ca3b0ac71c4e92628e443c9084a4232ec90228ab8fa2ee9b478f" Oct 09 13:52:57 crc kubenswrapper[4902]: I1009 13:52:57.520401 4902 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="779ce800-3df4-4e67-8096-c2097269e1ad" path="/var/lib/kubelet/pods/779ce800-3df4-4e67-8096-c2097269e1ad/volumes" Oct 09 13:53:15 crc kubenswrapper[4902]: I1009 13:53:15.852318 4902 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-jw7l8" podUID="28116576-5069-4dd6-90f1-31582eda88df" containerName="oauth-openshift" containerID="cri-o://3646583eb5c8362dab8b66fe78c8d41ebe941b346172d7b693e9cbdddaf9e908" gracePeriod=15 Oct 09 13:53:16 crc kubenswrapper[4902]: I1009 13:53:16.163997 4902 generic.go:334] "Generic (PLEG): container finished" podID="28116576-5069-4dd6-90f1-31582eda88df" containerID="3646583eb5c8362dab8b66fe78c8d41ebe941b346172d7b693e9cbdddaf9e908" exitCode=0 Oct 09 13:53:16 crc kubenswrapper[4902]: I1009 13:53:16.164039 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-jw7l8" event={"ID":"28116576-5069-4dd6-90f1-31582eda88df","Type":"ContainerDied","Data":"3646583eb5c8362dab8b66fe78c8d41ebe941b346172d7b693e9cbdddaf9e908"} Oct 09 13:53:16 crc kubenswrapper[4902]: I1009 13:53:16.215751 4902 util.go:48] 
"No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-jw7l8" Oct 09 13:53:16 crc kubenswrapper[4902]: I1009 13:53:16.253472 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-7dc5844c99-wml4w"] Oct 09 13:53:16 crc kubenswrapper[4902]: E1009 13:53:16.253754 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07049d7d-6aec-4446-bca4-51a81e2e6a29" containerName="extract-content" Oct 09 13:53:16 crc kubenswrapper[4902]: I1009 13:53:16.253770 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="07049d7d-6aec-4446-bca4-51a81e2e6a29" containerName="extract-content" Oct 09 13:53:16 crc kubenswrapper[4902]: E1009 13:53:16.253787 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5f26005-88e7-4f48-917e-5b2696925564" containerName="extract-utilities" Oct 09 13:53:16 crc kubenswrapper[4902]: I1009 13:53:16.253796 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5f26005-88e7-4f48-917e-5b2696925564" containerName="extract-utilities" Oct 09 13:53:16 crc kubenswrapper[4902]: E1009 13:53:16.253808 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="583d6403-9f97-4d51-9a43-f0d2fedf80f2" containerName="extract-content" Oct 09 13:53:16 crc kubenswrapper[4902]: I1009 13:53:16.253819 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="583d6403-9f97-4d51-9a43-f0d2fedf80f2" containerName="extract-content" Oct 09 13:53:16 crc kubenswrapper[4902]: E1009 13:53:16.253829 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="583d6403-9f97-4d51-9a43-f0d2fedf80f2" containerName="extract-utilities" Oct 09 13:53:16 crc kubenswrapper[4902]: I1009 13:53:16.253837 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="583d6403-9f97-4d51-9a43-f0d2fedf80f2" containerName="extract-utilities" Oct 09 13:53:16 crc kubenswrapper[4902]: E1009 13:53:16.253846 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07049d7d-6aec-4446-bca4-51a81e2e6a29" containerName="extract-utilities" Oct 09 13:53:16 crc kubenswrapper[4902]: I1009 13:53:16.253854 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="07049d7d-6aec-4446-bca4-51a81e2e6a29" containerName="extract-utilities" Oct 09 13:53:16 crc kubenswrapper[4902]: E1009 13:53:16.253867 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5f26005-88e7-4f48-917e-5b2696925564" containerName="registry-server" Oct 09 13:53:16 crc kubenswrapper[4902]: I1009 13:53:16.253874 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5f26005-88e7-4f48-917e-5b2696925564" containerName="registry-server" Oct 09 13:53:16 crc kubenswrapper[4902]: E1009 13:53:16.253885 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a89302a-15e3-470f-b938-cf4c1e97a44e" containerName="pruner" Oct 09 13:53:16 crc kubenswrapper[4902]: I1009 13:53:16.253892 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a89302a-15e3-470f-b938-cf4c1e97a44e" containerName="pruner" Oct 09 13:53:16 crc kubenswrapper[4902]: E1009 13:53:16.253903 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="583d6403-9f97-4d51-9a43-f0d2fedf80f2" containerName="registry-server" Oct 09 13:53:16 crc kubenswrapper[4902]: I1009 13:53:16.253911 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="583d6403-9f97-4d51-9a43-f0d2fedf80f2" containerName="registry-server" Oct 09 13:53:16 crc kubenswrapper[4902]: E1009 13:53:16.253920 4902 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="779ce800-3df4-4e67-8096-c2097269e1ad" containerName="extract-content" Oct 09 13:53:16 crc kubenswrapper[4902]: I1009 13:53:16.253927 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="779ce800-3df4-4e67-8096-c2097269e1ad" containerName="extract-content" Oct 09 13:53:16 crc kubenswrapper[4902]: E1009 13:53:16.253939 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07049d7d-6aec-4446-bca4-51a81e2e6a29" containerName="registry-server" Oct 09 13:53:16 crc kubenswrapper[4902]: I1009 13:53:16.253946 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="07049d7d-6aec-4446-bca4-51a81e2e6a29" containerName="registry-server" Oct 09 13:53:16 crc kubenswrapper[4902]: E1009 13:53:16.253957 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28116576-5069-4dd6-90f1-31582eda88df" containerName="oauth-openshift" Oct 09 13:53:16 crc kubenswrapper[4902]: I1009 13:53:16.253966 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="28116576-5069-4dd6-90f1-31582eda88df" containerName="oauth-openshift" Oct 09 13:53:16 crc kubenswrapper[4902]: E1009 13:53:16.253976 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5f26005-88e7-4f48-917e-5b2696925564" containerName="extract-content" Oct 09 13:53:16 crc kubenswrapper[4902]: I1009 13:53:16.253984 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5f26005-88e7-4f48-917e-5b2696925564" containerName="extract-content" Oct 09 13:53:16 crc kubenswrapper[4902]: E1009 13:53:16.253991 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93785db6-c7f9-4d9e-9407-b3653a9aa360" containerName="kube-multus-additional-cni-plugins" Oct 09 13:53:16 crc kubenswrapper[4902]: I1009 13:53:16.253999 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="93785db6-c7f9-4d9e-9407-b3653a9aa360" containerName="kube-multus-additional-cni-plugins" Oct 09 13:53:16 crc kubenswrapper[4902]: E1009 13:53:16.254012 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="779ce800-3df4-4e67-8096-c2097269e1ad" containerName="extract-utilities" Oct 09 13:53:16 crc kubenswrapper[4902]: I1009 13:53:16.254023 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="779ce800-3df4-4e67-8096-c2097269e1ad" containerName="extract-utilities" Oct 09 13:53:16 crc kubenswrapper[4902]: E1009 13:53:16.254033 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="779ce800-3df4-4e67-8096-c2097269e1ad" containerName="registry-server" Oct 09 13:53:16 crc kubenswrapper[4902]: I1009 13:53:16.254042 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="779ce800-3df4-4e67-8096-c2097269e1ad" containerName="registry-server" Oct 09 13:53:16 crc kubenswrapper[4902]: I1009 13:53:16.254164 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a89302a-15e3-470f-b938-cf4c1e97a44e" containerName="pruner" Oct 09 13:53:16 crc kubenswrapper[4902]: I1009 13:53:16.254178 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="28116576-5069-4dd6-90f1-31582eda88df" containerName="oauth-openshift" Oct 09 13:53:16 crc kubenswrapper[4902]: I1009 13:53:16.254186 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="93785db6-c7f9-4d9e-9407-b3653a9aa360" containerName="kube-multus-additional-cni-plugins" Oct 09 13:53:16 crc kubenswrapper[4902]: I1009 13:53:16.254196 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5f26005-88e7-4f48-917e-5b2696925564" 
containerName="registry-server" Oct 09 13:53:16 crc kubenswrapper[4902]: I1009 13:53:16.254208 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="07049d7d-6aec-4446-bca4-51a81e2e6a29" containerName="registry-server" Oct 09 13:53:16 crc kubenswrapper[4902]: I1009 13:53:16.254219 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="583d6403-9f97-4d51-9a43-f0d2fedf80f2" containerName="registry-server" Oct 09 13:53:16 crc kubenswrapper[4902]: I1009 13:53:16.254231 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="779ce800-3df4-4e67-8096-c2097269e1ad" containerName="registry-server" Oct 09 13:53:16 crc kubenswrapper[4902]: I1009 13:53:16.254717 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-7dc5844c99-wml4w" Oct 09 13:53:16 crc kubenswrapper[4902]: I1009 13:53:16.262817 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/28116576-5069-4dd6-90f1-31582eda88df-v4-0-config-system-trusted-ca-bundle\") pod \"28116576-5069-4dd6-90f1-31582eda88df\" (UID: \"28116576-5069-4dd6-90f1-31582eda88df\") " Oct 09 13:53:16 crc kubenswrapper[4902]: I1009 13:53:16.262872 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/28116576-5069-4dd6-90f1-31582eda88df-v4-0-config-system-session\") pod \"28116576-5069-4dd6-90f1-31582eda88df\" (UID: \"28116576-5069-4dd6-90f1-31582eda88df\") " Oct 09 13:53:16 crc kubenswrapper[4902]: I1009 13:53:16.262938 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/28116576-5069-4dd6-90f1-31582eda88df-v4-0-config-user-idp-0-file-data\") pod \"28116576-5069-4dd6-90f1-31582eda88df\" (UID: \"28116576-5069-4dd6-90f1-31582eda88df\") " Oct 09 13:53:16 crc kubenswrapper[4902]: I1009 13:53:16.262966 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8nwq5\" (UniqueName: \"kubernetes.io/projected/28116576-5069-4dd6-90f1-31582eda88df-kube-api-access-8nwq5\") pod \"28116576-5069-4dd6-90f1-31582eda88df\" (UID: \"28116576-5069-4dd6-90f1-31582eda88df\") " Oct 09 13:53:16 crc kubenswrapper[4902]: I1009 13:53:16.263012 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/28116576-5069-4dd6-90f1-31582eda88df-v4-0-config-user-template-provider-selection\") pod \"28116576-5069-4dd6-90f1-31582eda88df\" (UID: \"28116576-5069-4dd6-90f1-31582eda88df\") " Oct 09 13:53:16 crc kubenswrapper[4902]: I1009 13:53:16.263035 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/28116576-5069-4dd6-90f1-31582eda88df-audit-policies\") pod \"28116576-5069-4dd6-90f1-31582eda88df\" (UID: \"28116576-5069-4dd6-90f1-31582eda88df\") " Oct 09 13:53:16 crc kubenswrapper[4902]: I1009 13:53:16.263055 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/28116576-5069-4dd6-90f1-31582eda88df-v4-0-config-system-serving-cert\") pod \"28116576-5069-4dd6-90f1-31582eda88df\" (UID: \"28116576-5069-4dd6-90f1-31582eda88df\") " Oct 09 13:53:16 crc 
kubenswrapper[4902]: I1009 13:53:16.263083 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/28116576-5069-4dd6-90f1-31582eda88df-v4-0-config-user-template-error\") pod \"28116576-5069-4dd6-90f1-31582eda88df\" (UID: \"28116576-5069-4dd6-90f1-31582eda88df\") " Oct 09 13:53:16 crc kubenswrapper[4902]: I1009 13:53:16.263111 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/28116576-5069-4dd6-90f1-31582eda88df-v4-0-config-system-ocp-branding-template\") pod \"28116576-5069-4dd6-90f1-31582eda88df\" (UID: \"28116576-5069-4dd6-90f1-31582eda88df\") " Oct 09 13:53:16 crc kubenswrapper[4902]: I1009 13:53:16.263135 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/28116576-5069-4dd6-90f1-31582eda88df-v4-0-config-user-template-login\") pod \"28116576-5069-4dd6-90f1-31582eda88df\" (UID: \"28116576-5069-4dd6-90f1-31582eda88df\") " Oct 09 13:53:16 crc kubenswrapper[4902]: I1009 13:53:16.263162 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/28116576-5069-4dd6-90f1-31582eda88df-v4-0-config-system-router-certs\") pod \"28116576-5069-4dd6-90f1-31582eda88df\" (UID: \"28116576-5069-4dd6-90f1-31582eda88df\") " Oct 09 13:53:16 crc kubenswrapper[4902]: I1009 13:53:16.263208 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/28116576-5069-4dd6-90f1-31582eda88df-v4-0-config-system-cliconfig\") pod \"28116576-5069-4dd6-90f1-31582eda88df\" (UID: \"28116576-5069-4dd6-90f1-31582eda88df\") " Oct 09 13:53:16 crc kubenswrapper[4902]: I1009 13:53:16.263232 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/28116576-5069-4dd6-90f1-31582eda88df-v4-0-config-system-service-ca\") pod \"28116576-5069-4dd6-90f1-31582eda88df\" (UID: \"28116576-5069-4dd6-90f1-31582eda88df\") " Oct 09 13:53:16 crc kubenswrapper[4902]: I1009 13:53:16.263259 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/28116576-5069-4dd6-90f1-31582eda88df-audit-dir\") pod \"28116576-5069-4dd6-90f1-31582eda88df\" (UID: \"28116576-5069-4dd6-90f1-31582eda88df\") " Oct 09 13:53:16 crc kubenswrapper[4902]: I1009 13:53:16.263709 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/28116576-5069-4dd6-90f1-31582eda88df-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "28116576-5069-4dd6-90f1-31582eda88df" (UID: "28116576-5069-4dd6-90f1-31582eda88df"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 09 13:53:16 crc kubenswrapper[4902]: I1009 13:53:16.264849 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/28116576-5069-4dd6-90f1-31582eda88df-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "28116576-5069-4dd6-90f1-31582eda88df" (UID: "28116576-5069-4dd6-90f1-31582eda88df"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 13:53:16 crc kubenswrapper[4902]: I1009 13:53:16.267598 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/28116576-5069-4dd6-90f1-31582eda88df-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "28116576-5069-4dd6-90f1-31582eda88df" (UID: "28116576-5069-4dd6-90f1-31582eda88df"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 13:53:16 crc kubenswrapper[4902]: I1009 13:53:16.267652 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/28116576-5069-4dd6-90f1-31582eda88df-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "28116576-5069-4dd6-90f1-31582eda88df" (UID: "28116576-5069-4dd6-90f1-31582eda88df"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 13:53:16 crc kubenswrapper[4902]: I1009 13:53:16.267692 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/28116576-5069-4dd6-90f1-31582eda88df-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "28116576-5069-4dd6-90f1-31582eda88df" (UID: "28116576-5069-4dd6-90f1-31582eda88df"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 13:53:16 crc kubenswrapper[4902]: I1009 13:53:16.269569 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28116576-5069-4dd6-90f1-31582eda88df-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "28116576-5069-4dd6-90f1-31582eda88df" (UID: "28116576-5069-4dd6-90f1-31582eda88df"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 13:53:16 crc kubenswrapper[4902]: I1009 13:53:16.269857 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28116576-5069-4dd6-90f1-31582eda88df-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "28116576-5069-4dd6-90f1-31582eda88df" (UID: "28116576-5069-4dd6-90f1-31582eda88df"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 13:53:16 crc kubenswrapper[4902]: I1009 13:53:16.270378 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28116576-5069-4dd6-90f1-31582eda88df-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "28116576-5069-4dd6-90f1-31582eda88df" (UID: "28116576-5069-4dd6-90f1-31582eda88df"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 13:53:16 crc kubenswrapper[4902]: I1009 13:53:16.270618 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28116576-5069-4dd6-90f1-31582eda88df-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "28116576-5069-4dd6-90f1-31582eda88df" (UID: "28116576-5069-4dd6-90f1-31582eda88df"). InnerVolumeSpecName "v4-0-config-system-session". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 13:53:16 crc kubenswrapper[4902]: I1009 13:53:16.271055 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28116576-5069-4dd6-90f1-31582eda88df-kube-api-access-8nwq5" (OuterVolumeSpecName: "kube-api-access-8nwq5") pod "28116576-5069-4dd6-90f1-31582eda88df" (UID: "28116576-5069-4dd6-90f1-31582eda88df"). InnerVolumeSpecName "kube-api-access-8nwq5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 13:53:16 crc kubenswrapper[4902]: I1009 13:53:16.271577 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28116576-5069-4dd6-90f1-31582eda88df-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "28116576-5069-4dd6-90f1-31582eda88df" (UID: "28116576-5069-4dd6-90f1-31582eda88df"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 13:53:16 crc kubenswrapper[4902]: I1009 13:53:16.271628 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28116576-5069-4dd6-90f1-31582eda88df-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "28116576-5069-4dd6-90f1-31582eda88df" (UID: "28116576-5069-4dd6-90f1-31582eda88df"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 13:53:16 crc kubenswrapper[4902]: I1009 13:53:16.271906 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28116576-5069-4dd6-90f1-31582eda88df-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "28116576-5069-4dd6-90f1-31582eda88df" (UID: "28116576-5069-4dd6-90f1-31582eda88df"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 13:53:16 crc kubenswrapper[4902]: I1009 13:53:16.280918 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-7dc5844c99-wml4w"] Oct 09 13:53:16 crc kubenswrapper[4902]: I1009 13:53:16.291083 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28116576-5069-4dd6-90f1-31582eda88df-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "28116576-5069-4dd6-90f1-31582eda88df" (UID: "28116576-5069-4dd6-90f1-31582eda88df"). InnerVolumeSpecName "v4-0-config-user-template-error". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 13:53:16 crc kubenswrapper[4902]: I1009 13:53:16.364770 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/d4999580-91e2-4614-8bf3-517808b42413-audit-policies\") pod \"oauth-openshift-7dc5844c99-wml4w\" (UID: \"d4999580-91e2-4614-8bf3-517808b42413\") " pod="openshift-authentication/oauth-openshift-7dc5844c99-wml4w" Oct 09 13:53:16 crc kubenswrapper[4902]: I1009 13:53:16.364817 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/d4999580-91e2-4614-8bf3-517808b42413-v4-0-config-system-service-ca\") pod \"oauth-openshift-7dc5844c99-wml4w\" (UID: \"d4999580-91e2-4614-8bf3-517808b42413\") " pod="openshift-authentication/oauth-openshift-7dc5844c99-wml4w" Oct 09 13:53:16 crc kubenswrapper[4902]: I1009 13:53:16.364850 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/d4999580-91e2-4614-8bf3-517808b42413-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-7dc5844c99-wml4w\" (UID: \"d4999580-91e2-4614-8bf3-517808b42413\") " pod="openshift-authentication/oauth-openshift-7dc5844c99-wml4w" Oct 09 13:53:16 crc kubenswrapper[4902]: I1009 13:53:16.364910 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/d4999580-91e2-4614-8bf3-517808b42413-v4-0-config-user-template-login\") pod \"oauth-openshift-7dc5844c99-wml4w\" (UID: \"d4999580-91e2-4614-8bf3-517808b42413\") " pod="openshift-authentication/oauth-openshift-7dc5844c99-wml4w" Oct 09 13:53:16 crc kubenswrapper[4902]: I1009 13:53:16.364945 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/d4999580-91e2-4614-8bf3-517808b42413-v4-0-config-system-serving-cert\") pod \"oauth-openshift-7dc5844c99-wml4w\" (UID: \"d4999580-91e2-4614-8bf3-517808b42413\") " pod="openshift-authentication/oauth-openshift-7dc5844c99-wml4w" Oct 09 13:53:16 crc kubenswrapper[4902]: I1009 13:53:16.364967 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/d4999580-91e2-4614-8bf3-517808b42413-v4-0-config-system-cliconfig\") pod \"oauth-openshift-7dc5844c99-wml4w\" (UID: \"d4999580-91e2-4614-8bf3-517808b42413\") " pod="openshift-authentication/oauth-openshift-7dc5844c99-wml4w" Oct 09 13:53:16 crc kubenswrapper[4902]: I1009 13:53:16.364994 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d4999580-91e2-4614-8bf3-517808b42413-audit-dir\") pod \"oauth-openshift-7dc5844c99-wml4w\" (UID: \"d4999580-91e2-4614-8bf3-517808b42413\") " pod="openshift-authentication/oauth-openshift-7dc5844c99-wml4w" Oct 09 13:53:16 crc kubenswrapper[4902]: I1009 13:53:16.365021 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/d4999580-91e2-4614-8bf3-517808b42413-v4-0-config-system-router-certs\") pod 
\"oauth-openshift-7dc5844c99-wml4w\" (UID: \"d4999580-91e2-4614-8bf3-517808b42413\") " pod="openshift-authentication/oauth-openshift-7dc5844c99-wml4w" Oct 09 13:53:16 crc kubenswrapper[4902]: I1009 13:53:16.365052 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/d4999580-91e2-4614-8bf3-517808b42413-v4-0-config-system-session\") pod \"oauth-openshift-7dc5844c99-wml4w\" (UID: \"d4999580-91e2-4614-8bf3-517808b42413\") " pod="openshift-authentication/oauth-openshift-7dc5844c99-wml4w" Oct 09 13:53:16 crc kubenswrapper[4902]: I1009 13:53:16.365080 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/d4999580-91e2-4614-8bf3-517808b42413-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-7dc5844c99-wml4w\" (UID: \"d4999580-91e2-4614-8bf3-517808b42413\") " pod="openshift-authentication/oauth-openshift-7dc5844c99-wml4w" Oct 09 13:53:16 crc kubenswrapper[4902]: I1009 13:53:16.365142 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/d4999580-91e2-4614-8bf3-517808b42413-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-7dc5844c99-wml4w\" (UID: \"d4999580-91e2-4614-8bf3-517808b42413\") " pod="openshift-authentication/oauth-openshift-7dc5844c99-wml4w" Oct 09 13:53:16 crc kubenswrapper[4902]: I1009 13:53:16.365200 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/d4999580-91e2-4614-8bf3-517808b42413-v4-0-config-user-template-error\") pod \"oauth-openshift-7dc5844c99-wml4w\" (UID: \"d4999580-91e2-4614-8bf3-517808b42413\") " pod="openshift-authentication/oauth-openshift-7dc5844c99-wml4w" Oct 09 13:53:16 crc kubenswrapper[4902]: I1009 13:53:16.365231 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d4999580-91e2-4614-8bf3-517808b42413-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-7dc5844c99-wml4w\" (UID: \"d4999580-91e2-4614-8bf3-517808b42413\") " pod="openshift-authentication/oauth-openshift-7dc5844c99-wml4w" Oct 09 13:53:16 crc kubenswrapper[4902]: I1009 13:53:16.365255 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zqxcx\" (UniqueName: \"kubernetes.io/projected/d4999580-91e2-4614-8bf3-517808b42413-kube-api-access-zqxcx\") pod \"oauth-openshift-7dc5844c99-wml4w\" (UID: \"d4999580-91e2-4614-8bf3-517808b42413\") " pod="openshift-authentication/oauth-openshift-7dc5844c99-wml4w" Oct 09 13:53:16 crc kubenswrapper[4902]: I1009 13:53:16.365299 4902 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/28116576-5069-4dd6-90f1-31582eda88df-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Oct 09 13:53:16 crc kubenswrapper[4902]: I1009 13:53:16.365311 4902 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/28116576-5069-4dd6-90f1-31582eda88df-audit-policies\") on node \"crc\" DevicePath \"\"" Oct 09 
13:53:16 crc kubenswrapper[4902]: I1009 13:53:16.365321 4902 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/28116576-5069-4dd6-90f1-31582eda88df-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 09 13:53:16 crc kubenswrapper[4902]: I1009 13:53:16.365332 4902 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/28116576-5069-4dd6-90f1-31582eda88df-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Oct 09 13:53:16 crc kubenswrapper[4902]: I1009 13:53:16.365341 4902 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/28116576-5069-4dd6-90f1-31582eda88df-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Oct 09 13:53:16 crc kubenswrapper[4902]: I1009 13:53:16.365351 4902 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/28116576-5069-4dd6-90f1-31582eda88df-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Oct 09 13:53:16 crc kubenswrapper[4902]: I1009 13:53:16.365361 4902 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/28116576-5069-4dd6-90f1-31582eda88df-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Oct 09 13:53:16 crc kubenswrapper[4902]: I1009 13:53:16.365372 4902 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/28116576-5069-4dd6-90f1-31582eda88df-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Oct 09 13:53:16 crc kubenswrapper[4902]: I1009 13:53:16.365381 4902 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/28116576-5069-4dd6-90f1-31582eda88df-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Oct 09 13:53:16 crc kubenswrapper[4902]: I1009 13:53:16.365391 4902 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/28116576-5069-4dd6-90f1-31582eda88df-audit-dir\") on node \"crc\" DevicePath \"\"" Oct 09 13:53:16 crc kubenswrapper[4902]: I1009 13:53:16.365401 4902 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/28116576-5069-4dd6-90f1-31582eda88df-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 13:53:16 crc kubenswrapper[4902]: I1009 13:53:16.365445 4902 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/28116576-5069-4dd6-90f1-31582eda88df-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Oct 09 13:53:16 crc kubenswrapper[4902]: I1009 13:53:16.365455 4902 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/28116576-5069-4dd6-90f1-31582eda88df-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Oct 09 13:53:16 crc kubenswrapper[4902]: I1009 13:53:16.365464 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8nwq5\" (UniqueName: \"kubernetes.io/projected/28116576-5069-4dd6-90f1-31582eda88df-kube-api-access-8nwq5\") on node \"crc\" DevicePath \"\"" Oct 09 
13:53:16 crc kubenswrapper[4902]: I1009 13:53:16.467015 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/d4999580-91e2-4614-8bf3-517808b42413-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-7dc5844c99-wml4w\" (UID: \"d4999580-91e2-4614-8bf3-517808b42413\") " pod="openshift-authentication/oauth-openshift-7dc5844c99-wml4w" Oct 09 13:53:16 crc kubenswrapper[4902]: I1009 13:53:16.467091 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/d4999580-91e2-4614-8bf3-517808b42413-v4-0-config-user-template-error\") pod \"oauth-openshift-7dc5844c99-wml4w\" (UID: \"d4999580-91e2-4614-8bf3-517808b42413\") " pod="openshift-authentication/oauth-openshift-7dc5844c99-wml4w" Oct 09 13:53:16 crc kubenswrapper[4902]: I1009 13:53:16.467113 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d4999580-91e2-4614-8bf3-517808b42413-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-7dc5844c99-wml4w\" (UID: \"d4999580-91e2-4614-8bf3-517808b42413\") " pod="openshift-authentication/oauth-openshift-7dc5844c99-wml4w" Oct 09 13:53:16 crc kubenswrapper[4902]: I1009 13:53:16.467146 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zqxcx\" (UniqueName: \"kubernetes.io/projected/d4999580-91e2-4614-8bf3-517808b42413-kube-api-access-zqxcx\") pod \"oauth-openshift-7dc5844c99-wml4w\" (UID: \"d4999580-91e2-4614-8bf3-517808b42413\") " pod="openshift-authentication/oauth-openshift-7dc5844c99-wml4w" Oct 09 13:53:16 crc kubenswrapper[4902]: I1009 13:53:16.467167 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/d4999580-91e2-4614-8bf3-517808b42413-audit-policies\") pod \"oauth-openshift-7dc5844c99-wml4w\" (UID: \"d4999580-91e2-4614-8bf3-517808b42413\") " pod="openshift-authentication/oauth-openshift-7dc5844c99-wml4w" Oct 09 13:53:16 crc kubenswrapper[4902]: I1009 13:53:16.467187 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/d4999580-91e2-4614-8bf3-517808b42413-v4-0-config-system-service-ca\") pod \"oauth-openshift-7dc5844c99-wml4w\" (UID: \"d4999580-91e2-4614-8bf3-517808b42413\") " pod="openshift-authentication/oauth-openshift-7dc5844c99-wml4w" Oct 09 13:53:16 crc kubenswrapper[4902]: I1009 13:53:16.467231 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/d4999580-91e2-4614-8bf3-517808b42413-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-7dc5844c99-wml4w\" (UID: \"d4999580-91e2-4614-8bf3-517808b42413\") " pod="openshift-authentication/oauth-openshift-7dc5844c99-wml4w" Oct 09 13:53:16 crc kubenswrapper[4902]: I1009 13:53:16.467263 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/d4999580-91e2-4614-8bf3-517808b42413-v4-0-config-user-template-login\") pod \"oauth-openshift-7dc5844c99-wml4w\" (UID: \"d4999580-91e2-4614-8bf3-517808b42413\") " pod="openshift-authentication/oauth-openshift-7dc5844c99-wml4w" Oct 09 13:53:16 crc 
kubenswrapper[4902]: I1009 13:53:16.467300 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/d4999580-91e2-4614-8bf3-517808b42413-v4-0-config-system-serving-cert\") pod \"oauth-openshift-7dc5844c99-wml4w\" (UID: \"d4999580-91e2-4614-8bf3-517808b42413\") " pod="openshift-authentication/oauth-openshift-7dc5844c99-wml4w" Oct 09 13:53:16 crc kubenswrapper[4902]: I1009 13:53:16.467338 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/d4999580-91e2-4614-8bf3-517808b42413-v4-0-config-system-cliconfig\") pod \"oauth-openshift-7dc5844c99-wml4w\" (UID: \"d4999580-91e2-4614-8bf3-517808b42413\") " pod="openshift-authentication/oauth-openshift-7dc5844c99-wml4w" Oct 09 13:53:16 crc kubenswrapper[4902]: I1009 13:53:16.467358 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d4999580-91e2-4614-8bf3-517808b42413-audit-dir\") pod \"oauth-openshift-7dc5844c99-wml4w\" (UID: \"d4999580-91e2-4614-8bf3-517808b42413\") " pod="openshift-authentication/oauth-openshift-7dc5844c99-wml4w" Oct 09 13:53:16 crc kubenswrapper[4902]: I1009 13:53:16.467382 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/d4999580-91e2-4614-8bf3-517808b42413-v4-0-config-system-router-certs\") pod \"oauth-openshift-7dc5844c99-wml4w\" (UID: \"d4999580-91e2-4614-8bf3-517808b42413\") " pod="openshift-authentication/oauth-openshift-7dc5844c99-wml4w" Oct 09 13:53:16 crc kubenswrapper[4902]: I1009 13:53:16.467402 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/d4999580-91e2-4614-8bf3-517808b42413-v4-0-config-system-session\") pod \"oauth-openshift-7dc5844c99-wml4w\" (UID: \"d4999580-91e2-4614-8bf3-517808b42413\") " pod="openshift-authentication/oauth-openshift-7dc5844c99-wml4w" Oct 09 13:53:16 crc kubenswrapper[4902]: I1009 13:53:16.467441 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/d4999580-91e2-4614-8bf3-517808b42413-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-7dc5844c99-wml4w\" (UID: \"d4999580-91e2-4614-8bf3-517808b42413\") " pod="openshift-authentication/oauth-openshift-7dc5844c99-wml4w" Oct 09 13:53:16 crc kubenswrapper[4902]: I1009 13:53:16.467726 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d4999580-91e2-4614-8bf3-517808b42413-audit-dir\") pod \"oauth-openshift-7dc5844c99-wml4w\" (UID: \"d4999580-91e2-4614-8bf3-517808b42413\") " pod="openshift-authentication/oauth-openshift-7dc5844c99-wml4w" Oct 09 13:53:16 crc kubenswrapper[4902]: I1009 13:53:16.468806 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/d4999580-91e2-4614-8bf3-517808b42413-v4-0-config-system-service-ca\") pod \"oauth-openshift-7dc5844c99-wml4w\" (UID: \"d4999580-91e2-4614-8bf3-517808b42413\") " pod="openshift-authentication/oauth-openshift-7dc5844c99-wml4w" Oct 09 13:53:16 crc kubenswrapper[4902]: I1009 13:53:16.468858 4902 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/d4999580-91e2-4614-8bf3-517808b42413-v4-0-config-system-cliconfig\") pod \"oauth-openshift-7dc5844c99-wml4w\" (UID: \"d4999580-91e2-4614-8bf3-517808b42413\") " pod="openshift-authentication/oauth-openshift-7dc5844c99-wml4w" Oct 09 13:53:16 crc kubenswrapper[4902]: I1009 13:53:16.468909 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d4999580-91e2-4614-8bf3-517808b42413-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-7dc5844c99-wml4w\" (UID: \"d4999580-91e2-4614-8bf3-517808b42413\") " pod="openshift-authentication/oauth-openshift-7dc5844c99-wml4w" Oct 09 13:53:16 crc kubenswrapper[4902]: I1009 13:53:16.470104 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/d4999580-91e2-4614-8bf3-517808b42413-audit-policies\") pod \"oauth-openshift-7dc5844c99-wml4w\" (UID: \"d4999580-91e2-4614-8bf3-517808b42413\") " pod="openshift-authentication/oauth-openshift-7dc5844c99-wml4w" Oct 09 13:53:16 crc kubenswrapper[4902]: I1009 13:53:16.471527 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/d4999580-91e2-4614-8bf3-517808b42413-v4-0-config-user-template-error\") pod \"oauth-openshift-7dc5844c99-wml4w\" (UID: \"d4999580-91e2-4614-8bf3-517808b42413\") " pod="openshift-authentication/oauth-openshift-7dc5844c99-wml4w" Oct 09 13:53:16 crc kubenswrapper[4902]: I1009 13:53:16.471620 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/d4999580-91e2-4614-8bf3-517808b42413-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-7dc5844c99-wml4w\" (UID: \"d4999580-91e2-4614-8bf3-517808b42413\") " pod="openshift-authentication/oauth-openshift-7dc5844c99-wml4w" Oct 09 13:53:16 crc kubenswrapper[4902]: I1009 13:53:16.471826 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/d4999580-91e2-4614-8bf3-517808b42413-v4-0-config-system-serving-cert\") pod \"oauth-openshift-7dc5844c99-wml4w\" (UID: \"d4999580-91e2-4614-8bf3-517808b42413\") " pod="openshift-authentication/oauth-openshift-7dc5844c99-wml4w" Oct 09 13:53:16 crc kubenswrapper[4902]: I1009 13:53:16.472389 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/d4999580-91e2-4614-8bf3-517808b42413-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-7dc5844c99-wml4w\" (UID: \"d4999580-91e2-4614-8bf3-517808b42413\") " pod="openshift-authentication/oauth-openshift-7dc5844c99-wml4w" Oct 09 13:53:16 crc kubenswrapper[4902]: I1009 13:53:16.472389 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/d4999580-91e2-4614-8bf3-517808b42413-v4-0-config-system-router-certs\") pod \"oauth-openshift-7dc5844c99-wml4w\" (UID: \"d4999580-91e2-4614-8bf3-517808b42413\") " pod="openshift-authentication/oauth-openshift-7dc5844c99-wml4w" Oct 09 13:53:16 crc kubenswrapper[4902]: I1009 13:53:16.472734 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: 
\"kubernetes.io/secret/d4999580-91e2-4614-8bf3-517808b42413-v4-0-config-system-session\") pod \"oauth-openshift-7dc5844c99-wml4w\" (UID: \"d4999580-91e2-4614-8bf3-517808b42413\") " pod="openshift-authentication/oauth-openshift-7dc5844c99-wml4w" Oct 09 13:53:16 crc kubenswrapper[4902]: I1009 13:53:16.473854 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/d4999580-91e2-4614-8bf3-517808b42413-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-7dc5844c99-wml4w\" (UID: \"d4999580-91e2-4614-8bf3-517808b42413\") " pod="openshift-authentication/oauth-openshift-7dc5844c99-wml4w" Oct 09 13:53:16 crc kubenswrapper[4902]: I1009 13:53:16.475234 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/d4999580-91e2-4614-8bf3-517808b42413-v4-0-config-user-template-login\") pod \"oauth-openshift-7dc5844c99-wml4w\" (UID: \"d4999580-91e2-4614-8bf3-517808b42413\") " pod="openshift-authentication/oauth-openshift-7dc5844c99-wml4w" Oct 09 13:53:16 crc kubenswrapper[4902]: I1009 13:53:16.490462 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zqxcx\" (UniqueName: \"kubernetes.io/projected/d4999580-91e2-4614-8bf3-517808b42413-kube-api-access-zqxcx\") pod \"oauth-openshift-7dc5844c99-wml4w\" (UID: \"d4999580-91e2-4614-8bf3-517808b42413\") " pod="openshift-authentication/oauth-openshift-7dc5844c99-wml4w" Oct 09 13:53:16 crc kubenswrapper[4902]: I1009 13:53:16.607527 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-7dc5844c99-wml4w" Oct 09 13:53:16 crc kubenswrapper[4902]: I1009 13:53:16.810882 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-7dc5844c99-wml4w"] Oct 09 13:53:17 crc kubenswrapper[4902]: I1009 13:53:17.170197 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-7dc5844c99-wml4w" event={"ID":"d4999580-91e2-4614-8bf3-517808b42413","Type":"ContainerStarted","Data":"a0cc81f267343469856bc6cde57ec804e39c4b05b22c0075777cedf8c24ff7ea"} Oct 09 13:53:17 crc kubenswrapper[4902]: I1009 13:53:17.170525 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-7dc5844c99-wml4w" event={"ID":"d4999580-91e2-4614-8bf3-517808b42413","Type":"ContainerStarted","Data":"70f25e8dc76edff5abcbddbab0a2f16546639313ab31cc2d38fceee575ee2ba7"} Oct 09 13:53:17 crc kubenswrapper[4902]: I1009 13:53:17.171078 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-7dc5844c99-wml4w" Oct 09 13:53:17 crc kubenswrapper[4902]: I1009 13:53:17.172454 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-jw7l8" event={"ID":"28116576-5069-4dd6-90f1-31582eda88df","Type":"ContainerDied","Data":"9ec4731140976201e54c610e1fb018ac9a2ee063757bc2c754114f0afa68d10b"} Oct 09 13:53:17 crc kubenswrapper[4902]: I1009 13:53:17.172492 4902 scope.go:117] "RemoveContainer" containerID="3646583eb5c8362dab8b66fe78c8d41ebe941b346172d7b693e9cbdddaf9e908" Oct 09 13:53:17 crc kubenswrapper[4902]: I1009 13:53:17.172520 4902 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-jw7l8" Oct 09 13:53:17 crc kubenswrapper[4902]: I1009 13:53:17.204149 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-7dc5844c99-wml4w" podStartSLOduration=27.204125977 podStartE2EDuration="27.204125977s" podCreationTimestamp="2025-10-09 13:52:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 13:53:17.199631674 +0000 UTC m=+144.397490758" watchObservedRunningTime="2025-10-09 13:53:17.204125977 +0000 UTC m=+144.401985041" Oct 09 13:53:17 crc kubenswrapper[4902]: I1009 13:53:17.220666 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-jw7l8"] Oct 09 13:53:17 crc kubenswrapper[4902]: I1009 13:53:17.223182 4902 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-jw7l8"] Oct 09 13:53:17 crc kubenswrapper[4902]: I1009 13:53:17.520780 4902 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="28116576-5069-4dd6-90f1-31582eda88df" path="/var/lib/kubelet/pods/28116576-5069-4dd6-90f1-31582eda88df/volumes" Oct 09 13:53:17 crc kubenswrapper[4902]: I1009 13:53:17.668641 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-7dc5844c99-wml4w" Oct 09 13:53:20 crc kubenswrapper[4902]: I1009 13:53:20.077908 4902 patch_prober.go:28] interesting pod/machine-config-daemon-gbt7s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 13:53:20 crc kubenswrapper[4902]: I1009 13:53:20.078363 4902 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" podUID="6cfbac91-e798-4e5e-9f3c-f454ea6f457e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 13:53:32 crc kubenswrapper[4902]: I1009 13:53:32.719393 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-56j52"] Oct 09 13:53:32 crc kubenswrapper[4902]: I1009 13:53:32.720346 4902 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-56j52" podUID="3038206a-a63d-4cde-9d0e-9549cfb95ad7" containerName="registry-server" containerID="cri-o://e3a4a4633be1c4640b8111acd8147d1ca4dc75ed2ff6e7c9ecbec5f56fc1c48f" gracePeriod=30 Oct 09 13:53:32 crc kubenswrapper[4902]: I1009 13:53:32.732356 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wlbhl"] Oct 09 13:53:32 crc kubenswrapper[4902]: I1009 13:53:32.732706 4902 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-wlbhl" podUID="cc1a8762-c90b-4ff8-a836-47ca3d3ec932" containerName="registry-server" containerID="cri-o://e4e63a739f8ab775d2f779931cfc1b9e0b7d07c0004ba8779df13229c1fa7809" gracePeriod=30 Oct 09 13:53:32 crc kubenswrapper[4902]: I1009 13:53:32.749645 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-qvwhz"] Oct 09 13:53:32 crc kubenswrapper[4902]: 
I1009 13:53:32.751622 4902 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-qvwhz" podUID="a2933475-7af8-41e3-9389-114c1969b030" containerName="marketplace-operator" containerID="cri-o://19ea7ce8a01477caf6d461c02d24b32e1db0481bec2b7ce8efe32b8a833166a1" gracePeriod=30 Oct 09 13:53:32 crc kubenswrapper[4902]: I1009 13:53:32.755501 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-p6dtr"] Oct 09 13:53:32 crc kubenswrapper[4902]: I1009 13:53:32.755871 4902 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-p6dtr" podUID="2a1edf05-690e-463e-8086-e4ba20653475" containerName="registry-server" containerID="cri-o://434c1a1ea034644b9c54242fb1255c4ad4405a084670625a801b2acf377d1676" gracePeriod=30 Oct 09 13:53:32 crc kubenswrapper[4902]: I1009 13:53:32.767719 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-bb67d"] Oct 09 13:53:32 crc kubenswrapper[4902]: I1009 13:53:32.768839 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-bb67d" Oct 09 13:53:32 crc kubenswrapper[4902]: I1009 13:53:32.777200 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-8r47d"] Oct 09 13:53:32 crc kubenswrapper[4902]: I1009 13:53:32.777543 4902 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-8r47d" podUID="45688fd0-799f-477e-ae28-ef494a1abdc5" containerName="registry-server" containerID="cri-o://f9fbbf38d27d0ac740520065c7546a4608bb4822b3895f01847b6503d86504c0" gracePeriod=30 Oct 09 13:53:32 crc kubenswrapper[4902]: I1009 13:53:32.792735 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-bb67d"] Oct 09 13:53:32 crc kubenswrapper[4902]: I1009 13:53:32.948355 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3b0cc6d4-31a9-4f2f-90ab-2cb6676e61b6-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-bb67d\" (UID: \"3b0cc6d4-31a9-4f2f-90ab-2cb6676e61b6\") " pod="openshift-marketplace/marketplace-operator-79b997595-bb67d" Oct 09 13:53:32 crc kubenswrapper[4902]: I1009 13:53:32.948497 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rh8hp\" (UniqueName: \"kubernetes.io/projected/3b0cc6d4-31a9-4f2f-90ab-2cb6676e61b6-kube-api-access-rh8hp\") pod \"marketplace-operator-79b997595-bb67d\" (UID: \"3b0cc6d4-31a9-4f2f-90ab-2cb6676e61b6\") " pod="openshift-marketplace/marketplace-operator-79b997595-bb67d" Oct 09 13:53:32 crc kubenswrapper[4902]: I1009 13:53:32.948545 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/3b0cc6d4-31a9-4f2f-90ab-2cb6676e61b6-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-bb67d\" (UID: \"3b0cc6d4-31a9-4f2f-90ab-2cb6676e61b6\") " pod="openshift-marketplace/marketplace-operator-79b997595-bb67d" Oct 09 13:53:33 crc kubenswrapper[4902]: I1009 13:53:33.049902 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: 
\"kubernetes.io/secret/3b0cc6d4-31a9-4f2f-90ab-2cb6676e61b6-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-bb67d\" (UID: \"3b0cc6d4-31a9-4f2f-90ab-2cb6676e61b6\") " pod="openshift-marketplace/marketplace-operator-79b997595-bb67d" Oct 09 13:53:33 crc kubenswrapper[4902]: I1009 13:53:33.050310 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3b0cc6d4-31a9-4f2f-90ab-2cb6676e61b6-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-bb67d\" (UID: \"3b0cc6d4-31a9-4f2f-90ab-2cb6676e61b6\") " pod="openshift-marketplace/marketplace-operator-79b997595-bb67d" Oct 09 13:53:33 crc kubenswrapper[4902]: I1009 13:53:33.050358 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rh8hp\" (UniqueName: \"kubernetes.io/projected/3b0cc6d4-31a9-4f2f-90ab-2cb6676e61b6-kube-api-access-rh8hp\") pod \"marketplace-operator-79b997595-bb67d\" (UID: \"3b0cc6d4-31a9-4f2f-90ab-2cb6676e61b6\") " pod="openshift-marketplace/marketplace-operator-79b997595-bb67d" Oct 09 13:53:33 crc kubenswrapper[4902]: I1009 13:53:33.052287 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3b0cc6d4-31a9-4f2f-90ab-2cb6676e61b6-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-bb67d\" (UID: \"3b0cc6d4-31a9-4f2f-90ab-2cb6676e61b6\") " pod="openshift-marketplace/marketplace-operator-79b997595-bb67d" Oct 09 13:53:33 crc kubenswrapper[4902]: I1009 13:53:33.056782 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/3b0cc6d4-31a9-4f2f-90ab-2cb6676e61b6-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-bb67d\" (UID: \"3b0cc6d4-31a9-4f2f-90ab-2cb6676e61b6\") " pod="openshift-marketplace/marketplace-operator-79b997595-bb67d" Oct 09 13:53:33 crc kubenswrapper[4902]: I1009 13:53:33.065801 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rh8hp\" (UniqueName: \"kubernetes.io/projected/3b0cc6d4-31a9-4f2f-90ab-2cb6676e61b6-kube-api-access-rh8hp\") pod \"marketplace-operator-79b997595-bb67d\" (UID: \"3b0cc6d4-31a9-4f2f-90ab-2cb6676e61b6\") " pod="openshift-marketplace/marketplace-operator-79b997595-bb67d" Oct 09 13:53:33 crc kubenswrapper[4902]: I1009 13:53:33.091547 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-bb67d" Oct 09 13:53:33 crc kubenswrapper[4902]: I1009 13:53:33.249671 4902 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wlbhl" Oct 09 13:53:33 crc kubenswrapper[4902]: I1009 13:53:33.261030 4902 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-56j52" Oct 09 13:53:33 crc kubenswrapper[4902]: I1009 13:53:33.269169 4902 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-qvwhz" Oct 09 13:53:33 crc kubenswrapper[4902]: I1009 13:53:33.303613 4902 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-p6dtr" Oct 09 13:53:33 crc kubenswrapper[4902]: I1009 13:53:33.304236 4902 generic.go:334] "Generic (PLEG): container finished" podID="2a1edf05-690e-463e-8086-e4ba20653475" containerID="434c1a1ea034644b9c54242fb1255c4ad4405a084670625a801b2acf377d1676" exitCode=0 Oct 09 13:53:33 crc kubenswrapper[4902]: I1009 13:53:33.304295 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p6dtr" event={"ID":"2a1edf05-690e-463e-8086-e4ba20653475","Type":"ContainerDied","Data":"434c1a1ea034644b9c54242fb1255c4ad4405a084670625a801b2acf377d1676"} Oct 09 13:53:33 crc kubenswrapper[4902]: I1009 13:53:33.304327 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p6dtr" event={"ID":"2a1edf05-690e-463e-8086-e4ba20653475","Type":"ContainerDied","Data":"cd01274f326fb7c79cc596a79b498221ec833e99ea9720ae16a6ca3c558063e9"} Oct 09 13:53:33 crc kubenswrapper[4902]: I1009 13:53:33.304349 4902 scope.go:117] "RemoveContainer" containerID="434c1a1ea034644b9c54242fb1255c4ad4405a084670625a801b2acf377d1676" Oct 09 13:53:33 crc kubenswrapper[4902]: I1009 13:53:33.318995 4902 generic.go:334] "Generic (PLEG): container finished" podID="45688fd0-799f-477e-ae28-ef494a1abdc5" containerID="f9fbbf38d27d0ac740520065c7546a4608bb4822b3895f01847b6503d86504c0" exitCode=0 Oct 09 13:53:33 crc kubenswrapper[4902]: I1009 13:53:33.319086 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8r47d" event={"ID":"45688fd0-799f-477e-ae28-ef494a1abdc5","Type":"ContainerDied","Data":"f9fbbf38d27d0ac740520065c7546a4608bb4822b3895f01847b6503d86504c0"} Oct 09 13:53:33 crc kubenswrapper[4902]: I1009 13:53:33.319115 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8r47d" event={"ID":"45688fd0-799f-477e-ae28-ef494a1abdc5","Type":"ContainerDied","Data":"41b9ec62880db93380e6974c63a1467e461d40806c7772f644399ae8c659aad5"} Oct 09 13:53:33 crc kubenswrapper[4902]: I1009 13:53:33.319126 4902 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="41b9ec62880db93380e6974c63a1467e461d40806c7772f644399ae8c659aad5" Oct 09 13:53:33 crc kubenswrapper[4902]: I1009 13:53:33.325363 4902 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8r47d" Oct 09 13:53:33 crc kubenswrapper[4902]: I1009 13:53:33.325712 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-qvwhz" event={"ID":"a2933475-7af8-41e3-9389-114c1969b030","Type":"ContainerDied","Data":"19ea7ce8a01477caf6d461c02d24b32e1db0481bec2b7ce8efe32b8a833166a1"} Oct 09 13:53:33 crc kubenswrapper[4902]: I1009 13:53:33.325686 4902 generic.go:334] "Generic (PLEG): container finished" podID="a2933475-7af8-41e3-9389-114c1969b030" containerID="19ea7ce8a01477caf6d461c02d24b32e1db0481bec2b7ce8efe32b8a833166a1" exitCode=0 Oct 09 13:53:33 crc kubenswrapper[4902]: I1009 13:53:33.325652 4902 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-qvwhz" Oct 09 13:53:33 crc kubenswrapper[4902]: I1009 13:53:33.325822 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-qvwhz" event={"ID":"a2933475-7af8-41e3-9389-114c1969b030","Type":"ContainerDied","Data":"0df32c7dca5eb011f8316801dcc478f0821eca5b355b56704fa9b1a9960dcca4"} Oct 09 13:53:33 crc kubenswrapper[4902]: I1009 13:53:33.333956 4902 generic.go:334] "Generic (PLEG): container finished" podID="cc1a8762-c90b-4ff8-a836-47ca3d3ec932" containerID="e4e63a739f8ab775d2f779931cfc1b9e0b7d07c0004ba8779df13229c1fa7809" exitCode=0 Oct 09 13:53:33 crc kubenswrapper[4902]: I1009 13:53:33.334095 4902 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wlbhl" Oct 09 13:53:33 crc kubenswrapper[4902]: I1009 13:53:33.334462 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wlbhl" event={"ID":"cc1a8762-c90b-4ff8-a836-47ca3d3ec932","Type":"ContainerDied","Data":"e4e63a739f8ab775d2f779931cfc1b9e0b7d07c0004ba8779df13229c1fa7809"} Oct 09 13:53:33 crc kubenswrapper[4902]: I1009 13:53:33.334520 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wlbhl" event={"ID":"cc1a8762-c90b-4ff8-a836-47ca3d3ec932","Type":"ContainerDied","Data":"06958e871a66a24029bb6b806ac6af5c359c6052d0f06adad29dfafeca3931f3"} Oct 09 13:53:33 crc kubenswrapper[4902]: I1009 13:53:33.341893 4902 scope.go:117] "RemoveContainer" containerID="ce18f2474ec9cc9a5117f1f84f09a69747c92868e2c805bf26f636d6df93852f" Oct 09 13:53:33 crc kubenswrapper[4902]: I1009 13:53:33.344676 4902 generic.go:334] "Generic (PLEG): container finished" podID="3038206a-a63d-4cde-9d0e-9549cfb95ad7" containerID="e3a4a4633be1c4640b8111acd8147d1ca4dc75ed2ff6e7c9ecbec5f56fc1c48f" exitCode=0 Oct 09 13:53:33 crc kubenswrapper[4902]: I1009 13:53:33.344748 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-56j52" event={"ID":"3038206a-a63d-4cde-9d0e-9549cfb95ad7","Type":"ContainerDied","Data":"e3a4a4633be1c4640b8111acd8147d1ca4dc75ed2ff6e7c9ecbec5f56fc1c48f"} Oct 09 13:53:33 crc kubenswrapper[4902]: I1009 13:53:33.344829 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-56j52" event={"ID":"3038206a-a63d-4cde-9d0e-9549cfb95ad7","Type":"ContainerDied","Data":"182b69bccfdd00eeda9c4d2f64dfc1c87cbe8d1a649942ef1ecdab1e15ab88b4"} Oct 09 13:53:33 crc kubenswrapper[4902]: I1009 13:53:33.344951 4902 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-56j52" Oct 09 13:53:33 crc kubenswrapper[4902]: I1009 13:53:33.354901 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3038206a-a63d-4cde-9d0e-9549cfb95ad7-utilities\") pod \"3038206a-a63d-4cde-9d0e-9549cfb95ad7\" (UID: \"3038206a-a63d-4cde-9d0e-9549cfb95ad7\") " Oct 09 13:53:33 crc kubenswrapper[4902]: I1009 13:53:33.354954 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc1a8762-c90b-4ff8-a836-47ca3d3ec932-utilities\") pod \"cc1a8762-c90b-4ff8-a836-47ca3d3ec932\" (UID: \"cc1a8762-c90b-4ff8-a836-47ca3d3ec932\") " Oct 09 13:53:33 crc kubenswrapper[4902]: I1009 13:53:33.354986 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3038206a-a63d-4cde-9d0e-9549cfb95ad7-catalog-content\") pod \"3038206a-a63d-4cde-9d0e-9549cfb95ad7\" (UID: \"3038206a-a63d-4cde-9d0e-9549cfb95ad7\") " Oct 09 13:53:33 crc kubenswrapper[4902]: I1009 13:53:33.355063 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vbt82\" (UniqueName: \"kubernetes.io/projected/cc1a8762-c90b-4ff8-a836-47ca3d3ec932-kube-api-access-vbt82\") pod \"cc1a8762-c90b-4ff8-a836-47ca3d3ec932\" (UID: \"cc1a8762-c90b-4ff8-a836-47ca3d3ec932\") " Oct 09 13:53:33 crc kubenswrapper[4902]: I1009 13:53:33.355384 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xwf6v\" (UniqueName: \"kubernetes.io/projected/3038206a-a63d-4cde-9d0e-9549cfb95ad7-kube-api-access-xwf6v\") pod \"3038206a-a63d-4cde-9d0e-9549cfb95ad7\" (UID: \"3038206a-a63d-4cde-9d0e-9549cfb95ad7\") " Oct 09 13:53:33 crc kubenswrapper[4902]: I1009 13:53:33.356687 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cc1a8762-c90b-4ff8-a836-47ca3d3ec932-utilities" (OuterVolumeSpecName: "utilities") pod "cc1a8762-c90b-4ff8-a836-47ca3d3ec932" (UID: "cc1a8762-c90b-4ff8-a836-47ca3d3ec932"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 13:53:33 crc kubenswrapper[4902]: I1009 13:53:33.356829 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3038206a-a63d-4cde-9d0e-9549cfb95ad7-utilities" (OuterVolumeSpecName: "utilities") pod "3038206a-a63d-4cde-9d0e-9549cfb95ad7" (UID: "3038206a-a63d-4cde-9d0e-9549cfb95ad7"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 13:53:33 crc kubenswrapper[4902]: I1009 13:53:33.362520 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc1a8762-c90b-4ff8-a836-47ca3d3ec932-catalog-content\") pod \"cc1a8762-c90b-4ff8-a836-47ca3d3ec932\" (UID: \"cc1a8762-c90b-4ff8-a836-47ca3d3ec932\") " Oct 09 13:53:33 crc kubenswrapper[4902]: I1009 13:53:33.363061 4902 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3038206a-a63d-4cde-9d0e-9549cfb95ad7-utilities\") on node \"crc\" DevicePath \"\"" Oct 09 13:53:33 crc kubenswrapper[4902]: I1009 13:53:33.363088 4902 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc1a8762-c90b-4ff8-a836-47ca3d3ec932-utilities\") on node \"crc\" DevicePath \"\"" Oct 09 13:53:33 crc kubenswrapper[4902]: I1009 13:53:33.363271 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3038206a-a63d-4cde-9d0e-9549cfb95ad7-kube-api-access-xwf6v" (OuterVolumeSpecName: "kube-api-access-xwf6v") pod "3038206a-a63d-4cde-9d0e-9549cfb95ad7" (UID: "3038206a-a63d-4cde-9d0e-9549cfb95ad7"). InnerVolumeSpecName "kube-api-access-xwf6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 13:53:33 crc kubenswrapper[4902]: I1009 13:53:33.363314 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc1a8762-c90b-4ff8-a836-47ca3d3ec932-kube-api-access-vbt82" (OuterVolumeSpecName: "kube-api-access-vbt82") pod "cc1a8762-c90b-4ff8-a836-47ca3d3ec932" (UID: "cc1a8762-c90b-4ff8-a836-47ca3d3ec932"). InnerVolumeSpecName "kube-api-access-vbt82". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 13:53:33 crc kubenswrapper[4902]: I1009 13:53:33.370938 4902 scope.go:117] "RemoveContainer" containerID="e7f7ac8c90626093b32cf0478e3e46601110f49a78d5a1ff0c4bc5d59c5b62dd" Oct 09 13:53:33 crc kubenswrapper[4902]: I1009 13:53:33.385236 4902 scope.go:117] "RemoveContainer" containerID="434c1a1ea034644b9c54242fb1255c4ad4405a084670625a801b2acf377d1676" Oct 09 13:53:33 crc kubenswrapper[4902]: E1009 13:53:33.385729 4902 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"434c1a1ea034644b9c54242fb1255c4ad4405a084670625a801b2acf377d1676\": container with ID starting with 434c1a1ea034644b9c54242fb1255c4ad4405a084670625a801b2acf377d1676 not found: ID does not exist" containerID="434c1a1ea034644b9c54242fb1255c4ad4405a084670625a801b2acf377d1676" Oct 09 13:53:33 crc kubenswrapper[4902]: I1009 13:53:33.385768 4902 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"434c1a1ea034644b9c54242fb1255c4ad4405a084670625a801b2acf377d1676"} err="failed to get container status \"434c1a1ea034644b9c54242fb1255c4ad4405a084670625a801b2acf377d1676\": rpc error: code = NotFound desc = could not find container \"434c1a1ea034644b9c54242fb1255c4ad4405a084670625a801b2acf377d1676\": container with ID starting with 434c1a1ea034644b9c54242fb1255c4ad4405a084670625a801b2acf377d1676 not found: ID does not exist" Oct 09 13:53:33 crc kubenswrapper[4902]: I1009 13:53:33.385796 4902 scope.go:117] "RemoveContainer" containerID="ce18f2474ec9cc9a5117f1f84f09a69747c92868e2c805bf26f636d6df93852f" Oct 09 13:53:33 crc kubenswrapper[4902]: E1009 13:53:33.386191 4902 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ce18f2474ec9cc9a5117f1f84f09a69747c92868e2c805bf26f636d6df93852f\": container with ID starting with ce18f2474ec9cc9a5117f1f84f09a69747c92868e2c805bf26f636d6df93852f not found: ID does not exist" containerID="ce18f2474ec9cc9a5117f1f84f09a69747c92868e2c805bf26f636d6df93852f" Oct 09 13:53:33 crc kubenswrapper[4902]: I1009 13:53:33.386232 4902 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce18f2474ec9cc9a5117f1f84f09a69747c92868e2c805bf26f636d6df93852f"} err="failed to get container status \"ce18f2474ec9cc9a5117f1f84f09a69747c92868e2c805bf26f636d6df93852f\": rpc error: code = NotFound desc = could not find container \"ce18f2474ec9cc9a5117f1f84f09a69747c92868e2c805bf26f636d6df93852f\": container with ID starting with ce18f2474ec9cc9a5117f1f84f09a69747c92868e2c805bf26f636d6df93852f not found: ID does not exist" Oct 09 13:53:33 crc kubenswrapper[4902]: I1009 13:53:33.386273 4902 scope.go:117] "RemoveContainer" containerID="e7f7ac8c90626093b32cf0478e3e46601110f49a78d5a1ff0c4bc5d59c5b62dd" Oct 09 13:53:33 crc kubenswrapper[4902]: E1009 13:53:33.386903 4902 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e7f7ac8c90626093b32cf0478e3e46601110f49a78d5a1ff0c4bc5d59c5b62dd\": container with ID starting with e7f7ac8c90626093b32cf0478e3e46601110f49a78d5a1ff0c4bc5d59c5b62dd not found: ID does not exist" containerID="e7f7ac8c90626093b32cf0478e3e46601110f49a78d5a1ff0c4bc5d59c5b62dd" Oct 09 13:53:33 crc kubenswrapper[4902]: I1009 13:53:33.387008 4902 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"e7f7ac8c90626093b32cf0478e3e46601110f49a78d5a1ff0c4bc5d59c5b62dd"} err="failed to get container status \"e7f7ac8c90626093b32cf0478e3e46601110f49a78d5a1ff0c4bc5d59c5b62dd\": rpc error: code = NotFound desc = could not find container \"e7f7ac8c90626093b32cf0478e3e46601110f49a78d5a1ff0c4bc5d59c5b62dd\": container with ID starting with e7f7ac8c90626093b32cf0478e3e46601110f49a78d5a1ff0c4bc5d59c5b62dd not found: ID does not exist" Oct 09 13:53:33 crc kubenswrapper[4902]: I1009 13:53:33.387075 4902 scope.go:117] "RemoveContainer" containerID="19ea7ce8a01477caf6d461c02d24b32e1db0481bec2b7ce8efe32b8a833166a1" Oct 09 13:53:33 crc kubenswrapper[4902]: I1009 13:53:33.408089 4902 scope.go:117] "RemoveContainer" containerID="19ea7ce8a01477caf6d461c02d24b32e1db0481bec2b7ce8efe32b8a833166a1" Oct 09 13:53:33 crc kubenswrapper[4902]: E1009 13:53:33.408638 4902 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"19ea7ce8a01477caf6d461c02d24b32e1db0481bec2b7ce8efe32b8a833166a1\": container with ID starting with 19ea7ce8a01477caf6d461c02d24b32e1db0481bec2b7ce8efe32b8a833166a1 not found: ID does not exist" containerID="19ea7ce8a01477caf6d461c02d24b32e1db0481bec2b7ce8efe32b8a833166a1" Oct 09 13:53:33 crc kubenswrapper[4902]: I1009 13:53:33.408688 4902 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"19ea7ce8a01477caf6d461c02d24b32e1db0481bec2b7ce8efe32b8a833166a1"} err="failed to get container status \"19ea7ce8a01477caf6d461c02d24b32e1db0481bec2b7ce8efe32b8a833166a1\": rpc error: code = NotFound desc = could not find container \"19ea7ce8a01477caf6d461c02d24b32e1db0481bec2b7ce8efe32b8a833166a1\": container with ID starting with 19ea7ce8a01477caf6d461c02d24b32e1db0481bec2b7ce8efe32b8a833166a1 not found: ID does not exist" Oct 09 13:53:33 crc kubenswrapper[4902]: I1009 13:53:33.408720 4902 scope.go:117] "RemoveContainer" containerID="e4e63a739f8ab775d2f779931cfc1b9e0b7d07c0004ba8779df13229c1fa7809" Oct 09 13:53:33 crc kubenswrapper[4902]: I1009 13:53:33.427461 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3038206a-a63d-4cde-9d0e-9549cfb95ad7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3038206a-a63d-4cde-9d0e-9549cfb95ad7" (UID: "3038206a-a63d-4cde-9d0e-9549cfb95ad7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 13:53:33 crc kubenswrapper[4902]: I1009 13:53:33.428970 4902 scope.go:117] "RemoveContainer" containerID="bfcc06dc33ffb3e56b92a3ee24787960cc51a159657668b944f2309f5617962f" Oct 09 13:53:33 crc kubenswrapper[4902]: I1009 13:53:33.439013 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cc1a8762-c90b-4ff8-a836-47ca3d3ec932-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cc1a8762-c90b-4ff8-a836-47ca3d3ec932" (UID: "cc1a8762-c90b-4ff8-a836-47ca3d3ec932"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 13:53:33 crc kubenswrapper[4902]: I1009 13:53:33.446057 4902 scope.go:117] "RemoveContainer" containerID="9ce9706c136a6bf757e56cb351962ccd980cf39964dba51108431712d06dce42" Oct 09 13:53:33 crc kubenswrapper[4902]: I1009 13:53:33.458555 4902 scope.go:117] "RemoveContainer" containerID="e4e63a739f8ab775d2f779931cfc1b9e0b7d07c0004ba8779df13229c1fa7809" Oct 09 13:53:33 crc kubenswrapper[4902]: E1009 13:53:33.458915 4902 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e4e63a739f8ab775d2f779931cfc1b9e0b7d07c0004ba8779df13229c1fa7809\": container with ID starting with e4e63a739f8ab775d2f779931cfc1b9e0b7d07c0004ba8779df13229c1fa7809 not found: ID does not exist" containerID="e4e63a739f8ab775d2f779931cfc1b9e0b7d07c0004ba8779df13229c1fa7809" Oct 09 13:53:33 crc kubenswrapper[4902]: I1009 13:53:33.458946 4902 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4e63a739f8ab775d2f779931cfc1b9e0b7d07c0004ba8779df13229c1fa7809"} err="failed to get container status \"e4e63a739f8ab775d2f779931cfc1b9e0b7d07c0004ba8779df13229c1fa7809\": rpc error: code = NotFound desc = could not find container \"e4e63a739f8ab775d2f779931cfc1b9e0b7d07c0004ba8779df13229c1fa7809\": container with ID starting with e4e63a739f8ab775d2f779931cfc1b9e0b7d07c0004ba8779df13229c1fa7809 not found: ID does not exist" Oct 09 13:53:33 crc kubenswrapper[4902]: I1009 13:53:33.458980 4902 scope.go:117] "RemoveContainer" containerID="bfcc06dc33ffb3e56b92a3ee24787960cc51a159657668b944f2309f5617962f" Oct 09 13:53:33 crc kubenswrapper[4902]: E1009 13:53:33.459263 4902 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bfcc06dc33ffb3e56b92a3ee24787960cc51a159657668b944f2309f5617962f\": container with ID starting with bfcc06dc33ffb3e56b92a3ee24787960cc51a159657668b944f2309f5617962f not found: ID does not exist" containerID="bfcc06dc33ffb3e56b92a3ee24787960cc51a159657668b944f2309f5617962f" Oct 09 13:53:33 crc kubenswrapper[4902]: I1009 13:53:33.459297 4902 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bfcc06dc33ffb3e56b92a3ee24787960cc51a159657668b944f2309f5617962f"} err="failed to get container status \"bfcc06dc33ffb3e56b92a3ee24787960cc51a159657668b944f2309f5617962f\": rpc error: code = NotFound desc = could not find container \"bfcc06dc33ffb3e56b92a3ee24787960cc51a159657668b944f2309f5617962f\": container with ID starting with bfcc06dc33ffb3e56b92a3ee24787960cc51a159657668b944f2309f5617962f not found: ID does not exist" Oct 09 13:53:33 crc kubenswrapper[4902]: I1009 13:53:33.459321 4902 scope.go:117] "RemoveContainer" containerID="9ce9706c136a6bf757e56cb351962ccd980cf39964dba51108431712d06dce42" Oct 09 13:53:33 crc kubenswrapper[4902]: E1009 13:53:33.459787 4902 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9ce9706c136a6bf757e56cb351962ccd980cf39964dba51108431712d06dce42\": container with ID starting with 9ce9706c136a6bf757e56cb351962ccd980cf39964dba51108431712d06dce42 not found: ID does not exist" containerID="9ce9706c136a6bf757e56cb351962ccd980cf39964dba51108431712d06dce42" Oct 09 13:53:33 crc kubenswrapper[4902]: I1009 13:53:33.459873 4902 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"9ce9706c136a6bf757e56cb351962ccd980cf39964dba51108431712d06dce42"} err="failed to get container status \"9ce9706c136a6bf757e56cb351962ccd980cf39964dba51108431712d06dce42\": rpc error: code = NotFound desc = could not find container \"9ce9706c136a6bf757e56cb351962ccd980cf39964dba51108431712d06dce42\": container with ID starting with 9ce9706c136a6bf757e56cb351962ccd980cf39964dba51108431712d06dce42 not found: ID does not exist" Oct 09 13:53:33 crc kubenswrapper[4902]: I1009 13:53:33.459924 4902 scope.go:117] "RemoveContainer" containerID="e3a4a4633be1c4640b8111acd8147d1ca4dc75ed2ff6e7c9ecbec5f56fc1c48f" Oct 09 13:53:33 crc kubenswrapper[4902]: I1009 13:53:33.467874 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/45688fd0-799f-477e-ae28-ef494a1abdc5-utilities\") pod \"45688fd0-799f-477e-ae28-ef494a1abdc5\" (UID: \"45688fd0-799f-477e-ae28-ef494a1abdc5\") " Oct 09 13:53:33 crc kubenswrapper[4902]: I1009 13:53:33.467912 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a1edf05-690e-463e-8086-e4ba20653475-utilities\") pod \"2a1edf05-690e-463e-8086-e4ba20653475\" (UID: \"2a1edf05-690e-463e-8086-e4ba20653475\") " Oct 09 13:53:33 crc kubenswrapper[4902]: I1009 13:53:33.467982 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a2933475-7af8-41e3-9389-114c1969b030-marketplace-trusted-ca\") pod \"a2933475-7af8-41e3-9389-114c1969b030\" (UID: \"a2933475-7af8-41e3-9389-114c1969b030\") " Oct 09 13:53:33 crc kubenswrapper[4902]: I1009 13:53:33.468030 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2wrdt\" (UniqueName: \"kubernetes.io/projected/2a1edf05-690e-463e-8086-e4ba20653475-kube-api-access-2wrdt\") pod \"2a1edf05-690e-463e-8086-e4ba20653475\" (UID: \"2a1edf05-690e-463e-8086-e4ba20653475\") " Oct 09 13:53:33 crc kubenswrapper[4902]: I1009 13:53:33.468069 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/a2933475-7af8-41e3-9389-114c1969b030-marketplace-operator-metrics\") pod \"a2933475-7af8-41e3-9389-114c1969b030\" (UID: \"a2933475-7af8-41e3-9389-114c1969b030\") " Oct 09 13:53:33 crc kubenswrapper[4902]: I1009 13:53:33.468098 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-krqkn\" (UniqueName: \"kubernetes.io/projected/45688fd0-799f-477e-ae28-ef494a1abdc5-kube-api-access-krqkn\") pod \"45688fd0-799f-477e-ae28-ef494a1abdc5\" (UID: \"45688fd0-799f-477e-ae28-ef494a1abdc5\") " Oct 09 13:53:33 crc kubenswrapper[4902]: I1009 13:53:33.468129 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a1edf05-690e-463e-8086-e4ba20653475-catalog-content\") pod \"2a1edf05-690e-463e-8086-e4ba20653475\" (UID: \"2a1edf05-690e-463e-8086-e4ba20653475\") " Oct 09 13:53:33 crc kubenswrapper[4902]: I1009 13:53:33.468166 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/45688fd0-799f-477e-ae28-ef494a1abdc5-catalog-content\") pod \"45688fd0-799f-477e-ae28-ef494a1abdc5\" (UID: \"45688fd0-799f-477e-ae28-ef494a1abdc5\") " Oct 09 13:53:33 crc 
kubenswrapper[4902]: I1009 13:53:33.468189 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hcbtb\" (UniqueName: \"kubernetes.io/projected/a2933475-7af8-41e3-9389-114c1969b030-kube-api-access-hcbtb\") pod \"a2933475-7af8-41e3-9389-114c1969b030\" (UID: \"a2933475-7af8-41e3-9389-114c1969b030\") " Oct 09 13:53:33 crc kubenswrapper[4902]: I1009 13:53:33.468453 4902 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc1a8762-c90b-4ff8-a836-47ca3d3ec932-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 09 13:53:33 crc kubenswrapper[4902]: I1009 13:53:33.468475 4902 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3038206a-a63d-4cde-9d0e-9549cfb95ad7-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 09 13:53:33 crc kubenswrapper[4902]: I1009 13:53:33.468488 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vbt82\" (UniqueName: \"kubernetes.io/projected/cc1a8762-c90b-4ff8-a836-47ca3d3ec932-kube-api-access-vbt82\") on node \"crc\" DevicePath \"\"" Oct 09 13:53:33 crc kubenswrapper[4902]: I1009 13:53:33.468502 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xwf6v\" (UniqueName: \"kubernetes.io/projected/3038206a-a63d-4cde-9d0e-9549cfb95ad7-kube-api-access-xwf6v\") on node \"crc\" DevicePath \"\"" Oct 09 13:53:33 crc kubenswrapper[4902]: I1009 13:53:33.468679 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a2933475-7af8-41e3-9389-114c1969b030-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "a2933475-7af8-41e3-9389-114c1969b030" (UID: "a2933475-7af8-41e3-9389-114c1969b030"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 13:53:33 crc kubenswrapper[4902]: I1009 13:53:33.468911 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2a1edf05-690e-463e-8086-e4ba20653475-utilities" (OuterVolumeSpecName: "utilities") pod "2a1edf05-690e-463e-8086-e4ba20653475" (UID: "2a1edf05-690e-463e-8086-e4ba20653475"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 13:53:33 crc kubenswrapper[4902]: I1009 13:53:33.468763 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/45688fd0-799f-477e-ae28-ef494a1abdc5-utilities" (OuterVolumeSpecName: "utilities") pod "45688fd0-799f-477e-ae28-ef494a1abdc5" (UID: "45688fd0-799f-477e-ae28-ef494a1abdc5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 13:53:33 crc kubenswrapper[4902]: I1009 13:53:33.472552 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45688fd0-799f-477e-ae28-ef494a1abdc5-kube-api-access-krqkn" (OuterVolumeSpecName: "kube-api-access-krqkn") pod "45688fd0-799f-477e-ae28-ef494a1abdc5" (UID: "45688fd0-799f-477e-ae28-ef494a1abdc5"). InnerVolumeSpecName "kube-api-access-krqkn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 13:53:33 crc kubenswrapper[4902]: I1009 13:53:33.472559 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a1edf05-690e-463e-8086-e4ba20653475-kube-api-access-2wrdt" (OuterVolumeSpecName: "kube-api-access-2wrdt") pod "2a1edf05-690e-463e-8086-e4ba20653475" (UID: "2a1edf05-690e-463e-8086-e4ba20653475"). InnerVolumeSpecName "kube-api-access-2wrdt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 13:53:33 crc kubenswrapper[4902]: I1009 13:53:33.472974 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a2933475-7af8-41e3-9389-114c1969b030-kube-api-access-hcbtb" (OuterVolumeSpecName: "kube-api-access-hcbtb") pod "a2933475-7af8-41e3-9389-114c1969b030" (UID: "a2933475-7af8-41e3-9389-114c1969b030"). InnerVolumeSpecName "kube-api-access-hcbtb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 13:53:33 crc kubenswrapper[4902]: I1009 13:53:33.473035 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2933475-7af8-41e3-9389-114c1969b030-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "a2933475-7af8-41e3-9389-114c1969b030" (UID: "a2933475-7af8-41e3-9389-114c1969b030"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 13:53:33 crc kubenswrapper[4902]: I1009 13:53:33.495180 4902 scope.go:117] "RemoveContainer" containerID="a4c911f894e25315a759ec91ed0eb0ab08b307b54154a2967952d35afa0fcbe9" Oct 09 13:53:33 crc kubenswrapper[4902]: I1009 13:53:33.498458 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2a1edf05-690e-463e-8086-e4ba20653475-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2a1edf05-690e-463e-8086-e4ba20653475" (UID: "2a1edf05-690e-463e-8086-e4ba20653475"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 13:53:33 crc kubenswrapper[4902]: I1009 13:53:33.509353 4902 scope.go:117] "RemoveContainer" containerID="fbaccf1b19c90e0dc532b1b7c3a552e732f86f1e8bc7f806517d9d4103c816b1" Oct 09 13:53:33 crc kubenswrapper[4902]: I1009 13:53:33.530576 4902 scope.go:117] "RemoveContainer" containerID="e3a4a4633be1c4640b8111acd8147d1ca4dc75ed2ff6e7c9ecbec5f56fc1c48f" Oct 09 13:53:33 crc kubenswrapper[4902]: E1009 13:53:33.531222 4902 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e3a4a4633be1c4640b8111acd8147d1ca4dc75ed2ff6e7c9ecbec5f56fc1c48f\": container with ID starting with e3a4a4633be1c4640b8111acd8147d1ca4dc75ed2ff6e7c9ecbec5f56fc1c48f not found: ID does not exist" containerID="e3a4a4633be1c4640b8111acd8147d1ca4dc75ed2ff6e7c9ecbec5f56fc1c48f" Oct 09 13:53:33 crc kubenswrapper[4902]: I1009 13:53:33.531282 4902 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e3a4a4633be1c4640b8111acd8147d1ca4dc75ed2ff6e7c9ecbec5f56fc1c48f"} err="failed to get container status \"e3a4a4633be1c4640b8111acd8147d1ca4dc75ed2ff6e7c9ecbec5f56fc1c48f\": rpc error: code = NotFound desc = could not find container \"e3a4a4633be1c4640b8111acd8147d1ca4dc75ed2ff6e7c9ecbec5f56fc1c48f\": container with ID starting with e3a4a4633be1c4640b8111acd8147d1ca4dc75ed2ff6e7c9ecbec5f56fc1c48f not found: ID does not exist" Oct 09 13:53:33 crc kubenswrapper[4902]: I1009 13:53:33.531318 4902 scope.go:117] "RemoveContainer" containerID="a4c911f894e25315a759ec91ed0eb0ab08b307b54154a2967952d35afa0fcbe9" Oct 09 13:53:33 crc kubenswrapper[4902]: E1009 13:53:33.531578 4902 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a4c911f894e25315a759ec91ed0eb0ab08b307b54154a2967952d35afa0fcbe9\": container with ID starting with a4c911f894e25315a759ec91ed0eb0ab08b307b54154a2967952d35afa0fcbe9 not found: ID does not exist" containerID="a4c911f894e25315a759ec91ed0eb0ab08b307b54154a2967952d35afa0fcbe9" Oct 09 13:53:33 crc kubenswrapper[4902]: I1009 13:53:33.531597 4902 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a4c911f894e25315a759ec91ed0eb0ab08b307b54154a2967952d35afa0fcbe9"} err="failed to get container status \"a4c911f894e25315a759ec91ed0eb0ab08b307b54154a2967952d35afa0fcbe9\": rpc error: code = NotFound desc = could not find container \"a4c911f894e25315a759ec91ed0eb0ab08b307b54154a2967952d35afa0fcbe9\": container with ID starting with a4c911f894e25315a759ec91ed0eb0ab08b307b54154a2967952d35afa0fcbe9 not found: ID does not exist" Oct 09 13:53:33 crc kubenswrapper[4902]: I1009 13:53:33.531610 4902 scope.go:117] "RemoveContainer" containerID="fbaccf1b19c90e0dc532b1b7c3a552e732f86f1e8bc7f806517d9d4103c816b1" Oct 09 13:53:33 crc kubenswrapper[4902]: E1009 13:53:33.532052 4902 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fbaccf1b19c90e0dc532b1b7c3a552e732f86f1e8bc7f806517d9d4103c816b1\": container with ID starting with fbaccf1b19c90e0dc532b1b7c3a552e732f86f1e8bc7f806517d9d4103c816b1 not found: ID does not exist" containerID="fbaccf1b19c90e0dc532b1b7c3a552e732f86f1e8bc7f806517d9d4103c816b1" Oct 09 13:53:33 crc kubenswrapper[4902]: I1009 13:53:33.532104 4902 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"fbaccf1b19c90e0dc532b1b7c3a552e732f86f1e8bc7f806517d9d4103c816b1"} err="failed to get container status \"fbaccf1b19c90e0dc532b1b7c3a552e732f86f1e8bc7f806517d9d4103c816b1\": rpc error: code = NotFound desc = could not find container \"fbaccf1b19c90e0dc532b1b7c3a552e732f86f1e8bc7f806517d9d4103c816b1\": container with ID starting with fbaccf1b19c90e0dc532b1b7c3a552e732f86f1e8bc7f806517d9d4103c816b1 not found: ID does not exist" Oct 09 13:53:33 crc kubenswrapper[4902]: I1009 13:53:33.546721 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-bb67d"] Oct 09 13:53:33 crc kubenswrapper[4902]: W1009 13:53:33.551843 4902 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b0cc6d4_31a9_4f2f_90ab_2cb6676e61b6.slice/crio-9b31856423ebf52302b963a761d0e22f5ebd73c0b28dc5ef797da2798efb9c06 WatchSource:0}: Error finding container 9b31856423ebf52302b963a761d0e22f5ebd73c0b28dc5ef797da2798efb9c06: Status 404 returned error can't find the container with id 9b31856423ebf52302b963a761d0e22f5ebd73c0b28dc5ef797da2798efb9c06 Oct 09 13:53:33 crc kubenswrapper[4902]: I1009 13:53:33.569959 4902 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/45688fd0-799f-477e-ae28-ef494a1abdc5-utilities\") on node \"crc\" DevicePath \"\"" Oct 09 13:53:33 crc kubenswrapper[4902]: I1009 13:53:33.570019 4902 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a1edf05-690e-463e-8086-e4ba20653475-utilities\") on node \"crc\" DevicePath \"\"" Oct 09 13:53:33 crc kubenswrapper[4902]: I1009 13:53:33.570035 4902 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a2933475-7af8-41e3-9389-114c1969b030-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 09 13:53:33 crc kubenswrapper[4902]: I1009 13:53:33.570049 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2wrdt\" (UniqueName: \"kubernetes.io/projected/2a1edf05-690e-463e-8086-e4ba20653475-kube-api-access-2wrdt\") on node \"crc\" DevicePath \"\"" Oct 09 13:53:33 crc kubenswrapper[4902]: I1009 13:53:33.570086 4902 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/a2933475-7af8-41e3-9389-114c1969b030-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Oct 09 13:53:33 crc kubenswrapper[4902]: I1009 13:53:33.570103 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-krqkn\" (UniqueName: \"kubernetes.io/projected/45688fd0-799f-477e-ae28-ef494a1abdc5-kube-api-access-krqkn\") on node \"crc\" DevicePath \"\"" Oct 09 13:53:33 crc kubenswrapper[4902]: I1009 13:53:33.570114 4902 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a1edf05-690e-463e-8086-e4ba20653475-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 09 13:53:33 crc kubenswrapper[4902]: I1009 13:53:33.570174 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hcbtb\" (UniqueName: \"kubernetes.io/projected/a2933475-7af8-41e3-9389-114c1969b030-kube-api-access-hcbtb\") on node \"crc\" DevicePath \"\"" Oct 09 13:53:33 crc kubenswrapper[4902]: I1009 13:53:33.573523 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/45688fd0-799f-477e-ae28-ef494a1abdc5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "45688fd0-799f-477e-ae28-ef494a1abdc5" (UID: "45688fd0-799f-477e-ae28-ef494a1abdc5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 13:53:33 crc kubenswrapper[4902]: I1009 13:53:33.658606 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-qvwhz"] Oct 09 13:53:33 crc kubenswrapper[4902]: I1009 13:53:33.662814 4902 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-qvwhz"] Oct 09 13:53:33 crc kubenswrapper[4902]: I1009 13:53:33.671931 4902 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/45688fd0-799f-477e-ae28-ef494a1abdc5-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 09 13:53:33 crc kubenswrapper[4902]: I1009 13:53:33.680056 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-56j52"] Oct 09 13:53:33 crc kubenswrapper[4902]: I1009 13:53:33.687489 4902 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-56j52"] Oct 09 13:53:33 crc kubenswrapper[4902]: I1009 13:53:33.695334 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wlbhl"] Oct 09 13:53:33 crc kubenswrapper[4902]: I1009 13:53:33.698519 4902 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-wlbhl"] Oct 09 13:53:34 crc kubenswrapper[4902]: I1009 13:53:34.352194 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-bb67d" event={"ID":"3b0cc6d4-31a9-4f2f-90ab-2cb6676e61b6","Type":"ContainerStarted","Data":"e21f286c0219964f6679ba5181da058651695292273f24ce35e8e712f273d10b"} Oct 09 13:53:34 crc kubenswrapper[4902]: I1009 13:53:34.352486 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-bb67d" Oct 09 13:53:34 crc kubenswrapper[4902]: I1009 13:53:34.352500 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-bb67d" event={"ID":"3b0cc6d4-31a9-4f2f-90ab-2cb6676e61b6","Type":"ContainerStarted","Data":"9b31856423ebf52302b963a761d0e22f5ebd73c0b28dc5ef797da2798efb9c06"} Oct 09 13:53:34 crc kubenswrapper[4902]: I1009 13:53:34.355265 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-bb67d" Oct 09 13:53:34 crc kubenswrapper[4902]: I1009 13:53:34.355466 4902 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-p6dtr" Oct 09 13:53:34 crc kubenswrapper[4902]: I1009 13:53:34.358402 4902 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-8r47d" Oct 09 13:53:34 crc kubenswrapper[4902]: I1009 13:53:34.372871 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-bb67d" podStartSLOduration=2.372839362 podStartE2EDuration="2.372839362s" podCreationTimestamp="2025-10-09 13:53:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 13:53:34.368129833 +0000 UTC m=+161.565988937" watchObservedRunningTime="2025-10-09 13:53:34.372839362 +0000 UTC m=+161.570698436" Oct 09 13:53:34 crc kubenswrapper[4902]: I1009 13:53:34.402544 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-p6dtr"] Oct 09 13:53:34 crc kubenswrapper[4902]: I1009 13:53:34.407248 4902 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-p6dtr"] Oct 09 13:53:34 crc kubenswrapper[4902]: I1009 13:53:34.415462 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-8r47d"] Oct 09 13:53:34 crc kubenswrapper[4902]: I1009 13:53:34.425313 4902 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-8r47d"] Oct 09 13:53:34 crc kubenswrapper[4902]: I1009 13:53:34.741196 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-hpxl7"] Oct 09 13:53:34 crc kubenswrapper[4902]: E1009 13:53:34.741389 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45688fd0-799f-477e-ae28-ef494a1abdc5" containerName="extract-utilities" Oct 09 13:53:34 crc kubenswrapper[4902]: I1009 13:53:34.741400 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="45688fd0-799f-477e-ae28-ef494a1abdc5" containerName="extract-utilities" Oct 09 13:53:34 crc kubenswrapper[4902]: E1009 13:53:34.741421 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3038206a-a63d-4cde-9d0e-9549cfb95ad7" containerName="extract-content" Oct 09 13:53:34 crc kubenswrapper[4902]: I1009 13:53:34.741427 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="3038206a-a63d-4cde-9d0e-9549cfb95ad7" containerName="extract-content" Oct 09 13:53:34 crc kubenswrapper[4902]: E1009 13:53:34.741435 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc1a8762-c90b-4ff8-a836-47ca3d3ec932" containerName="extract-content" Oct 09 13:53:34 crc kubenswrapper[4902]: I1009 13:53:34.741441 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc1a8762-c90b-4ff8-a836-47ca3d3ec932" containerName="extract-content" Oct 09 13:53:34 crc kubenswrapper[4902]: E1009 13:53:34.741451 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45688fd0-799f-477e-ae28-ef494a1abdc5" containerName="registry-server" Oct 09 13:53:34 crc kubenswrapper[4902]: I1009 13:53:34.741456 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="45688fd0-799f-477e-ae28-ef494a1abdc5" containerName="registry-server" Oct 09 13:53:34 crc kubenswrapper[4902]: E1009 13:53:34.741465 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3038206a-a63d-4cde-9d0e-9549cfb95ad7" containerName="extract-utilities" Oct 09 13:53:34 crc kubenswrapper[4902]: I1009 13:53:34.741470 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="3038206a-a63d-4cde-9d0e-9549cfb95ad7" containerName="extract-utilities" Oct 09 13:53:34 crc kubenswrapper[4902]: 
E1009 13:53:34.741480 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a1edf05-690e-463e-8086-e4ba20653475" containerName="extract-content" Oct 09 13:53:34 crc kubenswrapper[4902]: I1009 13:53:34.741485 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a1edf05-690e-463e-8086-e4ba20653475" containerName="extract-content" Oct 09 13:53:34 crc kubenswrapper[4902]: E1009 13:53:34.741495 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a1edf05-690e-463e-8086-e4ba20653475" containerName="registry-server" Oct 09 13:53:34 crc kubenswrapper[4902]: I1009 13:53:34.741501 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a1edf05-690e-463e-8086-e4ba20653475" containerName="registry-server" Oct 09 13:53:34 crc kubenswrapper[4902]: E1009 13:53:34.741509 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a1edf05-690e-463e-8086-e4ba20653475" containerName="extract-utilities" Oct 09 13:53:34 crc kubenswrapper[4902]: I1009 13:53:34.741514 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a1edf05-690e-463e-8086-e4ba20653475" containerName="extract-utilities" Oct 09 13:53:34 crc kubenswrapper[4902]: E1009 13:53:34.741526 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc1a8762-c90b-4ff8-a836-47ca3d3ec932" containerName="registry-server" Oct 09 13:53:34 crc kubenswrapper[4902]: I1009 13:53:34.741532 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc1a8762-c90b-4ff8-a836-47ca3d3ec932" containerName="registry-server" Oct 09 13:53:34 crc kubenswrapper[4902]: E1009 13:53:34.741542 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2933475-7af8-41e3-9389-114c1969b030" containerName="marketplace-operator" Oct 09 13:53:34 crc kubenswrapper[4902]: I1009 13:53:34.741548 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2933475-7af8-41e3-9389-114c1969b030" containerName="marketplace-operator" Oct 09 13:53:34 crc kubenswrapper[4902]: E1009 13:53:34.741558 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3038206a-a63d-4cde-9d0e-9549cfb95ad7" containerName="registry-server" Oct 09 13:53:34 crc kubenswrapper[4902]: I1009 13:53:34.741564 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="3038206a-a63d-4cde-9d0e-9549cfb95ad7" containerName="registry-server" Oct 09 13:53:34 crc kubenswrapper[4902]: E1009 13:53:34.741574 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc1a8762-c90b-4ff8-a836-47ca3d3ec932" containerName="extract-utilities" Oct 09 13:53:34 crc kubenswrapper[4902]: I1009 13:53:34.741581 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc1a8762-c90b-4ff8-a836-47ca3d3ec932" containerName="extract-utilities" Oct 09 13:53:34 crc kubenswrapper[4902]: E1009 13:53:34.741588 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45688fd0-799f-477e-ae28-ef494a1abdc5" containerName="extract-content" Oct 09 13:53:34 crc kubenswrapper[4902]: I1009 13:53:34.741593 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="45688fd0-799f-477e-ae28-ef494a1abdc5" containerName="extract-content" Oct 09 13:53:34 crc kubenswrapper[4902]: I1009 13:53:34.741682 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="a2933475-7af8-41e3-9389-114c1969b030" containerName="marketplace-operator" Oct 09 13:53:34 crc kubenswrapper[4902]: I1009 13:53:34.741701 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="45688fd0-799f-477e-ae28-ef494a1abdc5" 
containerName="registry-server" Oct 09 13:53:34 crc kubenswrapper[4902]: I1009 13:53:34.741708 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a1edf05-690e-463e-8086-e4ba20653475" containerName="registry-server" Oct 09 13:53:34 crc kubenswrapper[4902]: I1009 13:53:34.741718 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="3038206a-a63d-4cde-9d0e-9549cfb95ad7" containerName="registry-server" Oct 09 13:53:34 crc kubenswrapper[4902]: I1009 13:53:34.741727 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc1a8762-c90b-4ff8-a836-47ca3d3ec932" containerName="registry-server" Oct 09 13:53:34 crc kubenswrapper[4902]: I1009 13:53:34.742552 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hpxl7" Oct 09 13:53:34 crc kubenswrapper[4902]: I1009 13:53:34.744150 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Oct 09 13:53:34 crc kubenswrapper[4902]: I1009 13:53:34.755638 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hpxl7"] Oct 09 13:53:34 crc kubenswrapper[4902]: I1009 13:53:34.888106 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0aa7377e-9f5a-411d-a20d-a134a5735eda-catalog-content\") pod \"certified-operators-hpxl7\" (UID: \"0aa7377e-9f5a-411d-a20d-a134a5735eda\") " pod="openshift-marketplace/certified-operators-hpxl7" Oct 09 13:53:34 crc kubenswrapper[4902]: I1009 13:53:34.888195 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0aa7377e-9f5a-411d-a20d-a134a5735eda-utilities\") pod \"certified-operators-hpxl7\" (UID: \"0aa7377e-9f5a-411d-a20d-a134a5735eda\") " pod="openshift-marketplace/certified-operators-hpxl7" Oct 09 13:53:34 crc kubenswrapper[4902]: I1009 13:53:34.888229 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mzfmc\" (UniqueName: \"kubernetes.io/projected/0aa7377e-9f5a-411d-a20d-a134a5735eda-kube-api-access-mzfmc\") pod \"certified-operators-hpxl7\" (UID: \"0aa7377e-9f5a-411d-a20d-a134a5735eda\") " pod="openshift-marketplace/certified-operators-hpxl7" Oct 09 13:53:34 crc kubenswrapper[4902]: I1009 13:53:34.989669 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0aa7377e-9f5a-411d-a20d-a134a5735eda-utilities\") pod \"certified-operators-hpxl7\" (UID: \"0aa7377e-9f5a-411d-a20d-a134a5735eda\") " pod="openshift-marketplace/certified-operators-hpxl7" Oct 09 13:53:34 crc kubenswrapper[4902]: I1009 13:53:34.989728 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mzfmc\" (UniqueName: \"kubernetes.io/projected/0aa7377e-9f5a-411d-a20d-a134a5735eda-kube-api-access-mzfmc\") pod \"certified-operators-hpxl7\" (UID: \"0aa7377e-9f5a-411d-a20d-a134a5735eda\") " pod="openshift-marketplace/certified-operators-hpxl7" Oct 09 13:53:34 crc kubenswrapper[4902]: I1009 13:53:34.989808 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0aa7377e-9f5a-411d-a20d-a134a5735eda-catalog-content\") pod \"certified-operators-hpxl7\" (UID: 
\"0aa7377e-9f5a-411d-a20d-a134a5735eda\") " pod="openshift-marketplace/certified-operators-hpxl7" Oct 09 13:53:34 crc kubenswrapper[4902]: I1009 13:53:34.990227 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0aa7377e-9f5a-411d-a20d-a134a5735eda-utilities\") pod \"certified-operators-hpxl7\" (UID: \"0aa7377e-9f5a-411d-a20d-a134a5735eda\") " pod="openshift-marketplace/certified-operators-hpxl7" Oct 09 13:53:34 crc kubenswrapper[4902]: I1009 13:53:34.990349 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0aa7377e-9f5a-411d-a20d-a134a5735eda-catalog-content\") pod \"certified-operators-hpxl7\" (UID: \"0aa7377e-9f5a-411d-a20d-a134a5735eda\") " pod="openshift-marketplace/certified-operators-hpxl7" Oct 09 13:53:35 crc kubenswrapper[4902]: I1009 13:53:35.006642 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mzfmc\" (UniqueName: \"kubernetes.io/projected/0aa7377e-9f5a-411d-a20d-a134a5735eda-kube-api-access-mzfmc\") pod \"certified-operators-hpxl7\" (UID: \"0aa7377e-9f5a-411d-a20d-a134a5735eda\") " pod="openshift-marketplace/certified-operators-hpxl7" Oct 09 13:53:35 crc kubenswrapper[4902]: I1009 13:53:35.056712 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hpxl7" Oct 09 13:53:35 crc kubenswrapper[4902]: I1009 13:53:35.341625 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-mq2n8"] Oct 09 13:53:35 crc kubenswrapper[4902]: I1009 13:53:35.343736 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mq2n8" Oct 09 13:53:35 crc kubenswrapper[4902]: I1009 13:53:35.346454 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Oct 09 13:53:35 crc kubenswrapper[4902]: I1009 13:53:35.352100 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mq2n8"] Oct 09 13:53:35 crc kubenswrapper[4902]: I1009 13:53:35.459501 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hpxl7"] Oct 09 13:53:35 crc kubenswrapper[4902]: W1009 13:53:35.467961 4902 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0aa7377e_9f5a_411d_a20d_a134a5735eda.slice/crio-33d9a907a301ba33e53969df99fdd28d52549f0a0b22933755de7973d22efa93 WatchSource:0}: Error finding container 33d9a907a301ba33e53969df99fdd28d52549f0a0b22933755de7973d22efa93: Status 404 returned error can't find the container with id 33d9a907a301ba33e53969df99fdd28d52549f0a0b22933755de7973d22efa93 Oct 09 13:53:35 crc kubenswrapper[4902]: I1009 13:53:35.496537 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3f34ca09-a125-4ad0-a8ea-fe6c0791bb5e-catalog-content\") pod \"redhat-marketplace-mq2n8\" (UID: \"3f34ca09-a125-4ad0-a8ea-fe6c0791bb5e\") " pod="openshift-marketplace/redhat-marketplace-mq2n8" Oct 09 13:53:35 crc kubenswrapper[4902]: I1009 13:53:35.496637 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wdxbb\" (UniqueName: 
\"kubernetes.io/projected/3f34ca09-a125-4ad0-a8ea-fe6c0791bb5e-kube-api-access-wdxbb\") pod \"redhat-marketplace-mq2n8\" (UID: \"3f34ca09-a125-4ad0-a8ea-fe6c0791bb5e\") " pod="openshift-marketplace/redhat-marketplace-mq2n8" Oct 09 13:53:35 crc kubenswrapper[4902]: I1009 13:53:35.496662 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3f34ca09-a125-4ad0-a8ea-fe6c0791bb5e-utilities\") pod \"redhat-marketplace-mq2n8\" (UID: \"3f34ca09-a125-4ad0-a8ea-fe6c0791bb5e\") " pod="openshift-marketplace/redhat-marketplace-mq2n8" Oct 09 13:53:35 crc kubenswrapper[4902]: I1009 13:53:35.520902 4902 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2a1edf05-690e-463e-8086-e4ba20653475" path="/var/lib/kubelet/pods/2a1edf05-690e-463e-8086-e4ba20653475/volumes" Oct 09 13:53:35 crc kubenswrapper[4902]: I1009 13:53:35.521682 4902 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3038206a-a63d-4cde-9d0e-9549cfb95ad7" path="/var/lib/kubelet/pods/3038206a-a63d-4cde-9d0e-9549cfb95ad7/volumes" Oct 09 13:53:35 crc kubenswrapper[4902]: I1009 13:53:35.522454 4902 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="45688fd0-799f-477e-ae28-ef494a1abdc5" path="/var/lib/kubelet/pods/45688fd0-799f-477e-ae28-ef494a1abdc5/volumes" Oct 09 13:53:35 crc kubenswrapper[4902]: I1009 13:53:35.524323 4902 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a2933475-7af8-41e3-9389-114c1969b030" path="/var/lib/kubelet/pods/a2933475-7af8-41e3-9389-114c1969b030/volumes" Oct 09 13:53:35 crc kubenswrapper[4902]: I1009 13:53:35.524929 4902 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cc1a8762-c90b-4ff8-a836-47ca3d3ec932" path="/var/lib/kubelet/pods/cc1a8762-c90b-4ff8-a836-47ca3d3ec932/volumes" Oct 09 13:53:35 crc kubenswrapper[4902]: I1009 13:53:35.597876 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3f34ca09-a125-4ad0-a8ea-fe6c0791bb5e-catalog-content\") pod \"redhat-marketplace-mq2n8\" (UID: \"3f34ca09-a125-4ad0-a8ea-fe6c0791bb5e\") " pod="openshift-marketplace/redhat-marketplace-mq2n8" Oct 09 13:53:35 crc kubenswrapper[4902]: I1009 13:53:35.597941 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wdxbb\" (UniqueName: \"kubernetes.io/projected/3f34ca09-a125-4ad0-a8ea-fe6c0791bb5e-kube-api-access-wdxbb\") pod \"redhat-marketplace-mq2n8\" (UID: \"3f34ca09-a125-4ad0-a8ea-fe6c0791bb5e\") " pod="openshift-marketplace/redhat-marketplace-mq2n8" Oct 09 13:53:35 crc kubenswrapper[4902]: I1009 13:53:35.597961 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3f34ca09-a125-4ad0-a8ea-fe6c0791bb5e-utilities\") pod \"redhat-marketplace-mq2n8\" (UID: \"3f34ca09-a125-4ad0-a8ea-fe6c0791bb5e\") " pod="openshift-marketplace/redhat-marketplace-mq2n8" Oct 09 13:53:35 crc kubenswrapper[4902]: I1009 13:53:35.598717 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3f34ca09-a125-4ad0-a8ea-fe6c0791bb5e-utilities\") pod \"redhat-marketplace-mq2n8\" (UID: \"3f34ca09-a125-4ad0-a8ea-fe6c0791bb5e\") " pod="openshift-marketplace/redhat-marketplace-mq2n8" Oct 09 13:53:35 crc kubenswrapper[4902]: I1009 13:53:35.598787 4902 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3f34ca09-a125-4ad0-a8ea-fe6c0791bb5e-catalog-content\") pod \"redhat-marketplace-mq2n8\" (UID: \"3f34ca09-a125-4ad0-a8ea-fe6c0791bb5e\") " pod="openshift-marketplace/redhat-marketplace-mq2n8" Oct 09 13:53:35 crc kubenswrapper[4902]: I1009 13:53:35.617913 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wdxbb\" (UniqueName: \"kubernetes.io/projected/3f34ca09-a125-4ad0-a8ea-fe6c0791bb5e-kube-api-access-wdxbb\") pod \"redhat-marketplace-mq2n8\" (UID: \"3f34ca09-a125-4ad0-a8ea-fe6c0791bb5e\") " pod="openshift-marketplace/redhat-marketplace-mq2n8" Oct 09 13:53:35 crc kubenswrapper[4902]: I1009 13:53:35.671820 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mq2n8" Oct 09 13:53:36 crc kubenswrapper[4902]: I1009 13:53:36.059726 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mq2n8"] Oct 09 13:53:36 crc kubenswrapper[4902]: I1009 13:53:36.369625 4902 generic.go:334] "Generic (PLEG): container finished" podID="3f34ca09-a125-4ad0-a8ea-fe6c0791bb5e" containerID="609243898e4a4bf51b71b190bfbe59a463557981eee6934a6d6911b83657d85c" exitCode=0 Oct 09 13:53:36 crc kubenswrapper[4902]: I1009 13:53:36.369702 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mq2n8" event={"ID":"3f34ca09-a125-4ad0-a8ea-fe6c0791bb5e","Type":"ContainerDied","Data":"609243898e4a4bf51b71b190bfbe59a463557981eee6934a6d6911b83657d85c"} Oct 09 13:53:36 crc kubenswrapper[4902]: I1009 13:53:36.370068 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mq2n8" event={"ID":"3f34ca09-a125-4ad0-a8ea-fe6c0791bb5e","Type":"ContainerStarted","Data":"9dff4f1854d97e373ac6b50e6c43e69438905b3951bf1b555ba956e4f114f94f"} Oct 09 13:53:36 crc kubenswrapper[4902]: I1009 13:53:36.372095 4902 generic.go:334] "Generic (PLEG): container finished" podID="0aa7377e-9f5a-411d-a20d-a134a5735eda" containerID="015fb6dddb28fba4219bfdf0510524a599185f35582d0e94935fd752442e12ce" exitCode=0 Oct 09 13:53:36 crc kubenswrapper[4902]: I1009 13:53:36.372143 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hpxl7" event={"ID":"0aa7377e-9f5a-411d-a20d-a134a5735eda","Type":"ContainerDied","Data":"015fb6dddb28fba4219bfdf0510524a599185f35582d0e94935fd752442e12ce"} Oct 09 13:53:36 crc kubenswrapper[4902]: I1009 13:53:36.372179 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hpxl7" event={"ID":"0aa7377e-9f5a-411d-a20d-a134a5735eda","Type":"ContainerStarted","Data":"33d9a907a301ba33e53969df99fdd28d52549f0a0b22933755de7973d22efa93"} Oct 09 13:53:37 crc kubenswrapper[4902]: I1009 13:53:37.149765 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-z5k86"] Oct 09 13:53:37 crc kubenswrapper[4902]: I1009 13:53:37.153751 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-z5k86" Oct 09 13:53:37 crc kubenswrapper[4902]: I1009 13:53:37.153908 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-z5k86"] Oct 09 13:53:37 crc kubenswrapper[4902]: I1009 13:53:37.157913 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Oct 09 13:53:37 crc kubenswrapper[4902]: I1009 13:53:37.326775 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8zwvl\" (UniqueName: \"kubernetes.io/projected/1d982b85-de80-4c77-82fc-8c4622cbd203-kube-api-access-8zwvl\") pod \"redhat-operators-z5k86\" (UID: \"1d982b85-de80-4c77-82fc-8c4622cbd203\") " pod="openshift-marketplace/redhat-operators-z5k86" Oct 09 13:53:37 crc kubenswrapper[4902]: I1009 13:53:37.326860 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d982b85-de80-4c77-82fc-8c4622cbd203-catalog-content\") pod \"redhat-operators-z5k86\" (UID: \"1d982b85-de80-4c77-82fc-8c4622cbd203\") " pod="openshift-marketplace/redhat-operators-z5k86" Oct 09 13:53:37 crc kubenswrapper[4902]: I1009 13:53:37.326907 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d982b85-de80-4c77-82fc-8c4622cbd203-utilities\") pod \"redhat-operators-z5k86\" (UID: \"1d982b85-de80-4c77-82fc-8c4622cbd203\") " pod="openshift-marketplace/redhat-operators-z5k86" Oct 09 13:53:37 crc kubenswrapper[4902]: I1009 13:53:37.380921 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hpxl7" event={"ID":"0aa7377e-9f5a-411d-a20d-a134a5735eda","Type":"ContainerStarted","Data":"91edd93a3177422c54c6f1c84a9f2ac0bd85ce4445069edd3dbd40b095716dbe"} Oct 09 13:53:37 crc kubenswrapper[4902]: I1009 13:53:37.427989 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8zwvl\" (UniqueName: \"kubernetes.io/projected/1d982b85-de80-4c77-82fc-8c4622cbd203-kube-api-access-8zwvl\") pod \"redhat-operators-z5k86\" (UID: \"1d982b85-de80-4c77-82fc-8c4622cbd203\") " pod="openshift-marketplace/redhat-operators-z5k86" Oct 09 13:53:37 crc kubenswrapper[4902]: I1009 13:53:37.428067 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d982b85-de80-4c77-82fc-8c4622cbd203-catalog-content\") pod \"redhat-operators-z5k86\" (UID: \"1d982b85-de80-4c77-82fc-8c4622cbd203\") " pod="openshift-marketplace/redhat-operators-z5k86" Oct 09 13:53:37 crc kubenswrapper[4902]: I1009 13:53:37.428093 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d982b85-de80-4c77-82fc-8c4622cbd203-utilities\") pod \"redhat-operators-z5k86\" (UID: \"1d982b85-de80-4c77-82fc-8c4622cbd203\") " pod="openshift-marketplace/redhat-operators-z5k86" Oct 09 13:53:37 crc kubenswrapper[4902]: I1009 13:53:37.428794 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d982b85-de80-4c77-82fc-8c4622cbd203-catalog-content\") pod \"redhat-operators-z5k86\" (UID: \"1d982b85-de80-4c77-82fc-8c4622cbd203\") " pod="openshift-marketplace/redhat-operators-z5k86" Oct 09 
13:53:37 crc kubenswrapper[4902]: I1009 13:53:37.429092 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d982b85-de80-4c77-82fc-8c4622cbd203-utilities\") pod \"redhat-operators-z5k86\" (UID: \"1d982b85-de80-4c77-82fc-8c4622cbd203\") " pod="openshift-marketplace/redhat-operators-z5k86" Oct 09 13:53:37 crc kubenswrapper[4902]: I1009 13:53:37.464545 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8zwvl\" (UniqueName: \"kubernetes.io/projected/1d982b85-de80-4c77-82fc-8c4622cbd203-kube-api-access-8zwvl\") pod \"redhat-operators-z5k86\" (UID: \"1d982b85-de80-4c77-82fc-8c4622cbd203\") " pod="openshift-marketplace/redhat-operators-z5k86" Oct 09 13:53:37 crc kubenswrapper[4902]: I1009 13:53:37.483234 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-z5k86" Oct 09 13:53:37 crc kubenswrapper[4902]: I1009 13:53:37.736107 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-j2rjw"] Oct 09 13:53:37 crc kubenswrapper[4902]: I1009 13:53:37.738895 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-j2rjw" Oct 09 13:53:37 crc kubenswrapper[4902]: I1009 13:53:37.743196 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Oct 09 13:53:37 crc kubenswrapper[4902]: I1009 13:53:37.749346 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-j2rjw"] Oct 09 13:53:37 crc kubenswrapper[4902]: I1009 13:53:37.834943 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/13822386-0dae-4414-b5e3-f2bc758f6948-catalog-content\") pod \"community-operators-j2rjw\" (UID: \"13822386-0dae-4414-b5e3-f2bc758f6948\") " pod="openshift-marketplace/community-operators-j2rjw" Oct 09 13:53:37 crc kubenswrapper[4902]: I1009 13:53:37.835075 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-glx7t\" (UniqueName: \"kubernetes.io/projected/13822386-0dae-4414-b5e3-f2bc758f6948-kube-api-access-glx7t\") pod \"community-operators-j2rjw\" (UID: \"13822386-0dae-4414-b5e3-f2bc758f6948\") " pod="openshift-marketplace/community-operators-j2rjw" Oct 09 13:53:37 crc kubenswrapper[4902]: I1009 13:53:37.835244 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/13822386-0dae-4414-b5e3-f2bc758f6948-utilities\") pod \"community-operators-j2rjw\" (UID: \"13822386-0dae-4414-b5e3-f2bc758f6948\") " pod="openshift-marketplace/community-operators-j2rjw" Oct 09 13:53:37 crc kubenswrapper[4902]: I1009 13:53:37.889282 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-z5k86"] Oct 09 13:53:37 crc kubenswrapper[4902]: W1009 13:53:37.897243 4902 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1d982b85_de80_4c77_82fc_8c4622cbd203.slice/crio-e95024b35c47f3ae5480f35bb05ff4aae22b78cd5d899131a055667324fced10 WatchSource:0}: Error finding container e95024b35c47f3ae5480f35bb05ff4aae22b78cd5d899131a055667324fced10: Status 404 returned error can't find the 
container with id e95024b35c47f3ae5480f35bb05ff4aae22b78cd5d899131a055667324fced10 Oct 09 13:53:37 crc kubenswrapper[4902]: I1009 13:53:37.937027 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/13822386-0dae-4414-b5e3-f2bc758f6948-utilities\") pod \"community-operators-j2rjw\" (UID: \"13822386-0dae-4414-b5e3-f2bc758f6948\") " pod="openshift-marketplace/community-operators-j2rjw" Oct 09 13:53:37 crc kubenswrapper[4902]: I1009 13:53:37.937101 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/13822386-0dae-4414-b5e3-f2bc758f6948-catalog-content\") pod \"community-operators-j2rjw\" (UID: \"13822386-0dae-4414-b5e3-f2bc758f6948\") " pod="openshift-marketplace/community-operators-j2rjw" Oct 09 13:53:37 crc kubenswrapper[4902]: I1009 13:53:37.937146 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-glx7t\" (UniqueName: \"kubernetes.io/projected/13822386-0dae-4414-b5e3-f2bc758f6948-kube-api-access-glx7t\") pod \"community-operators-j2rjw\" (UID: \"13822386-0dae-4414-b5e3-f2bc758f6948\") " pod="openshift-marketplace/community-operators-j2rjw" Oct 09 13:53:37 crc kubenswrapper[4902]: I1009 13:53:37.937803 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/13822386-0dae-4414-b5e3-f2bc758f6948-catalog-content\") pod \"community-operators-j2rjw\" (UID: \"13822386-0dae-4414-b5e3-f2bc758f6948\") " pod="openshift-marketplace/community-operators-j2rjw" Oct 09 13:53:37 crc kubenswrapper[4902]: I1009 13:53:37.939628 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/13822386-0dae-4414-b5e3-f2bc758f6948-utilities\") pod \"community-operators-j2rjw\" (UID: \"13822386-0dae-4414-b5e3-f2bc758f6948\") " pod="openshift-marketplace/community-operators-j2rjw" Oct 09 13:53:37 crc kubenswrapper[4902]: I1009 13:53:37.960314 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-glx7t\" (UniqueName: \"kubernetes.io/projected/13822386-0dae-4414-b5e3-f2bc758f6948-kube-api-access-glx7t\") pod \"community-operators-j2rjw\" (UID: \"13822386-0dae-4414-b5e3-f2bc758f6948\") " pod="openshift-marketplace/community-operators-j2rjw" Oct 09 13:53:38 crc kubenswrapper[4902]: I1009 13:53:38.113800 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-j2rjw" Oct 09 13:53:38 crc kubenswrapper[4902]: I1009 13:53:38.387827 4902 generic.go:334] "Generic (PLEG): container finished" podID="1d982b85-de80-4c77-82fc-8c4622cbd203" containerID="196baba1976f4bdbb04e79f0165f09deeef91d44afd5799be15854da9cb93c42" exitCode=0 Oct 09 13:53:38 crc kubenswrapper[4902]: I1009 13:53:38.388026 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z5k86" event={"ID":"1d982b85-de80-4c77-82fc-8c4622cbd203","Type":"ContainerDied","Data":"196baba1976f4bdbb04e79f0165f09deeef91d44afd5799be15854da9cb93c42"} Oct 09 13:53:38 crc kubenswrapper[4902]: I1009 13:53:38.388098 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z5k86" event={"ID":"1d982b85-de80-4c77-82fc-8c4622cbd203","Type":"ContainerStarted","Data":"e95024b35c47f3ae5480f35bb05ff4aae22b78cd5d899131a055667324fced10"} Oct 09 13:53:38 crc kubenswrapper[4902]: I1009 13:53:38.392008 4902 generic.go:334] "Generic (PLEG): container finished" podID="3f34ca09-a125-4ad0-a8ea-fe6c0791bb5e" containerID="43323a2603f8fdee6558ef567b80f988622e197db42cfbaf1bff6620af832e06" exitCode=0 Oct 09 13:53:38 crc kubenswrapper[4902]: I1009 13:53:38.392114 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mq2n8" event={"ID":"3f34ca09-a125-4ad0-a8ea-fe6c0791bb5e","Type":"ContainerDied","Data":"43323a2603f8fdee6558ef567b80f988622e197db42cfbaf1bff6620af832e06"} Oct 09 13:53:38 crc kubenswrapper[4902]: I1009 13:53:38.395648 4902 generic.go:334] "Generic (PLEG): container finished" podID="0aa7377e-9f5a-411d-a20d-a134a5735eda" containerID="91edd93a3177422c54c6f1c84a9f2ac0bd85ce4445069edd3dbd40b095716dbe" exitCode=0 Oct 09 13:53:38 crc kubenswrapper[4902]: I1009 13:53:38.395679 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hpxl7" event={"ID":"0aa7377e-9f5a-411d-a20d-a134a5735eda","Type":"ContainerDied","Data":"91edd93a3177422c54c6f1c84a9f2ac0bd85ce4445069edd3dbd40b095716dbe"} Oct 09 13:53:38 crc kubenswrapper[4902]: I1009 13:53:38.505075 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-j2rjw"] Oct 09 13:53:38 crc kubenswrapper[4902]: W1009 13:53:38.513229 4902 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod13822386_0dae_4414_b5e3_f2bc758f6948.slice/crio-6d95885c8279d9ba8a7e9b8c166ebcd317ce0dac33dab2099a8778509eff1731 WatchSource:0}: Error finding container 6d95885c8279d9ba8a7e9b8c166ebcd317ce0dac33dab2099a8778509eff1731: Status 404 returned error can't find the container with id 6d95885c8279d9ba8a7e9b8c166ebcd317ce0dac33dab2099a8778509eff1731 Oct 09 13:53:39 crc kubenswrapper[4902]: I1009 13:53:39.404427 4902 generic.go:334] "Generic (PLEG): container finished" podID="13822386-0dae-4414-b5e3-f2bc758f6948" containerID="442543776d42fb41d16020b7f4a6ff925f833ae4882c92807068f921c9b25c44" exitCode=0 Oct 09 13:53:39 crc kubenswrapper[4902]: I1009 13:53:39.404531 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j2rjw" event={"ID":"13822386-0dae-4414-b5e3-f2bc758f6948","Type":"ContainerDied","Data":"442543776d42fb41d16020b7f4a6ff925f833ae4882c92807068f921c9b25c44"} Oct 09 13:53:39 crc kubenswrapper[4902]: I1009 13:53:39.405016 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-j2rjw" event={"ID":"13822386-0dae-4414-b5e3-f2bc758f6948","Type":"ContainerStarted","Data":"6d95885c8279d9ba8a7e9b8c166ebcd317ce0dac33dab2099a8778509eff1731"} Oct 09 13:53:39 crc kubenswrapper[4902]: I1009 13:53:39.409360 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mq2n8" event={"ID":"3f34ca09-a125-4ad0-a8ea-fe6c0791bb5e","Type":"ContainerStarted","Data":"5840bf13c0658f1d2a9a5d992d41be75b44060e21e5f9f08ea5a37a421f87b90"} Oct 09 13:53:39 crc kubenswrapper[4902]: I1009 13:53:39.440746 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-mq2n8" podStartSLOduration=2.008963787 podStartE2EDuration="4.440701281s" podCreationTimestamp="2025-10-09 13:53:35 +0000 UTC" firstStartedPulling="2025-10-09 13:53:36.370970112 +0000 UTC m=+163.568829186" lastFinishedPulling="2025-10-09 13:53:38.802707606 +0000 UTC m=+166.000566680" observedRunningTime="2025-10-09 13:53:39.439240408 +0000 UTC m=+166.637099492" watchObservedRunningTime="2025-10-09 13:53:39.440701281 +0000 UTC m=+166.638560345" Oct 09 13:53:40 crc kubenswrapper[4902]: I1009 13:53:40.420649 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hpxl7" event={"ID":"0aa7377e-9f5a-411d-a20d-a134a5735eda","Type":"ContainerStarted","Data":"3370814c2575f5d8e8e529bc8f7bb74d7d256718681143bb1b09d0294635ece3"} Oct 09 13:53:40 crc kubenswrapper[4902]: I1009 13:53:40.423246 4902 generic.go:334] "Generic (PLEG): container finished" podID="1d982b85-de80-4c77-82fc-8c4622cbd203" containerID="60f24fb2b382cfc45ec6510591e2f774ef0803977fcadd0705472ffb860c13d3" exitCode=0 Oct 09 13:53:40 crc kubenswrapper[4902]: I1009 13:53:40.423896 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z5k86" event={"ID":"1d982b85-de80-4c77-82fc-8c4622cbd203","Type":"ContainerDied","Data":"60f24fb2b382cfc45ec6510591e2f774ef0803977fcadd0705472ffb860c13d3"} Oct 09 13:53:40 crc kubenswrapper[4902]: I1009 13:53:40.449165 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-hpxl7" podStartSLOduration=3.537225484 podStartE2EDuration="6.449139959s" podCreationTimestamp="2025-10-09 13:53:34 +0000 UTC" firstStartedPulling="2025-10-09 13:53:36.374213828 +0000 UTC m=+163.572072892" lastFinishedPulling="2025-10-09 13:53:39.286128313 +0000 UTC m=+166.483987367" observedRunningTime="2025-10-09 13:53:40.443870343 +0000 UTC m=+167.641729407" watchObservedRunningTime="2025-10-09 13:53:40.449139959 +0000 UTC m=+167.646999033" Oct 09 13:53:41 crc kubenswrapper[4902]: I1009 13:53:41.430478 4902 generic.go:334] "Generic (PLEG): container finished" podID="13822386-0dae-4414-b5e3-f2bc758f6948" containerID="7c677dfd8dc103c406b79a98d00a45ce73ca94060c37f7ac2ba9f985a1bc8e63" exitCode=0 Oct 09 13:53:41 crc kubenswrapper[4902]: I1009 13:53:41.430565 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j2rjw" event={"ID":"13822386-0dae-4414-b5e3-f2bc758f6948","Type":"ContainerDied","Data":"7c677dfd8dc103c406b79a98d00a45ce73ca94060c37f7ac2ba9f985a1bc8e63"} Oct 09 13:53:42 crc kubenswrapper[4902]: I1009 13:53:42.438653 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z5k86" 
event={"ID":"1d982b85-de80-4c77-82fc-8c4622cbd203","Type":"ContainerStarted","Data":"8d9da68b552a2b7bba452a190fe9fc55a2beaf3d2078a8c40330168301603efc"} Oct 09 13:53:42 crc kubenswrapper[4902]: I1009 13:53:42.440582 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j2rjw" event={"ID":"13822386-0dae-4414-b5e3-f2bc758f6948","Type":"ContainerStarted","Data":"d6122ef006575f86f9a6c177c055e930742b5e6aa86b97c1a05cc1adff07e410"} Oct 09 13:53:42 crc kubenswrapper[4902]: I1009 13:53:42.460831 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-z5k86" podStartSLOduration=2.927828121 podStartE2EDuration="5.460811802s" podCreationTimestamp="2025-10-09 13:53:37 +0000 UTC" firstStartedPulling="2025-10-09 13:53:38.389954681 +0000 UTC m=+165.587813745" lastFinishedPulling="2025-10-09 13:53:40.922938362 +0000 UTC m=+168.120797426" observedRunningTime="2025-10-09 13:53:42.456860605 +0000 UTC m=+169.654719679" watchObservedRunningTime="2025-10-09 13:53:42.460811802 +0000 UTC m=+169.658670866" Oct 09 13:53:42 crc kubenswrapper[4902]: I1009 13:53:42.484552 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-j2rjw" podStartSLOduration=2.903269881 podStartE2EDuration="5.484533734s" podCreationTimestamp="2025-10-09 13:53:37 +0000 UTC" firstStartedPulling="2025-10-09 13:53:39.405925151 +0000 UTC m=+166.603784215" lastFinishedPulling="2025-10-09 13:53:41.987189004 +0000 UTC m=+169.185048068" observedRunningTime="2025-10-09 13:53:42.482123593 +0000 UTC m=+169.679982667" watchObservedRunningTime="2025-10-09 13:53:42.484533734 +0000 UTC m=+169.682392808" Oct 09 13:53:45 crc kubenswrapper[4902]: I1009 13:53:45.057213 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-hpxl7" Oct 09 13:53:45 crc kubenswrapper[4902]: I1009 13:53:45.057822 4902 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-hpxl7" Oct 09 13:53:45 crc kubenswrapper[4902]: I1009 13:53:45.109396 4902 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-hpxl7" Oct 09 13:53:45 crc kubenswrapper[4902]: I1009 13:53:45.507598 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-hpxl7" Oct 09 13:53:45 crc kubenswrapper[4902]: I1009 13:53:45.672844 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-mq2n8" Oct 09 13:53:45 crc kubenswrapper[4902]: I1009 13:53:45.672898 4902 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-mq2n8" Oct 09 13:53:45 crc kubenswrapper[4902]: I1009 13:53:45.735910 4902 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-mq2n8" Oct 09 13:53:46 crc kubenswrapper[4902]: I1009 13:53:46.515558 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-mq2n8" Oct 09 13:53:47 crc kubenswrapper[4902]: I1009 13:53:47.483762 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-z5k86" Oct 09 13:53:47 crc kubenswrapper[4902]: I1009 13:53:47.483846 4902 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-z5k86" Oct 09 13:53:47 crc kubenswrapper[4902]: I1009 13:53:47.530600 4902 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-z5k86" Oct 09 13:53:48 crc kubenswrapper[4902]: I1009 13:53:48.114793 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-j2rjw" Oct 09 13:53:48 crc kubenswrapper[4902]: I1009 13:53:48.115624 4902 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-j2rjw" Oct 09 13:53:48 crc kubenswrapper[4902]: I1009 13:53:48.163848 4902 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-j2rjw" Oct 09 13:53:48 crc kubenswrapper[4902]: I1009 13:53:48.524099 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-j2rjw" Oct 09 13:53:48 crc kubenswrapper[4902]: I1009 13:53:48.529100 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-z5k86" Oct 09 13:53:50 crc kubenswrapper[4902]: I1009 13:53:50.077808 4902 patch_prober.go:28] interesting pod/machine-config-daemon-gbt7s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 13:53:50 crc kubenswrapper[4902]: I1009 13:53:50.077878 4902 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" podUID="6cfbac91-e798-4e5e-9f3c-f454ea6f457e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 13:54:20 crc kubenswrapper[4902]: I1009 13:54:20.079290 4902 patch_prober.go:28] interesting pod/machine-config-daemon-gbt7s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 13:54:20 crc kubenswrapper[4902]: I1009 13:54:20.080251 4902 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" podUID="6cfbac91-e798-4e5e-9f3c-f454ea6f457e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 13:54:20 crc kubenswrapper[4902]: I1009 13:54:20.080349 4902 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" Oct 09 13:54:20 crc kubenswrapper[4902]: I1009 13:54:20.081324 4902 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"25cb721737318b049ef3fbc91c7fb9450b978d343bb8bb36c3e0257909b5b962"} pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 09 13:54:20 crc kubenswrapper[4902]: I1009 13:54:20.081403 4902 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" podUID="6cfbac91-e798-4e5e-9f3c-f454ea6f457e" containerName="machine-config-daemon" containerID="cri-o://25cb721737318b049ef3fbc91c7fb9450b978d343bb8bb36c3e0257909b5b962" gracePeriod=600 Oct 09 13:54:20 crc kubenswrapper[4902]: I1009 13:54:20.674964 4902 generic.go:334] "Generic (PLEG): container finished" podID="6cfbac91-e798-4e5e-9f3c-f454ea6f457e" containerID="25cb721737318b049ef3fbc91c7fb9450b978d343bb8bb36c3e0257909b5b962" exitCode=0 Oct 09 13:54:20 crc kubenswrapper[4902]: I1009 13:54:20.675062 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" event={"ID":"6cfbac91-e798-4e5e-9f3c-f454ea6f457e","Type":"ContainerDied","Data":"25cb721737318b049ef3fbc91c7fb9450b978d343bb8bb36c3e0257909b5b962"} Oct 09 13:54:20 crc kubenswrapper[4902]: I1009 13:54:20.675451 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" event={"ID":"6cfbac91-e798-4e5e-9f3c-f454ea6f457e","Type":"ContainerStarted","Data":"f09f58240f5e4802db6796459ff40ef0e937c5566e9e69ea5030c30651138876"} Oct 09 13:56:18 crc kubenswrapper[4902]: I1009 13:56:18.504923 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-hhs6r"] Oct 09 13:56:18 crc kubenswrapper[4902]: I1009 13:56:18.506171 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-hhs6r" Oct 09 13:56:18 crc kubenswrapper[4902]: I1009 13:56:18.558969 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-hhs6r"] Oct 09 13:56:18 crc kubenswrapper[4902]: I1009 13:56:18.615133 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/0716877a-95c7-4ccb-b77c-d746fbd5a9d9-installation-pull-secrets\") pod \"image-registry-66df7c8f76-hhs6r\" (UID: \"0716877a-95c7-4ccb-b77c-d746fbd5a9d9\") " pod="openshift-image-registry/image-registry-66df7c8f76-hhs6r" Oct 09 13:56:18 crc kubenswrapper[4902]: I1009 13:56:18.615413 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0716877a-95c7-4ccb-b77c-d746fbd5a9d9-bound-sa-token\") pod \"image-registry-66df7c8f76-hhs6r\" (UID: \"0716877a-95c7-4ccb-b77c-d746fbd5a9d9\") " pod="openshift-image-registry/image-registry-66df7c8f76-hhs6r" Oct 09 13:56:18 crc kubenswrapper[4902]: I1009 13:56:18.615599 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4tnrh\" (UniqueName: \"kubernetes.io/projected/0716877a-95c7-4ccb-b77c-d746fbd5a9d9-kube-api-access-4tnrh\") pod \"image-registry-66df7c8f76-hhs6r\" (UID: \"0716877a-95c7-4ccb-b77c-d746fbd5a9d9\") " pod="openshift-image-registry/image-registry-66df7c8f76-hhs6r" Oct 09 13:56:18 crc kubenswrapper[4902]: I1009 13:56:18.615679 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0716877a-95c7-4ccb-b77c-d746fbd5a9d9-trusted-ca\") pod \"image-registry-66df7c8f76-hhs6r\" (UID: \"0716877a-95c7-4ccb-b77c-d746fbd5a9d9\") " pod="openshift-image-registry/image-registry-66df7c8f76-hhs6r" Oct 09 13:56:18 crc kubenswrapper[4902]: I1009 13:56:18.615840 
4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-hhs6r\" (UID: \"0716877a-95c7-4ccb-b77c-d746fbd5a9d9\") " pod="openshift-image-registry/image-registry-66df7c8f76-hhs6r" Oct 09 13:56:18 crc kubenswrapper[4902]: I1009 13:56:18.615925 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/0716877a-95c7-4ccb-b77c-d746fbd5a9d9-registry-certificates\") pod \"image-registry-66df7c8f76-hhs6r\" (UID: \"0716877a-95c7-4ccb-b77c-d746fbd5a9d9\") " pod="openshift-image-registry/image-registry-66df7c8f76-hhs6r" Oct 09 13:56:18 crc kubenswrapper[4902]: I1009 13:56:18.616008 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0716877a-95c7-4ccb-b77c-d746fbd5a9d9-registry-tls\") pod \"image-registry-66df7c8f76-hhs6r\" (UID: \"0716877a-95c7-4ccb-b77c-d746fbd5a9d9\") " pod="openshift-image-registry/image-registry-66df7c8f76-hhs6r" Oct 09 13:56:18 crc kubenswrapper[4902]: I1009 13:56:18.616088 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/0716877a-95c7-4ccb-b77c-d746fbd5a9d9-ca-trust-extracted\") pod \"image-registry-66df7c8f76-hhs6r\" (UID: \"0716877a-95c7-4ccb-b77c-d746fbd5a9d9\") " pod="openshift-image-registry/image-registry-66df7c8f76-hhs6r" Oct 09 13:56:18 crc kubenswrapper[4902]: I1009 13:56:18.638931 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-hhs6r\" (UID: \"0716877a-95c7-4ccb-b77c-d746fbd5a9d9\") " pod="openshift-image-registry/image-registry-66df7c8f76-hhs6r" Oct 09 13:56:18 crc kubenswrapper[4902]: I1009 13:56:18.717525 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/0716877a-95c7-4ccb-b77c-d746fbd5a9d9-ca-trust-extracted\") pod \"image-registry-66df7c8f76-hhs6r\" (UID: \"0716877a-95c7-4ccb-b77c-d746fbd5a9d9\") " pod="openshift-image-registry/image-registry-66df7c8f76-hhs6r" Oct 09 13:56:18 crc kubenswrapper[4902]: I1009 13:56:18.717991 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/0716877a-95c7-4ccb-b77c-d746fbd5a9d9-installation-pull-secrets\") pod \"image-registry-66df7c8f76-hhs6r\" (UID: \"0716877a-95c7-4ccb-b77c-d746fbd5a9d9\") " pod="openshift-image-registry/image-registry-66df7c8f76-hhs6r" Oct 09 13:56:18 crc kubenswrapper[4902]: I1009 13:56:18.718019 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0716877a-95c7-4ccb-b77c-d746fbd5a9d9-bound-sa-token\") pod \"image-registry-66df7c8f76-hhs6r\" (UID: \"0716877a-95c7-4ccb-b77c-d746fbd5a9d9\") " pod="openshift-image-registry/image-registry-66df7c8f76-hhs6r" Oct 09 13:56:18 crc kubenswrapper[4902]: I1009 13:56:18.718045 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-4tnrh\" (UniqueName: \"kubernetes.io/projected/0716877a-95c7-4ccb-b77c-d746fbd5a9d9-kube-api-access-4tnrh\") pod \"image-registry-66df7c8f76-hhs6r\" (UID: \"0716877a-95c7-4ccb-b77c-d746fbd5a9d9\") " pod="openshift-image-registry/image-registry-66df7c8f76-hhs6r" Oct 09 13:56:18 crc kubenswrapper[4902]: I1009 13:56:18.718063 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0716877a-95c7-4ccb-b77c-d746fbd5a9d9-trusted-ca\") pod \"image-registry-66df7c8f76-hhs6r\" (UID: \"0716877a-95c7-4ccb-b77c-d746fbd5a9d9\") " pod="openshift-image-registry/image-registry-66df7c8f76-hhs6r" Oct 09 13:56:18 crc kubenswrapper[4902]: I1009 13:56:18.718098 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/0716877a-95c7-4ccb-b77c-d746fbd5a9d9-registry-certificates\") pod \"image-registry-66df7c8f76-hhs6r\" (UID: \"0716877a-95c7-4ccb-b77c-d746fbd5a9d9\") " pod="openshift-image-registry/image-registry-66df7c8f76-hhs6r" Oct 09 13:56:18 crc kubenswrapper[4902]: I1009 13:56:18.718130 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0716877a-95c7-4ccb-b77c-d746fbd5a9d9-registry-tls\") pod \"image-registry-66df7c8f76-hhs6r\" (UID: \"0716877a-95c7-4ccb-b77c-d746fbd5a9d9\") " pod="openshift-image-registry/image-registry-66df7c8f76-hhs6r" Oct 09 13:56:18 crc kubenswrapper[4902]: I1009 13:56:18.718780 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/0716877a-95c7-4ccb-b77c-d746fbd5a9d9-ca-trust-extracted\") pod \"image-registry-66df7c8f76-hhs6r\" (UID: \"0716877a-95c7-4ccb-b77c-d746fbd5a9d9\") " pod="openshift-image-registry/image-registry-66df7c8f76-hhs6r" Oct 09 13:56:18 crc kubenswrapper[4902]: I1009 13:56:18.719564 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/0716877a-95c7-4ccb-b77c-d746fbd5a9d9-registry-certificates\") pod \"image-registry-66df7c8f76-hhs6r\" (UID: \"0716877a-95c7-4ccb-b77c-d746fbd5a9d9\") " pod="openshift-image-registry/image-registry-66df7c8f76-hhs6r" Oct 09 13:56:18 crc kubenswrapper[4902]: I1009 13:56:18.720379 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0716877a-95c7-4ccb-b77c-d746fbd5a9d9-trusted-ca\") pod \"image-registry-66df7c8f76-hhs6r\" (UID: \"0716877a-95c7-4ccb-b77c-d746fbd5a9d9\") " pod="openshift-image-registry/image-registry-66df7c8f76-hhs6r" Oct 09 13:56:18 crc kubenswrapper[4902]: I1009 13:56:18.723334 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0716877a-95c7-4ccb-b77c-d746fbd5a9d9-registry-tls\") pod \"image-registry-66df7c8f76-hhs6r\" (UID: \"0716877a-95c7-4ccb-b77c-d746fbd5a9d9\") " pod="openshift-image-registry/image-registry-66df7c8f76-hhs6r" Oct 09 13:56:18 crc kubenswrapper[4902]: I1009 13:56:18.723394 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/0716877a-95c7-4ccb-b77c-d746fbd5a9d9-installation-pull-secrets\") pod \"image-registry-66df7c8f76-hhs6r\" (UID: \"0716877a-95c7-4ccb-b77c-d746fbd5a9d9\") " pod="openshift-image-registry/image-registry-66df7c8f76-hhs6r" Oct 
09 13:56:18 crc kubenswrapper[4902]: I1009 13:56:18.733785 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4tnrh\" (UniqueName: \"kubernetes.io/projected/0716877a-95c7-4ccb-b77c-d746fbd5a9d9-kube-api-access-4tnrh\") pod \"image-registry-66df7c8f76-hhs6r\" (UID: \"0716877a-95c7-4ccb-b77c-d746fbd5a9d9\") " pod="openshift-image-registry/image-registry-66df7c8f76-hhs6r" Oct 09 13:56:18 crc kubenswrapper[4902]: I1009 13:56:18.735228 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0716877a-95c7-4ccb-b77c-d746fbd5a9d9-bound-sa-token\") pod \"image-registry-66df7c8f76-hhs6r\" (UID: \"0716877a-95c7-4ccb-b77c-d746fbd5a9d9\") " pod="openshift-image-registry/image-registry-66df7c8f76-hhs6r" Oct 09 13:56:18 crc kubenswrapper[4902]: I1009 13:56:18.821919 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-hhs6r" Oct 09 13:56:19 crc kubenswrapper[4902]: I1009 13:56:19.043829 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-hhs6r"] Oct 09 13:56:19 crc kubenswrapper[4902]: I1009 13:56:19.361370 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-hhs6r" event={"ID":"0716877a-95c7-4ccb-b77c-d746fbd5a9d9","Type":"ContainerStarted","Data":"14a2911169d497a22dd3a3186f32db8a6646846f2177916c96abf5ee074cf8d8"} Oct 09 13:56:19 crc kubenswrapper[4902]: I1009 13:56:19.361472 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-hhs6r" event={"ID":"0716877a-95c7-4ccb-b77c-d746fbd5a9d9","Type":"ContainerStarted","Data":"80681506d700589e58c47969e3b4804ca028780057df434e8c3e42956e15bce2"} Oct 09 13:56:19 crc kubenswrapper[4902]: I1009 13:56:19.361585 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-hhs6r" Oct 09 13:56:19 crc kubenswrapper[4902]: I1009 13:56:19.384548 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-hhs6r" podStartSLOduration=1.384525775 podStartE2EDuration="1.384525775s" podCreationTimestamp="2025-10-09 13:56:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 13:56:19.379578499 +0000 UTC m=+326.577437573" watchObservedRunningTime="2025-10-09 13:56:19.384525775 +0000 UTC m=+326.582384849" Oct 09 13:56:20 crc kubenswrapper[4902]: I1009 13:56:20.078232 4902 patch_prober.go:28] interesting pod/machine-config-daemon-gbt7s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 13:56:20 crc kubenswrapper[4902]: I1009 13:56:20.078317 4902 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" podUID="6cfbac91-e798-4e5e-9f3c-f454ea6f457e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 13:56:38 crc kubenswrapper[4902]: I1009 13:56:38.830157 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-image-registry/image-registry-66df7c8f76-hhs6r" Oct 09 13:56:38 crc kubenswrapper[4902]: I1009 13:56:38.885039 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-q76zz"] Oct 09 13:56:50 crc kubenswrapper[4902]: I1009 13:56:50.078162 4902 patch_prober.go:28] interesting pod/machine-config-daemon-gbt7s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 13:56:50 crc kubenswrapper[4902]: I1009 13:56:50.078658 4902 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" podUID="6cfbac91-e798-4e5e-9f3c-f454ea6f457e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 13:57:03 crc kubenswrapper[4902]: I1009 13:57:03.930338 4902 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-q76zz" podUID="836dca77-634b-42e7-bf76-74b582e0969d" containerName="registry" containerID="cri-o://259c4bbb6f53b94e8b44f3fa5b3006b85b15a368ee9a05e69c683f524a41c5fd" gracePeriod=30 Oct 09 13:57:04 crc kubenswrapper[4902]: I1009 13:57:04.243572 4902 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-q76zz" Oct 09 13:57:04 crc kubenswrapper[4902]: I1009 13:57:04.368742 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/836dca77-634b-42e7-bf76-74b582e0969d-ca-trust-extracted\") pod \"836dca77-634b-42e7-bf76-74b582e0969d\" (UID: \"836dca77-634b-42e7-bf76-74b582e0969d\") " Oct 09 13:57:04 crc kubenswrapper[4902]: I1009 13:57:04.368796 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/836dca77-634b-42e7-bf76-74b582e0969d-registry-tls\") pod \"836dca77-634b-42e7-bf76-74b582e0969d\" (UID: \"836dca77-634b-42e7-bf76-74b582e0969d\") " Oct 09 13:57:04 crc kubenswrapper[4902]: I1009 13:57:04.368988 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"836dca77-634b-42e7-bf76-74b582e0969d\" (UID: \"836dca77-634b-42e7-bf76-74b582e0969d\") " Oct 09 13:57:04 crc kubenswrapper[4902]: I1009 13:57:04.369013 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/836dca77-634b-42e7-bf76-74b582e0969d-installation-pull-secrets\") pod \"836dca77-634b-42e7-bf76-74b582e0969d\" (UID: \"836dca77-634b-42e7-bf76-74b582e0969d\") " Oct 09 13:57:04 crc kubenswrapper[4902]: I1009 13:57:04.369056 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/836dca77-634b-42e7-bf76-74b582e0969d-registry-certificates\") pod \"836dca77-634b-42e7-bf76-74b582e0969d\" (UID: \"836dca77-634b-42e7-bf76-74b582e0969d\") " Oct 09 13:57:04 crc kubenswrapper[4902]: I1009 13:57:04.369074 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/836dca77-634b-42e7-bf76-74b582e0969d-bound-sa-token\") pod \"836dca77-634b-42e7-bf76-74b582e0969d\" (UID: \"836dca77-634b-42e7-bf76-74b582e0969d\") " Oct 09 13:57:04 crc kubenswrapper[4902]: I1009 13:57:04.369119 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/836dca77-634b-42e7-bf76-74b582e0969d-trusted-ca\") pod \"836dca77-634b-42e7-bf76-74b582e0969d\" (UID: \"836dca77-634b-42e7-bf76-74b582e0969d\") " Oct 09 13:57:04 crc kubenswrapper[4902]: I1009 13:57:04.369136 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d7mk\" (UniqueName: \"kubernetes.io/projected/836dca77-634b-42e7-bf76-74b582e0969d-kube-api-access-2d7mk\") pod \"836dca77-634b-42e7-bf76-74b582e0969d\" (UID: \"836dca77-634b-42e7-bf76-74b582e0969d\") " Oct 09 13:57:04 crc kubenswrapper[4902]: I1009 13:57:04.370495 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/836dca77-634b-42e7-bf76-74b582e0969d-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "836dca77-634b-42e7-bf76-74b582e0969d" (UID: "836dca77-634b-42e7-bf76-74b582e0969d"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 13:57:04 crc kubenswrapper[4902]: I1009 13:57:04.370629 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/836dca77-634b-42e7-bf76-74b582e0969d-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "836dca77-634b-42e7-bf76-74b582e0969d" (UID: "836dca77-634b-42e7-bf76-74b582e0969d"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 13:57:04 crc kubenswrapper[4902]: I1009 13:57:04.377755 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/836dca77-634b-42e7-bf76-74b582e0969d-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "836dca77-634b-42e7-bf76-74b582e0969d" (UID: "836dca77-634b-42e7-bf76-74b582e0969d"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 13:57:04 crc kubenswrapper[4902]: I1009 13:57:04.378493 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/836dca77-634b-42e7-bf76-74b582e0969d-kube-api-access-2d7mk" (OuterVolumeSpecName: "kube-api-access-2d7mk") pod "836dca77-634b-42e7-bf76-74b582e0969d" (UID: "836dca77-634b-42e7-bf76-74b582e0969d"). InnerVolumeSpecName "kube-api-access-2d7mk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 13:57:04 crc kubenswrapper[4902]: I1009 13:57:04.381631 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/836dca77-634b-42e7-bf76-74b582e0969d-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "836dca77-634b-42e7-bf76-74b582e0969d" (UID: "836dca77-634b-42e7-bf76-74b582e0969d"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 13:57:04 crc kubenswrapper[4902]: I1009 13:57:04.383244 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/836dca77-634b-42e7-bf76-74b582e0969d-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "836dca77-634b-42e7-bf76-74b582e0969d" (UID: "836dca77-634b-42e7-bf76-74b582e0969d"). 
InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 13:57:04 crc kubenswrapper[4902]: I1009 13:57:04.388107 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "836dca77-634b-42e7-bf76-74b582e0969d" (UID: "836dca77-634b-42e7-bf76-74b582e0969d"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Oct 09 13:57:04 crc kubenswrapper[4902]: I1009 13:57:04.393485 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/836dca77-634b-42e7-bf76-74b582e0969d-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "836dca77-634b-42e7-bf76-74b582e0969d" (UID: "836dca77-634b-42e7-bf76-74b582e0969d"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 13:57:04 crc kubenswrapper[4902]: I1009 13:57:04.470469 4902 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/836dca77-634b-42e7-bf76-74b582e0969d-registry-tls\") on node \"crc\" DevicePath \"\"" Oct 09 13:57:04 crc kubenswrapper[4902]: I1009 13:57:04.470540 4902 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/836dca77-634b-42e7-bf76-74b582e0969d-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Oct 09 13:57:04 crc kubenswrapper[4902]: I1009 13:57:04.470573 4902 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/836dca77-634b-42e7-bf76-74b582e0969d-registry-certificates\") on node \"crc\" DevicePath \"\"" Oct 09 13:57:04 crc kubenswrapper[4902]: I1009 13:57:04.470599 4902 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/836dca77-634b-42e7-bf76-74b582e0969d-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 09 13:57:04 crc kubenswrapper[4902]: I1009 13:57:04.470624 4902 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/836dca77-634b-42e7-bf76-74b582e0969d-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 09 13:57:04 crc kubenswrapper[4902]: I1009 13:57:04.470648 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d7mk\" (UniqueName: \"kubernetes.io/projected/836dca77-634b-42e7-bf76-74b582e0969d-kube-api-access-2d7mk\") on node \"crc\" DevicePath \"\"" Oct 09 13:57:04 crc kubenswrapper[4902]: I1009 13:57:04.470672 4902 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/836dca77-634b-42e7-bf76-74b582e0969d-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Oct 09 13:57:04 crc kubenswrapper[4902]: I1009 13:57:04.676064 4902 generic.go:334] "Generic (PLEG): container finished" podID="836dca77-634b-42e7-bf76-74b582e0969d" containerID="259c4bbb6f53b94e8b44f3fa5b3006b85b15a368ee9a05e69c683f524a41c5fd" exitCode=0 Oct 09 13:57:04 crc kubenswrapper[4902]: I1009 13:57:04.676133 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-q76zz" event={"ID":"836dca77-634b-42e7-bf76-74b582e0969d","Type":"ContainerDied","Data":"259c4bbb6f53b94e8b44f3fa5b3006b85b15a368ee9a05e69c683f524a41c5fd"} Oct 09 13:57:04 crc 
kubenswrapper[4902]: I1009 13:57:04.676747 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-q76zz" event={"ID":"836dca77-634b-42e7-bf76-74b582e0969d","Type":"ContainerDied","Data":"dfe079df19624112dd38632188d2c8c87fbf6965ecd217b2e16192e5f2c5777a"} Oct 09 13:57:04 crc kubenswrapper[4902]: I1009 13:57:04.676144 4902 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-q76zz" Oct 09 13:57:04 crc kubenswrapper[4902]: I1009 13:57:04.676805 4902 scope.go:117] "RemoveContainer" containerID="259c4bbb6f53b94e8b44f3fa5b3006b85b15a368ee9a05e69c683f524a41c5fd" Oct 09 13:57:04 crc kubenswrapper[4902]: I1009 13:57:04.697256 4902 scope.go:117] "RemoveContainer" containerID="259c4bbb6f53b94e8b44f3fa5b3006b85b15a368ee9a05e69c683f524a41c5fd" Oct 09 13:57:04 crc kubenswrapper[4902]: E1009 13:57:04.697689 4902 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"259c4bbb6f53b94e8b44f3fa5b3006b85b15a368ee9a05e69c683f524a41c5fd\": container with ID starting with 259c4bbb6f53b94e8b44f3fa5b3006b85b15a368ee9a05e69c683f524a41c5fd not found: ID does not exist" containerID="259c4bbb6f53b94e8b44f3fa5b3006b85b15a368ee9a05e69c683f524a41c5fd" Oct 09 13:57:04 crc kubenswrapper[4902]: I1009 13:57:04.697729 4902 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"259c4bbb6f53b94e8b44f3fa5b3006b85b15a368ee9a05e69c683f524a41c5fd"} err="failed to get container status \"259c4bbb6f53b94e8b44f3fa5b3006b85b15a368ee9a05e69c683f524a41c5fd\": rpc error: code = NotFound desc = could not find container \"259c4bbb6f53b94e8b44f3fa5b3006b85b15a368ee9a05e69c683f524a41c5fd\": container with ID starting with 259c4bbb6f53b94e8b44f3fa5b3006b85b15a368ee9a05e69c683f524a41c5fd not found: ID does not exist" Oct 09 13:57:04 crc kubenswrapper[4902]: I1009 13:57:04.722587 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-q76zz"] Oct 09 13:57:04 crc kubenswrapper[4902]: I1009 13:57:04.729429 4902 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-q76zz"] Oct 09 13:57:05 crc kubenswrapper[4902]: I1009 13:57:05.522505 4902 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="836dca77-634b-42e7-bf76-74b582e0969d" path="/var/lib/kubelet/pods/836dca77-634b-42e7-bf76-74b582e0969d/volumes" Oct 09 13:57:20 crc kubenswrapper[4902]: I1009 13:57:20.078296 4902 patch_prober.go:28] interesting pod/machine-config-daemon-gbt7s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 13:57:20 crc kubenswrapper[4902]: I1009 13:57:20.078884 4902 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" podUID="6cfbac91-e798-4e5e-9f3c-f454ea6f457e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 13:57:20 crc kubenswrapper[4902]: I1009 13:57:20.078943 4902 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" Oct 09 13:57:20 crc kubenswrapper[4902]: I1009 
13:57:20.079652 4902 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f09f58240f5e4802db6796459ff40ef0e937c5566e9e69ea5030c30651138876"} pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 09 13:57:20 crc kubenswrapper[4902]: I1009 13:57:20.079720 4902 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" podUID="6cfbac91-e798-4e5e-9f3c-f454ea6f457e" containerName="machine-config-daemon" containerID="cri-o://f09f58240f5e4802db6796459ff40ef0e937c5566e9e69ea5030c30651138876" gracePeriod=600 Oct 09 13:57:20 crc kubenswrapper[4902]: I1009 13:57:20.771154 4902 generic.go:334] "Generic (PLEG): container finished" podID="6cfbac91-e798-4e5e-9f3c-f454ea6f457e" containerID="f09f58240f5e4802db6796459ff40ef0e937c5566e9e69ea5030c30651138876" exitCode=0 Oct 09 13:57:20 crc kubenswrapper[4902]: I1009 13:57:20.771251 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" event={"ID":"6cfbac91-e798-4e5e-9f3c-f454ea6f457e","Type":"ContainerDied","Data":"f09f58240f5e4802db6796459ff40ef0e937c5566e9e69ea5030c30651138876"} Oct 09 13:57:20 crc kubenswrapper[4902]: I1009 13:57:20.771607 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" event={"ID":"6cfbac91-e798-4e5e-9f3c-f454ea6f457e","Type":"ContainerStarted","Data":"f04c452240506df7a71fcb78dd8a43d0fd5718ad3d38cb3de0f83c0e40d74e5b"} Oct 09 13:57:20 crc kubenswrapper[4902]: I1009 13:57:20.771639 4902 scope.go:117] "RemoveContainer" containerID="25cb721737318b049ef3fbc91c7fb9450b978d343bb8bb36c3e0257909b5b962" Oct 09 13:57:53 crc kubenswrapper[4902]: I1009 13:57:53.652257 4902 scope.go:117] "RemoveContainer" containerID="d669232d4275d19556a2dca2810b022f4f10f6af157feebfac38bc0051c3c8f4" Oct 09 13:58:53 crc kubenswrapper[4902]: I1009 13:58:53.696834 4902 scope.go:117] "RemoveContainer" containerID="f9fbbf38d27d0ac740520065c7546a4608bb4822b3895f01847b6503d86504c0" Oct 09 13:58:53 crc kubenswrapper[4902]: I1009 13:58:53.716864 4902 scope.go:117] "RemoveContainer" containerID="398ccb668b391ae1688821308e4ff4b2bd0d8e2eede3e49a7e51b45870c2c287" Oct 09 13:59:20 crc kubenswrapper[4902]: I1009 13:59:20.078177 4902 patch_prober.go:28] interesting pod/machine-config-daemon-gbt7s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 13:59:20 crc kubenswrapper[4902]: I1009 13:59:20.078680 4902 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" podUID="6cfbac91-e798-4e5e-9f3c-f454ea6f457e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 13:59:44 crc kubenswrapper[4902]: I1009 13:59:44.322059 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-xklfb"] Oct 09 13:59:44 crc kubenswrapper[4902]: E1009 13:59:44.322880 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="836dca77-634b-42e7-bf76-74b582e0969d" containerName="registry" Oct 09 
13:59:44 crc kubenswrapper[4902]: I1009 13:59:44.322897 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="836dca77-634b-42e7-bf76-74b582e0969d" containerName="registry" Oct 09 13:59:44 crc kubenswrapper[4902]: I1009 13:59:44.323022 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="836dca77-634b-42e7-bf76-74b582e0969d" containerName="registry" Oct 09 13:59:44 crc kubenswrapper[4902]: I1009 13:59:44.323485 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-xklfb" Oct 09 13:59:44 crc kubenswrapper[4902]: I1009 13:59:44.325851 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Oct 09 13:59:44 crc kubenswrapper[4902]: I1009 13:59:44.326058 4902 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-bbm57" Oct 09 13:59:44 crc kubenswrapper[4902]: I1009 13:59:44.326228 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Oct 09 13:59:44 crc kubenswrapper[4902]: I1009 13:59:44.327340 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-5b446d88c5-jhttd"] Oct 09 13:59:44 crc kubenswrapper[4902]: I1009 13:59:44.328189 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-jhttd" Oct 09 13:59:44 crc kubenswrapper[4902]: I1009 13:59:44.330712 4902 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-zxs2f" Oct 09 13:59:44 crc kubenswrapper[4902]: I1009 13:59:44.335753 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-xklfb"] Oct 09 13:59:44 crc kubenswrapper[4902]: I1009 13:59:44.342385 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-jhttd"] Oct 09 13:59:44 crc kubenswrapper[4902]: I1009 13:59:44.351253 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-mn6px"] Oct 09 13:59:44 crc kubenswrapper[4902]: I1009 13:59:44.353688 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-mn6px" Oct 09 13:59:44 crc kubenswrapper[4902]: I1009 13:59:44.360998 4902 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-7gskr" Oct 09 13:59:44 crc kubenswrapper[4902]: I1009 13:59:44.374142 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-mn6px"] Oct 09 13:59:44 crc kubenswrapper[4902]: I1009 13:59:44.455791 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lbs9w\" (UniqueName: \"kubernetes.io/projected/85edb63a-b99a-48b7-bdf7-285b37466b22-kube-api-access-lbs9w\") pod \"cert-manager-cainjector-7f985d654d-xklfb\" (UID: \"85edb63a-b99a-48b7-bdf7-285b37466b22\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-xklfb" Oct 09 13:59:44 crc kubenswrapper[4902]: I1009 13:59:44.455887 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mk9nw\" (UniqueName: \"kubernetes.io/projected/425830e3-71c9-4b86-86d3-3f49d61b6cab-kube-api-access-mk9nw\") pod \"cert-manager-webhook-5655c58dd6-mn6px\" (UID: \"425830e3-71c9-4b86-86d3-3f49d61b6cab\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-mn6px" Oct 09 13:59:44 crc kubenswrapper[4902]: I1009 13:59:44.455974 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zxk9m\" (UniqueName: \"kubernetes.io/projected/8ae8ae73-6077-47a8-b43e-e91ab13101e6-kube-api-access-zxk9m\") pod \"cert-manager-5b446d88c5-jhttd\" (UID: \"8ae8ae73-6077-47a8-b43e-e91ab13101e6\") " pod="cert-manager/cert-manager-5b446d88c5-jhttd" Oct 09 13:59:44 crc kubenswrapper[4902]: I1009 13:59:44.557399 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zxk9m\" (UniqueName: \"kubernetes.io/projected/8ae8ae73-6077-47a8-b43e-e91ab13101e6-kube-api-access-zxk9m\") pod \"cert-manager-5b446d88c5-jhttd\" (UID: \"8ae8ae73-6077-47a8-b43e-e91ab13101e6\") " pod="cert-manager/cert-manager-5b446d88c5-jhttd" Oct 09 13:59:44 crc kubenswrapper[4902]: I1009 13:59:44.557621 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lbs9w\" (UniqueName: \"kubernetes.io/projected/85edb63a-b99a-48b7-bdf7-285b37466b22-kube-api-access-lbs9w\") pod \"cert-manager-cainjector-7f985d654d-xklfb\" (UID: \"85edb63a-b99a-48b7-bdf7-285b37466b22\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-xklfb" Oct 09 13:59:44 crc kubenswrapper[4902]: I1009 13:59:44.557668 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mk9nw\" (UniqueName: \"kubernetes.io/projected/425830e3-71c9-4b86-86d3-3f49d61b6cab-kube-api-access-mk9nw\") pod \"cert-manager-webhook-5655c58dd6-mn6px\" (UID: \"425830e3-71c9-4b86-86d3-3f49d61b6cab\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-mn6px" Oct 09 13:59:44 crc kubenswrapper[4902]: I1009 13:59:44.581640 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mk9nw\" (UniqueName: \"kubernetes.io/projected/425830e3-71c9-4b86-86d3-3f49d61b6cab-kube-api-access-mk9nw\") pod \"cert-manager-webhook-5655c58dd6-mn6px\" (UID: \"425830e3-71c9-4b86-86d3-3f49d61b6cab\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-mn6px" Oct 09 13:59:44 crc kubenswrapper[4902]: I1009 13:59:44.584714 4902 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-zxk9m\" (UniqueName: \"kubernetes.io/projected/8ae8ae73-6077-47a8-b43e-e91ab13101e6-kube-api-access-zxk9m\") pod \"cert-manager-5b446d88c5-jhttd\" (UID: \"8ae8ae73-6077-47a8-b43e-e91ab13101e6\") " pod="cert-manager/cert-manager-5b446d88c5-jhttd" Oct 09 13:59:44 crc kubenswrapper[4902]: I1009 13:59:44.585278 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lbs9w\" (UniqueName: \"kubernetes.io/projected/85edb63a-b99a-48b7-bdf7-285b37466b22-kube-api-access-lbs9w\") pod \"cert-manager-cainjector-7f985d654d-xklfb\" (UID: \"85edb63a-b99a-48b7-bdf7-285b37466b22\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-xklfb" Oct 09 13:59:44 crc kubenswrapper[4902]: I1009 13:59:44.652060 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-xklfb" Oct 09 13:59:44 crc kubenswrapper[4902]: I1009 13:59:44.666672 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-jhttd" Oct 09 13:59:44 crc kubenswrapper[4902]: I1009 13:59:44.676090 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-mn6px" Oct 09 13:59:44 crc kubenswrapper[4902]: I1009 13:59:44.880521 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-xklfb"] Oct 09 13:59:44 crc kubenswrapper[4902]: W1009 13:59:44.888646 4902 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod85edb63a_b99a_48b7_bdf7_285b37466b22.slice/crio-3449ce0dfc47d9028bc7c41d5a823617dfe28e2e24663a3e5ccdc6a0155b98e3 WatchSource:0}: Error finding container 3449ce0dfc47d9028bc7c41d5a823617dfe28e2e24663a3e5ccdc6a0155b98e3: Status 404 returned error can't find the container with id 3449ce0dfc47d9028bc7c41d5a823617dfe28e2e24663a3e5ccdc6a0155b98e3 Oct 09 13:59:44 crc kubenswrapper[4902]: I1009 13:59:44.897287 4902 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 09 13:59:45 crc kubenswrapper[4902]: I1009 13:59:45.125242 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-jhttd"] Oct 09 13:59:45 crc kubenswrapper[4902]: I1009 13:59:45.129773 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-mn6px"] Oct 09 13:59:45 crc kubenswrapper[4902]: W1009 13:59:45.129998 4902 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod425830e3_71c9_4b86_86d3_3f49d61b6cab.slice/crio-71afe4539ce501bbc989c35d9e41a98971b181f57765b9da86dbe460520c28b6 WatchSource:0}: Error finding container 71afe4539ce501bbc989c35d9e41a98971b181f57765b9da86dbe460520c28b6: Status 404 returned error can't find the container with id 71afe4539ce501bbc989c35d9e41a98971b181f57765b9da86dbe460520c28b6 Oct 09 13:59:45 crc kubenswrapper[4902]: W1009 13:59:45.133272 4902 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8ae8ae73_6077_47a8_b43e_e91ab13101e6.slice/crio-5a0eb710a5d3a5f8bd02772060c61b43c4eb6a3c38a792d9cb7120b64e63837c WatchSource:0}: Error finding container 5a0eb710a5d3a5f8bd02772060c61b43c4eb6a3c38a792d9cb7120b64e63837c: Status 404 returned error can't find the container with id 
5a0eb710a5d3a5f8bd02772060c61b43c4eb6a3c38a792d9cb7120b64e63837c Oct 09 13:59:45 crc kubenswrapper[4902]: I1009 13:59:45.622244 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-xklfb" event={"ID":"85edb63a-b99a-48b7-bdf7-285b37466b22","Type":"ContainerStarted","Data":"3449ce0dfc47d9028bc7c41d5a823617dfe28e2e24663a3e5ccdc6a0155b98e3"} Oct 09 13:59:45 crc kubenswrapper[4902]: I1009 13:59:45.623998 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-jhttd" event={"ID":"8ae8ae73-6077-47a8-b43e-e91ab13101e6","Type":"ContainerStarted","Data":"5a0eb710a5d3a5f8bd02772060c61b43c4eb6a3c38a792d9cb7120b64e63837c"} Oct 09 13:59:45 crc kubenswrapper[4902]: I1009 13:59:45.625167 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-mn6px" event={"ID":"425830e3-71c9-4b86-86d3-3f49d61b6cab","Type":"ContainerStarted","Data":"71afe4539ce501bbc989c35d9e41a98971b181f57765b9da86dbe460520c28b6"} Oct 09 13:59:47 crc kubenswrapper[4902]: I1009 13:59:47.636421 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-xklfb" event={"ID":"85edb63a-b99a-48b7-bdf7-285b37466b22","Type":"ContainerStarted","Data":"f7451a9011fbc290d0e1e713f525072f99dcd05db0a969a373b0fe8c08358d5e"} Oct 09 13:59:47 crc kubenswrapper[4902]: I1009 13:59:47.655461 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-7f985d654d-xklfb" podStartSLOduration=1.773186521 podStartE2EDuration="3.655444716s" podCreationTimestamp="2025-10-09 13:59:44 +0000 UTC" firstStartedPulling="2025-10-09 13:59:44.897030448 +0000 UTC m=+532.094889502" lastFinishedPulling="2025-10-09 13:59:46.779288623 +0000 UTC m=+533.977147697" observedRunningTime="2025-10-09 13:59:47.653779838 +0000 UTC m=+534.851638942" watchObservedRunningTime="2025-10-09 13:59:47.655444716 +0000 UTC m=+534.853303780" Oct 09 13:59:48 crc kubenswrapper[4902]: I1009 13:59:48.645825 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-mn6px" event={"ID":"425830e3-71c9-4b86-86d3-3f49d61b6cab","Type":"ContainerStarted","Data":"12eb9c468a84888ca173e66cf6aeef75443b3ae22944f78cbb59559f557126e3"} Oct 09 13:59:48 crc kubenswrapper[4902]: I1009 13:59:48.645892 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-5655c58dd6-mn6px" Oct 09 13:59:48 crc kubenswrapper[4902]: I1009 13:59:48.648566 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-jhttd" event={"ID":"8ae8ae73-6077-47a8-b43e-e91ab13101e6","Type":"ContainerStarted","Data":"69721627f9b8c42610dc8dbc200536949f7a800f9c998a80e10cd0b9e6c53f7a"} Oct 09 13:59:48 crc kubenswrapper[4902]: I1009 13:59:48.661201 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-5655c58dd6-mn6px" podStartSLOduration=1.412889023 podStartE2EDuration="4.661136635s" podCreationTimestamp="2025-10-09 13:59:44 +0000 UTC" firstStartedPulling="2025-10-09 13:59:45.132136812 +0000 UTC m=+532.329995876" lastFinishedPulling="2025-10-09 13:59:48.380384424 +0000 UTC m=+535.578243488" observedRunningTime="2025-10-09 13:59:48.660490667 +0000 UTC m=+535.858349751" watchObservedRunningTime="2025-10-09 13:59:48.661136635 +0000 UTC m=+535.858995699" Oct 09 13:59:50 crc kubenswrapper[4902]: I1009 13:59:50.078747 4902 
patch_prober.go:28] interesting pod/machine-config-daemon-gbt7s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 13:59:50 crc kubenswrapper[4902]: I1009 13:59:50.079075 4902 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" podUID="6cfbac91-e798-4e5e-9f3c-f454ea6f457e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 13:59:54 crc kubenswrapper[4902]: I1009 13:59:54.676641 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-5b446d88c5-jhttd" podStartSLOduration=7.376086 podStartE2EDuration="10.67661249s" podCreationTimestamp="2025-10-09 13:59:44 +0000 UTC" firstStartedPulling="2025-10-09 13:59:45.136314963 +0000 UTC m=+532.334174027" lastFinishedPulling="2025-10-09 13:59:48.436841453 +0000 UTC m=+535.634700517" observedRunningTime="2025-10-09 13:59:48.683667446 +0000 UTC m=+535.881526510" watchObservedRunningTime="2025-10-09 13:59:54.67661249 +0000 UTC m=+541.874471584" Oct 09 13:59:54 crc kubenswrapper[4902]: I1009 13:59:54.679356 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-5655c58dd6-mn6px" Oct 09 13:59:54 crc kubenswrapper[4902]: I1009 13:59:54.682087 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-jh6wc"] Oct 09 13:59:54 crc kubenswrapper[4902]: I1009 13:59:54.682848 4902 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-jh6wc" podUID="4904c756-7ed4-4719-860f-c6f6458d002c" containerName="ovn-controller" containerID="cri-o://e7dceac177fd0f1f23655a025c88a72848f277f404a1ac7ffce5c8dbdc922de1" gracePeriod=30 Oct 09 13:59:54 crc kubenswrapper[4902]: I1009 13:59:54.682872 4902 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-jh6wc" podUID="4904c756-7ed4-4719-860f-c6f6458d002c" containerName="northd" containerID="cri-o://331340a1821401cb77b1cb646d09fbe6e4a533e124150119def24d4b49b43548" gracePeriod=30 Oct 09 13:59:54 crc kubenswrapper[4902]: I1009 13:59:54.683054 4902 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-jh6wc" podUID="4904c756-7ed4-4719-860f-c6f6458d002c" containerName="sbdb" containerID="cri-o://d3246e7d90602adff59c9ed6d6697c6dcc044b4d6f83f510f3b7271aa3ef4f18" gracePeriod=30 Oct 09 13:59:54 crc kubenswrapper[4902]: I1009 13:59:54.683129 4902 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-jh6wc" podUID="4904c756-7ed4-4719-860f-c6f6458d002c" containerName="nbdb" containerID="cri-o://05742a0a14e857f7e8a36d15f52011fc9a003130fec36d8aa5fff760e195baef" gracePeriod=30 Oct 09 13:59:54 crc kubenswrapper[4902]: I1009 13:59:54.683227 4902 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-jh6wc" podUID="4904c756-7ed4-4719-860f-c6f6458d002c" containerName="kube-rbac-proxy-node" containerID="cri-o://496d93d979f54ca0f058a4b462e45adda124206a2f9fd0a72a86953ac090e8e4" gracePeriod=30 Oct 09 13:59:54 crc kubenswrapper[4902]: I1009 13:59:54.683333 4902 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-jh6wc" podUID="4904c756-7ed4-4719-860f-c6f6458d002c" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://37a6498689268b5f29b030534ff3a670c9c396c8eaf55d5cb578c88de813cad0" gracePeriod=30 Oct 09 13:59:54 crc kubenswrapper[4902]: I1009 13:59:54.683446 4902 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-jh6wc" podUID="4904c756-7ed4-4719-860f-c6f6458d002c" containerName="ovn-acl-logging" containerID="cri-o://c84c0afc9b8d3feb6cc4f449be2123c3de22a1c354c27ada7f8d6989ddff448c" gracePeriod=30 Oct 09 13:59:54 crc kubenswrapper[4902]: I1009 13:59:54.732956 4902 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-jh6wc" podUID="4904c756-7ed4-4719-860f-c6f6458d002c" containerName="ovnkube-controller" containerID="cri-o://ce98c51532e074824b12143ed31d35add4f88838720d8f3c29c10fb3ab78623e" gracePeriod=30 Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.063956 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jh6wc_4904c756-7ed4-4719-860f-c6f6458d002c/ovn-acl-logging/0.log" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.066287 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jh6wc_4904c756-7ed4-4719-860f-c6f6458d002c/ovn-controller/0.log" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.066957 4902 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-jh6wc" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.126050 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-8lmns"] Oct 09 13:59:55 crc kubenswrapper[4902]: E1009 13:59:55.126272 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4904c756-7ed4-4719-860f-c6f6458d002c" containerName="kubecfg-setup" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.126288 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="4904c756-7ed4-4719-860f-c6f6458d002c" containerName="kubecfg-setup" Oct 09 13:59:55 crc kubenswrapper[4902]: E1009 13:59:55.126297 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4904c756-7ed4-4719-860f-c6f6458d002c" containerName="nbdb" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.126304 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="4904c756-7ed4-4719-860f-c6f6458d002c" containerName="nbdb" Oct 09 13:59:55 crc kubenswrapper[4902]: E1009 13:59:55.126324 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4904c756-7ed4-4719-860f-c6f6458d002c" containerName="sbdb" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.126330 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="4904c756-7ed4-4719-860f-c6f6458d002c" containerName="sbdb" Oct 09 13:59:55 crc kubenswrapper[4902]: E1009 13:59:55.126340 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4904c756-7ed4-4719-860f-c6f6458d002c" containerName="northd" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.126345 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="4904c756-7ed4-4719-860f-c6f6458d002c" containerName="northd" Oct 09 13:59:55 crc kubenswrapper[4902]: E1009 13:59:55.126354 4902 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="4904c756-7ed4-4719-860f-c6f6458d002c" containerName="ovn-controller" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.126360 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="4904c756-7ed4-4719-860f-c6f6458d002c" containerName="ovn-controller" Oct 09 13:59:55 crc kubenswrapper[4902]: E1009 13:59:55.126368 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4904c756-7ed4-4719-860f-c6f6458d002c" containerName="ovn-acl-logging" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.126375 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="4904c756-7ed4-4719-860f-c6f6458d002c" containerName="ovn-acl-logging" Oct 09 13:59:55 crc kubenswrapper[4902]: E1009 13:59:55.126383 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4904c756-7ed4-4719-860f-c6f6458d002c" containerName="kube-rbac-proxy-ovn-metrics" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.126421 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="4904c756-7ed4-4719-860f-c6f6458d002c" containerName="kube-rbac-proxy-ovn-metrics" Oct 09 13:59:55 crc kubenswrapper[4902]: E1009 13:59:55.126430 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4904c756-7ed4-4719-860f-c6f6458d002c" containerName="kube-rbac-proxy-node" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.126436 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="4904c756-7ed4-4719-860f-c6f6458d002c" containerName="kube-rbac-proxy-node" Oct 09 13:59:55 crc kubenswrapper[4902]: E1009 13:59:55.126445 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4904c756-7ed4-4719-860f-c6f6458d002c" containerName="ovnkube-controller" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.126451 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="4904c756-7ed4-4719-860f-c6f6458d002c" containerName="ovnkube-controller" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.126550 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="4904c756-7ed4-4719-860f-c6f6458d002c" containerName="kube-rbac-proxy-node" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.126563 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="4904c756-7ed4-4719-860f-c6f6458d002c" containerName="ovn-controller" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.126573 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="4904c756-7ed4-4719-860f-c6f6458d002c" containerName="northd" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.126581 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="4904c756-7ed4-4719-860f-c6f6458d002c" containerName="nbdb" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.126587 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="4904c756-7ed4-4719-860f-c6f6458d002c" containerName="ovnkube-controller" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.126596 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="4904c756-7ed4-4719-860f-c6f6458d002c" containerName="sbdb" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.126604 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="4904c756-7ed4-4719-860f-c6f6458d002c" containerName="ovn-acl-logging" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.126611 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="4904c756-7ed4-4719-860f-c6f6458d002c" containerName="kube-rbac-proxy-ovn-metrics" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 
13:59:55.128636 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-8lmns" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.210025 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4904c756-7ed4-4719-860f-c6f6458d002c-host-cni-bin\") pod \"4904c756-7ed4-4719-860f-c6f6458d002c\" (UID: \"4904c756-7ed4-4719-860f-c6f6458d002c\") " Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.210120 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4904c756-7ed4-4719-860f-c6f6458d002c-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "4904c756-7ed4-4719-860f-c6f6458d002c" (UID: "4904c756-7ed4-4719-860f-c6f6458d002c"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.210846 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4904c756-7ed4-4719-860f-c6f6458d002c-log-socket\") pod \"4904c756-7ed4-4719-860f-c6f6458d002c\" (UID: \"4904c756-7ed4-4719-860f-c6f6458d002c\") " Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.210941 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4904c756-7ed4-4719-860f-c6f6458d002c-log-socket" (OuterVolumeSpecName: "log-socket") pod "4904c756-7ed4-4719-860f-c6f6458d002c" (UID: "4904c756-7ed4-4719-860f-c6f6458d002c"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.210978 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ghml4\" (UniqueName: \"kubernetes.io/projected/4904c756-7ed4-4719-860f-c6f6458d002c-kube-api-access-ghml4\") pod \"4904c756-7ed4-4719-860f-c6f6458d002c\" (UID: \"4904c756-7ed4-4719-860f-c6f6458d002c\") " Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.211019 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4904c756-7ed4-4719-860f-c6f6458d002c-ovn-node-metrics-cert\") pod \"4904c756-7ed4-4719-860f-c6f6458d002c\" (UID: \"4904c756-7ed4-4719-860f-c6f6458d002c\") " Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.211046 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4904c756-7ed4-4719-860f-c6f6458d002c-systemd-units\") pod \"4904c756-7ed4-4719-860f-c6f6458d002c\" (UID: \"4904c756-7ed4-4719-860f-c6f6458d002c\") " Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.211082 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4904c756-7ed4-4719-860f-c6f6458d002c-host-var-lib-cni-networks-ovn-kubernetes\") pod \"4904c756-7ed4-4719-860f-c6f6458d002c\" (UID: \"4904c756-7ed4-4719-860f-c6f6458d002c\") " Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.211106 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4904c756-7ed4-4719-860f-c6f6458d002c-host-kubelet\") pod \"4904c756-7ed4-4719-860f-c6f6458d002c\" (UID: \"4904c756-7ed4-4719-860f-c6f6458d002c\") " Oct 09 
13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.211136 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4904c756-7ed4-4719-860f-c6f6458d002c-run-ovn\") pod \"4904c756-7ed4-4719-860f-c6f6458d002c\" (UID: \"4904c756-7ed4-4719-860f-c6f6458d002c\") " Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.211160 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4904c756-7ed4-4719-860f-c6f6458d002c-var-lib-openvswitch\") pod \"4904c756-7ed4-4719-860f-c6f6458d002c\" (UID: \"4904c756-7ed4-4719-860f-c6f6458d002c\") " Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.211195 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4904c756-7ed4-4719-860f-c6f6458d002c-ovnkube-script-lib\") pod \"4904c756-7ed4-4719-860f-c6f6458d002c\" (UID: \"4904c756-7ed4-4719-860f-c6f6458d002c\") " Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.211214 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4904c756-7ed4-4719-860f-c6f6458d002c-host-run-ovn-kubernetes\") pod \"4904c756-7ed4-4719-860f-c6f6458d002c\" (UID: \"4904c756-7ed4-4719-860f-c6f6458d002c\") " Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.211234 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4904c756-7ed4-4719-860f-c6f6458d002c-host-slash\") pod \"4904c756-7ed4-4719-860f-c6f6458d002c\" (UID: \"4904c756-7ed4-4719-860f-c6f6458d002c\") " Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.211263 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4904c756-7ed4-4719-860f-c6f6458d002c-node-log\") pod \"4904c756-7ed4-4719-860f-c6f6458d002c\" (UID: \"4904c756-7ed4-4719-860f-c6f6458d002c\") " Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.211303 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4904c756-7ed4-4719-860f-c6f6458d002c-run-systemd\") pod \"4904c756-7ed4-4719-860f-c6f6458d002c\" (UID: \"4904c756-7ed4-4719-860f-c6f6458d002c\") " Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.211301 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4904c756-7ed4-4719-860f-c6f6458d002c-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "4904c756-7ed4-4719-860f-c6f6458d002c" (UID: "4904c756-7ed4-4719-860f-c6f6458d002c"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.211351 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4904c756-7ed4-4719-860f-c6f6458d002c-run-openvswitch\") pod \"4904c756-7ed4-4719-860f-c6f6458d002c\" (UID: \"4904c756-7ed4-4719-860f-c6f6458d002c\") " Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.211384 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4904c756-7ed4-4719-860f-c6f6458d002c-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "4904c756-7ed4-4719-860f-c6f6458d002c" (UID: "4904c756-7ed4-4719-860f-c6f6458d002c"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.211465 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4904c756-7ed4-4719-860f-c6f6458d002c-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "4904c756-7ed4-4719-860f-c6f6458d002c" (UID: "4904c756-7ed4-4719-860f-c6f6458d002c"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.211488 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4904c756-7ed4-4719-860f-c6f6458d002c-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "4904c756-7ed4-4719-860f-c6f6458d002c" (UID: "4904c756-7ed4-4719-860f-c6f6458d002c"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.211546 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4904c756-7ed4-4719-860f-c6f6458d002c-host-slash" (OuterVolumeSpecName: "host-slash") pod "4904c756-7ed4-4719-860f-c6f6458d002c" (UID: "4904c756-7ed4-4719-860f-c6f6458d002c"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.211518 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4904c756-7ed4-4719-860f-c6f6458d002c-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "4904c756-7ed4-4719-860f-c6f6458d002c" (UID: "4904c756-7ed4-4719-860f-c6f6458d002c"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.211579 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4904c756-7ed4-4719-860f-c6f6458d002c-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "4904c756-7ed4-4719-860f-c6f6458d002c" (UID: "4904c756-7ed4-4719-860f-c6f6458d002c"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.211611 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4904c756-7ed4-4719-860f-c6f6458d002c-node-log" (OuterVolumeSpecName: "node-log") pod "4904c756-7ed4-4719-860f-c6f6458d002c" (UID: "4904c756-7ed4-4719-860f-c6f6458d002c"). InnerVolumeSpecName "node-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.211519 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4904c756-7ed4-4719-860f-c6f6458d002c-env-overrides\") pod \"4904c756-7ed4-4719-860f-c6f6458d002c\" (UID: \"4904c756-7ed4-4719-860f-c6f6458d002c\") " Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.211605 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4904c756-7ed4-4719-860f-c6f6458d002c-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "4904c756-7ed4-4719-860f-c6f6458d002c" (UID: "4904c756-7ed4-4719-860f-c6f6458d002c"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.211704 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4904c756-7ed4-4719-860f-c6f6458d002c-ovnkube-config\") pod \"4904c756-7ed4-4719-860f-c6f6458d002c\" (UID: \"4904c756-7ed4-4719-860f-c6f6458d002c\") " Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.211725 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4904c756-7ed4-4719-860f-c6f6458d002c-host-cni-netd\") pod \"4904c756-7ed4-4719-860f-c6f6458d002c\" (UID: \"4904c756-7ed4-4719-860f-c6f6458d002c\") " Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.211746 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4904c756-7ed4-4719-860f-c6f6458d002c-host-run-netns\") pod \"4904c756-7ed4-4719-860f-c6f6458d002c\" (UID: \"4904c756-7ed4-4719-860f-c6f6458d002c\") " Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.211766 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4904c756-7ed4-4719-860f-c6f6458d002c-etc-openvswitch\") pod \"4904c756-7ed4-4719-860f-c6f6458d002c\" (UID: \"4904c756-7ed4-4719-860f-c6f6458d002c\") " Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.211802 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4904c756-7ed4-4719-860f-c6f6458d002c-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "4904c756-7ed4-4719-860f-c6f6458d002c" (UID: "4904c756-7ed4-4719-860f-c6f6458d002c"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.211893 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4904c756-7ed4-4719-860f-c6f6458d002c-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "4904c756-7ed4-4719-860f-c6f6458d002c" (UID: "4904c756-7ed4-4719-860f-c6f6458d002c"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.211995 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4904c756-7ed4-4719-860f-c6f6458d002c-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "4904c756-7ed4-4719-860f-c6f6458d002c" (UID: "4904c756-7ed4-4719-860f-c6f6458d002c"). InnerVolumeSpecName "etc-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.212051 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/89d08e85-df18-4c18-9816-986e948d319c-host-kubelet\") pod \"ovnkube-node-8lmns\" (UID: \"89d08e85-df18-4c18-9816-986e948d319c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8lmns" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.212144 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/89d08e85-df18-4c18-9816-986e948d319c-etc-openvswitch\") pod \"ovnkube-node-8lmns\" (UID: \"89d08e85-df18-4c18-9816-986e948d319c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8lmns" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.212249 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/89d08e85-df18-4c18-9816-986e948d319c-run-systemd\") pod \"ovnkube-node-8lmns\" (UID: \"89d08e85-df18-4c18-9816-986e948d319c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8lmns" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.212284 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/89d08e85-df18-4c18-9816-986e948d319c-host-run-ovn-kubernetes\") pod \"ovnkube-node-8lmns\" (UID: \"89d08e85-df18-4c18-9816-986e948d319c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8lmns" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.212311 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4904c756-7ed4-4719-860f-c6f6458d002c-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "4904c756-7ed4-4719-860f-c6f6458d002c" (UID: "4904c756-7ed4-4719-860f-c6f6458d002c"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.212344 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/89d08e85-df18-4c18-9816-986e948d319c-ovnkube-config\") pod \"ovnkube-node-8lmns\" (UID: \"89d08e85-df18-4c18-9816-986e948d319c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8lmns" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.212484 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4904c756-7ed4-4719-860f-c6f6458d002c-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "4904c756-7ed4-4719-860f-c6f6458d002c" (UID: "4904c756-7ed4-4719-860f-c6f6458d002c"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.212504 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/89d08e85-df18-4c18-9816-986e948d319c-var-lib-openvswitch\") pod \"ovnkube-node-8lmns\" (UID: \"89d08e85-df18-4c18-9816-986e948d319c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8lmns" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.212521 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4904c756-7ed4-4719-860f-c6f6458d002c-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "4904c756-7ed4-4719-860f-c6f6458d002c" (UID: "4904c756-7ed4-4719-860f-c6f6458d002c"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.212603 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/89d08e85-df18-4c18-9816-986e948d319c-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-8lmns\" (UID: \"89d08e85-df18-4c18-9816-986e948d319c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8lmns" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.212676 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qnz97\" (UniqueName: \"kubernetes.io/projected/89d08e85-df18-4c18-9816-986e948d319c-kube-api-access-qnz97\") pod \"ovnkube-node-8lmns\" (UID: \"89d08e85-df18-4c18-9816-986e948d319c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8lmns" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.212706 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/89d08e85-df18-4c18-9816-986e948d319c-env-overrides\") pod \"ovnkube-node-8lmns\" (UID: \"89d08e85-df18-4c18-9816-986e948d319c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8lmns" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.212740 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/89d08e85-df18-4c18-9816-986e948d319c-run-openvswitch\") pod \"ovnkube-node-8lmns\" (UID: \"89d08e85-df18-4c18-9816-986e948d319c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8lmns" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.212846 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/89d08e85-df18-4c18-9816-986e948d319c-run-ovn\") pod \"ovnkube-node-8lmns\" (UID: \"89d08e85-df18-4c18-9816-986e948d319c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8lmns" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.212896 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/89d08e85-df18-4c18-9816-986e948d319c-host-run-netns\") pod \"ovnkube-node-8lmns\" (UID: \"89d08e85-df18-4c18-9816-986e948d319c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8lmns" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.212936 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/89d08e85-df18-4c18-9816-986e948d319c-host-cni-netd\") pod \"ovnkube-node-8lmns\" (UID: \"89d08e85-df18-4c18-9816-986e948d319c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8lmns" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.212958 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/89d08e85-df18-4c18-9816-986e948d319c-host-cni-bin\") pod \"ovnkube-node-8lmns\" (UID: \"89d08e85-df18-4c18-9816-986e948d319c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8lmns" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.212989 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/89d08e85-df18-4c18-9816-986e948d319c-systemd-units\") pod \"ovnkube-node-8lmns\" (UID: \"89d08e85-df18-4c18-9816-986e948d319c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8lmns" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.213011 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/89d08e85-df18-4c18-9816-986e948d319c-host-slash\") pod \"ovnkube-node-8lmns\" (UID: \"89d08e85-df18-4c18-9816-986e948d319c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8lmns" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.213043 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/89d08e85-df18-4c18-9816-986e948d319c-node-log\") pod \"ovnkube-node-8lmns\" (UID: \"89d08e85-df18-4c18-9816-986e948d319c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8lmns" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.213072 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/89d08e85-df18-4c18-9816-986e948d319c-ovnkube-script-lib\") pod \"ovnkube-node-8lmns\" (UID: \"89d08e85-df18-4c18-9816-986e948d319c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8lmns" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.213095 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/89d08e85-df18-4c18-9816-986e948d319c-log-socket\") pod \"ovnkube-node-8lmns\" (UID: \"89d08e85-df18-4c18-9816-986e948d319c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8lmns" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.213116 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/89d08e85-df18-4c18-9816-986e948d319c-ovn-node-metrics-cert\") pod \"ovnkube-node-8lmns\" (UID: \"89d08e85-df18-4c18-9816-986e948d319c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8lmns" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.213229 4902 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4904c756-7ed4-4719-860f-c6f6458d002c-run-openvswitch\") on node \"crc\" DevicePath \"\"" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.213248 4902 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4904c756-7ed4-4719-860f-c6f6458d002c-env-overrides\") 
on node \"crc\" DevicePath \"\"" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.213261 4902 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4904c756-7ed4-4719-860f-c6f6458d002c-host-cni-netd\") on node \"crc\" DevicePath \"\"" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.213275 4902 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4904c756-7ed4-4719-860f-c6f6458d002c-ovnkube-config\") on node \"crc\" DevicePath \"\"" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.213287 4902 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4904c756-7ed4-4719-860f-c6f6458d002c-host-run-netns\") on node \"crc\" DevicePath \"\"" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.213371 4902 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4904c756-7ed4-4719-860f-c6f6458d002c-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.213398 4902 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4904c756-7ed4-4719-860f-c6f6458d002c-host-cni-bin\") on node \"crc\" DevicePath \"\"" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.213434 4902 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4904c756-7ed4-4719-860f-c6f6458d002c-log-socket\") on node \"crc\" DevicePath \"\"" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.213452 4902 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4904c756-7ed4-4719-860f-c6f6458d002c-systemd-units\") on node \"crc\" DevicePath \"\"" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.213468 4902 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4904c756-7ed4-4719-860f-c6f6458d002c-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.213484 4902 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4904c756-7ed4-4719-860f-c6f6458d002c-host-kubelet\") on node \"crc\" DevicePath \"\"" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.213499 4902 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4904c756-7ed4-4719-860f-c6f6458d002c-run-ovn\") on node \"crc\" DevicePath \"\"" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.213512 4902 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4904c756-7ed4-4719-860f-c6f6458d002c-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.213524 4902 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4904c756-7ed4-4719-860f-c6f6458d002c-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.213540 4902 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4904c756-7ed4-4719-860f-c6f6458d002c-host-run-ovn-kubernetes\") on node 
\"crc\" DevicePath \"\"" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.213553 4902 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4904c756-7ed4-4719-860f-c6f6458d002c-host-slash\") on node \"crc\" DevicePath \"\"" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.213566 4902 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4904c756-7ed4-4719-860f-c6f6458d002c-node-log\") on node \"crc\" DevicePath \"\"" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.218070 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4904c756-7ed4-4719-860f-c6f6458d002c-kube-api-access-ghml4" (OuterVolumeSpecName: "kube-api-access-ghml4") pod "4904c756-7ed4-4719-860f-c6f6458d002c" (UID: "4904c756-7ed4-4719-860f-c6f6458d002c"). InnerVolumeSpecName "kube-api-access-ghml4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.218570 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4904c756-7ed4-4719-860f-c6f6458d002c-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "4904c756-7ed4-4719-860f-c6f6458d002c" (UID: "4904c756-7ed4-4719-860f-c6f6458d002c"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.226620 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4904c756-7ed4-4719-860f-c6f6458d002c-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "4904c756-7ed4-4719-860f-c6f6458d002c" (UID: "4904c756-7ed4-4719-860f-c6f6458d002c"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.314347 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/89d08e85-df18-4c18-9816-986e948d319c-run-openvswitch\") pod \"ovnkube-node-8lmns\" (UID: \"89d08e85-df18-4c18-9816-986e948d319c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8lmns" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.314466 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/89d08e85-df18-4c18-9816-986e948d319c-run-ovn\") pod \"ovnkube-node-8lmns\" (UID: \"89d08e85-df18-4c18-9816-986e948d319c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8lmns" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.314487 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/89d08e85-df18-4c18-9816-986e948d319c-host-run-netns\") pod \"ovnkube-node-8lmns\" (UID: \"89d08e85-df18-4c18-9816-986e948d319c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8lmns" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.314508 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/89d08e85-df18-4c18-9816-986e948d319c-host-cni-netd\") pod \"ovnkube-node-8lmns\" (UID: \"89d08e85-df18-4c18-9816-986e948d319c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8lmns" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.314525 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/89d08e85-df18-4c18-9816-986e948d319c-host-cni-bin\") pod \"ovnkube-node-8lmns\" (UID: \"89d08e85-df18-4c18-9816-986e948d319c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8lmns" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.314543 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/89d08e85-df18-4c18-9816-986e948d319c-systemd-units\") pod \"ovnkube-node-8lmns\" (UID: \"89d08e85-df18-4c18-9816-986e948d319c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8lmns" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.314539 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/89d08e85-df18-4c18-9816-986e948d319c-run-ovn\") pod \"ovnkube-node-8lmns\" (UID: \"89d08e85-df18-4c18-9816-986e948d319c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8lmns" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.314578 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/89d08e85-df18-4c18-9816-986e948d319c-host-cni-netd\") pod \"ovnkube-node-8lmns\" (UID: \"89d08e85-df18-4c18-9816-986e948d319c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8lmns" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.314539 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/89d08e85-df18-4c18-9816-986e948d319c-run-openvswitch\") pod \"ovnkube-node-8lmns\" (UID: \"89d08e85-df18-4c18-9816-986e948d319c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8lmns" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.314558 4902 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/89d08e85-df18-4c18-9816-986e948d319c-host-slash\") pod \"ovnkube-node-8lmns\" (UID: \"89d08e85-df18-4c18-9816-986e948d319c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8lmns" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.314613 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/89d08e85-df18-4c18-9816-986e948d319c-host-slash\") pod \"ovnkube-node-8lmns\" (UID: \"89d08e85-df18-4c18-9816-986e948d319c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8lmns" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.314624 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/89d08e85-df18-4c18-9816-986e948d319c-host-cni-bin\") pod \"ovnkube-node-8lmns\" (UID: \"89d08e85-df18-4c18-9816-986e948d319c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8lmns" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.314663 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/89d08e85-df18-4c18-9816-986e948d319c-node-log\") pod \"ovnkube-node-8lmns\" (UID: \"89d08e85-df18-4c18-9816-986e948d319c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8lmns" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.314666 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/89d08e85-df18-4c18-9816-986e948d319c-systemd-units\") pod \"ovnkube-node-8lmns\" (UID: \"89d08e85-df18-4c18-9816-986e948d319c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8lmns" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.314684 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/89d08e85-df18-4c18-9816-986e948d319c-host-run-netns\") pod \"ovnkube-node-8lmns\" (UID: \"89d08e85-df18-4c18-9816-986e948d319c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8lmns" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.314698 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/89d08e85-df18-4c18-9816-986e948d319c-node-log\") pod \"ovnkube-node-8lmns\" (UID: \"89d08e85-df18-4c18-9816-986e948d319c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8lmns" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.314714 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/89d08e85-df18-4c18-9816-986e948d319c-ovnkube-script-lib\") pod \"ovnkube-node-8lmns\" (UID: \"89d08e85-df18-4c18-9816-986e948d319c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8lmns" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.314748 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/89d08e85-df18-4c18-9816-986e948d319c-ovn-node-metrics-cert\") pod \"ovnkube-node-8lmns\" (UID: \"89d08e85-df18-4c18-9816-986e948d319c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8lmns" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.314765 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/89d08e85-df18-4c18-9816-986e948d319c-log-socket\") pod 
\"ovnkube-node-8lmns\" (UID: \"89d08e85-df18-4c18-9816-986e948d319c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8lmns" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.314815 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/89d08e85-df18-4c18-9816-986e948d319c-host-kubelet\") pod \"ovnkube-node-8lmns\" (UID: \"89d08e85-df18-4c18-9816-986e948d319c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8lmns" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.314871 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/89d08e85-df18-4c18-9816-986e948d319c-etc-openvswitch\") pod \"ovnkube-node-8lmns\" (UID: \"89d08e85-df18-4c18-9816-986e948d319c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8lmns" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.314917 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/89d08e85-df18-4c18-9816-986e948d319c-host-run-ovn-kubernetes\") pod \"ovnkube-node-8lmns\" (UID: \"89d08e85-df18-4c18-9816-986e948d319c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8lmns" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.314933 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/89d08e85-df18-4c18-9816-986e948d319c-run-systemd\") pod \"ovnkube-node-8lmns\" (UID: \"89d08e85-df18-4c18-9816-986e948d319c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8lmns" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.314995 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/89d08e85-df18-4c18-9816-986e948d319c-ovnkube-config\") pod \"ovnkube-node-8lmns\" (UID: \"89d08e85-df18-4c18-9816-986e948d319c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8lmns" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.315027 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/89d08e85-df18-4c18-9816-986e948d319c-var-lib-openvswitch\") pod \"ovnkube-node-8lmns\" (UID: \"89d08e85-df18-4c18-9816-986e948d319c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8lmns" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.315064 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/89d08e85-df18-4c18-9816-986e948d319c-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-8lmns\" (UID: \"89d08e85-df18-4c18-9816-986e948d319c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8lmns" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.315111 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qnz97\" (UniqueName: \"kubernetes.io/projected/89d08e85-df18-4c18-9816-986e948d319c-kube-api-access-qnz97\") pod \"ovnkube-node-8lmns\" (UID: \"89d08e85-df18-4c18-9816-986e948d319c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8lmns" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.315141 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/89d08e85-df18-4c18-9816-986e948d319c-env-overrides\") pod 
\"ovnkube-node-8lmns\" (UID: \"89d08e85-df18-4c18-9816-986e948d319c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8lmns" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.315210 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ghml4\" (UniqueName: \"kubernetes.io/projected/4904c756-7ed4-4719-860f-c6f6458d002c-kube-api-access-ghml4\") on node \"crc\" DevicePath \"\"" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.315222 4902 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4904c756-7ed4-4719-860f-c6f6458d002c-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.315234 4902 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4904c756-7ed4-4719-860f-c6f6458d002c-run-systemd\") on node \"crc\" DevicePath \"\"" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.315599 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/89d08e85-df18-4c18-9816-986e948d319c-ovnkube-script-lib\") pod \"ovnkube-node-8lmns\" (UID: \"89d08e85-df18-4c18-9816-986e948d319c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8lmns" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.315648 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/89d08e85-df18-4c18-9816-986e948d319c-host-run-ovn-kubernetes\") pod \"ovnkube-node-8lmns\" (UID: \"89d08e85-df18-4c18-9816-986e948d319c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8lmns" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.315854 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/89d08e85-df18-4c18-9816-986e948d319c-env-overrides\") pod \"ovnkube-node-8lmns\" (UID: \"89d08e85-df18-4c18-9816-986e948d319c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8lmns" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.315896 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/89d08e85-df18-4c18-9816-986e948d319c-run-systemd\") pod \"ovnkube-node-8lmns\" (UID: \"89d08e85-df18-4c18-9816-986e948d319c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8lmns" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.316038 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/89d08e85-df18-4c18-9816-986e948d319c-host-kubelet\") pod \"ovnkube-node-8lmns\" (UID: \"89d08e85-df18-4c18-9816-986e948d319c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8lmns" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.316081 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/89d08e85-df18-4c18-9816-986e948d319c-log-socket\") pod \"ovnkube-node-8lmns\" (UID: \"89d08e85-df18-4c18-9816-986e948d319c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8lmns" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.316114 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/89d08e85-df18-4c18-9816-986e948d319c-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-8lmns\" (UID: 
\"89d08e85-df18-4c18-9816-986e948d319c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8lmns" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.316146 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/89d08e85-df18-4c18-9816-986e948d319c-var-lib-openvswitch\") pod \"ovnkube-node-8lmns\" (UID: \"89d08e85-df18-4c18-9816-986e948d319c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8lmns" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.316177 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/89d08e85-df18-4c18-9816-986e948d319c-etc-openvswitch\") pod \"ovnkube-node-8lmns\" (UID: \"89d08e85-df18-4c18-9816-986e948d319c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8lmns" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.316430 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/89d08e85-df18-4c18-9816-986e948d319c-ovnkube-config\") pod \"ovnkube-node-8lmns\" (UID: \"89d08e85-df18-4c18-9816-986e948d319c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8lmns" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.319331 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/89d08e85-df18-4c18-9816-986e948d319c-ovn-node-metrics-cert\") pod \"ovnkube-node-8lmns\" (UID: \"89d08e85-df18-4c18-9816-986e948d319c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8lmns" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.339609 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qnz97\" (UniqueName: \"kubernetes.io/projected/89d08e85-df18-4c18-9816-986e948d319c-kube-api-access-qnz97\") pod \"ovnkube-node-8lmns\" (UID: \"89d08e85-df18-4c18-9816-986e948d319c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8lmns" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.444312 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-8lmns" Oct 09 13:59:55 crc kubenswrapper[4902]: W1009 13:59:55.468054 4902 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod89d08e85_df18_4c18_9816_986e948d319c.slice/crio-2dce53dec3862b6e51bc5915fd9117a8265ca4734f495e862018861cd0d11d18 WatchSource:0}: Error finding container 2dce53dec3862b6e51bc5915fd9117a8265ca4734f495e862018861cd0d11d18: Status 404 returned error can't find the container with id 2dce53dec3862b6e51bc5915fd9117a8265ca4734f495e862018861cd0d11d18 Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.704391 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jh6wc_4904c756-7ed4-4719-860f-c6f6458d002c/ovn-acl-logging/0.log" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.705377 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jh6wc_4904c756-7ed4-4719-860f-c6f6458d002c/ovn-controller/0.log" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.705953 4902 generic.go:334] "Generic (PLEG): container finished" podID="4904c756-7ed4-4719-860f-c6f6458d002c" containerID="ce98c51532e074824b12143ed31d35add4f88838720d8f3c29c10fb3ab78623e" exitCode=0 Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.705984 4902 generic.go:334] "Generic (PLEG): container finished" podID="4904c756-7ed4-4719-860f-c6f6458d002c" containerID="d3246e7d90602adff59c9ed6d6697c6dcc044b4d6f83f510f3b7271aa3ef4f18" exitCode=0 Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.705998 4902 generic.go:334] "Generic (PLEG): container finished" podID="4904c756-7ed4-4719-860f-c6f6458d002c" containerID="05742a0a14e857f7e8a36d15f52011fc9a003130fec36d8aa5fff760e195baef" exitCode=0 Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.706010 4902 generic.go:334] "Generic (PLEG): container finished" podID="4904c756-7ed4-4719-860f-c6f6458d002c" containerID="331340a1821401cb77b1cb646d09fbe6e4a533e124150119def24d4b49b43548" exitCode=0 Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.706021 4902 generic.go:334] "Generic (PLEG): container finished" podID="4904c756-7ed4-4719-860f-c6f6458d002c" containerID="37a6498689268b5f29b030534ff3a670c9c396c8eaf55d5cb578c88de813cad0" exitCode=0 Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.706029 4902 generic.go:334] "Generic (PLEG): container finished" podID="4904c756-7ed4-4719-860f-c6f6458d002c" containerID="496d93d979f54ca0f058a4b462e45adda124206a2f9fd0a72a86953ac090e8e4" exitCode=0 Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.706037 4902 generic.go:334] "Generic (PLEG): container finished" podID="4904c756-7ed4-4719-860f-c6f6458d002c" containerID="c84c0afc9b8d3feb6cc4f449be2123c3de22a1c354c27ada7f8d6989ddff448c" exitCode=143 Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.706045 4902 generic.go:334] "Generic (PLEG): container finished" podID="4904c756-7ed4-4719-860f-c6f6458d002c" containerID="e7dceac177fd0f1f23655a025c88a72848f277f404a1ac7ffce5c8dbdc922de1" exitCode=143 Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.706094 4902 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-jh6wc" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.706112 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jh6wc" event={"ID":"4904c756-7ed4-4719-860f-c6f6458d002c","Type":"ContainerDied","Data":"ce98c51532e074824b12143ed31d35add4f88838720d8f3c29c10fb3ab78623e"} Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.706155 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jh6wc" event={"ID":"4904c756-7ed4-4719-860f-c6f6458d002c","Type":"ContainerDied","Data":"d3246e7d90602adff59c9ed6d6697c6dcc044b4d6f83f510f3b7271aa3ef4f18"} Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.706172 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jh6wc" event={"ID":"4904c756-7ed4-4719-860f-c6f6458d002c","Type":"ContainerDied","Data":"05742a0a14e857f7e8a36d15f52011fc9a003130fec36d8aa5fff760e195baef"} Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.706188 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jh6wc" event={"ID":"4904c756-7ed4-4719-860f-c6f6458d002c","Type":"ContainerDied","Data":"331340a1821401cb77b1cb646d09fbe6e4a533e124150119def24d4b49b43548"} Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.706201 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jh6wc" event={"ID":"4904c756-7ed4-4719-860f-c6f6458d002c","Type":"ContainerDied","Data":"37a6498689268b5f29b030534ff3a670c9c396c8eaf55d5cb578c88de813cad0"} Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.706213 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jh6wc" event={"ID":"4904c756-7ed4-4719-860f-c6f6458d002c","Type":"ContainerDied","Data":"496d93d979f54ca0f058a4b462e45adda124206a2f9fd0a72a86953ac090e8e4"} Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.706228 4902 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c84c0afc9b8d3feb6cc4f449be2123c3de22a1c354c27ada7f8d6989ddff448c"} Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.706240 4902 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e7dceac177fd0f1f23655a025c88a72848f277f404a1ac7ffce5c8dbdc922de1"} Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.706246 4902 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"db95b45e4b1291b7db458562623863cad83f96f0dcdc3d2fc71ef5c2a79b1de9"} Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.706255 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jh6wc" event={"ID":"4904c756-7ed4-4719-860f-c6f6458d002c","Type":"ContainerDied","Data":"c84c0afc9b8d3feb6cc4f449be2123c3de22a1c354c27ada7f8d6989ddff448c"} Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.706267 4902 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ce98c51532e074824b12143ed31d35add4f88838720d8f3c29c10fb3ab78623e"} Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.706271 4902 scope.go:117] "RemoveContainer" containerID="ce98c51532e074824b12143ed31d35add4f88838720d8f3c29c10fb3ab78623e" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.706275 4902 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d3246e7d90602adff59c9ed6d6697c6dcc044b4d6f83f510f3b7271aa3ef4f18"} Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.706385 4902 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"05742a0a14e857f7e8a36d15f52011fc9a003130fec36d8aa5fff760e195baef"} Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.706400 4902 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"331340a1821401cb77b1cb646d09fbe6e4a533e124150119def24d4b49b43548"} Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.706421 4902 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"37a6498689268b5f29b030534ff3a670c9c396c8eaf55d5cb578c88de813cad0"} Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.706429 4902 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"496d93d979f54ca0f058a4b462e45adda124206a2f9fd0a72a86953ac090e8e4"} Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.706436 4902 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c84c0afc9b8d3feb6cc4f449be2123c3de22a1c354c27ada7f8d6989ddff448c"} Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.706442 4902 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e7dceac177fd0f1f23655a025c88a72848f277f404a1ac7ffce5c8dbdc922de1"} Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.706449 4902 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"db95b45e4b1291b7db458562623863cad83f96f0dcdc3d2fc71ef5c2a79b1de9"} Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.706469 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jh6wc" event={"ID":"4904c756-7ed4-4719-860f-c6f6458d002c","Type":"ContainerDied","Data":"e7dceac177fd0f1f23655a025c88a72848f277f404a1ac7ffce5c8dbdc922de1"} Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.706491 4902 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ce98c51532e074824b12143ed31d35add4f88838720d8f3c29c10fb3ab78623e"} Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.706499 4902 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d3246e7d90602adff59c9ed6d6697c6dcc044b4d6f83f510f3b7271aa3ef4f18"} Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.706505 4902 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"05742a0a14e857f7e8a36d15f52011fc9a003130fec36d8aa5fff760e195baef"} Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.706510 4902 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"331340a1821401cb77b1cb646d09fbe6e4a533e124150119def24d4b49b43548"} Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.706515 4902 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"37a6498689268b5f29b030534ff3a670c9c396c8eaf55d5cb578c88de813cad0"} Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.706521 4902 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"496d93d979f54ca0f058a4b462e45adda124206a2f9fd0a72a86953ac090e8e4"} Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.706525 4902 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c84c0afc9b8d3feb6cc4f449be2123c3de22a1c354c27ada7f8d6989ddff448c"} Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.706531 4902 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e7dceac177fd0f1f23655a025c88a72848f277f404a1ac7ffce5c8dbdc922de1"} Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.706536 4902 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"db95b45e4b1291b7db458562623863cad83f96f0dcdc3d2fc71ef5c2a79b1de9"} Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.706544 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jh6wc" event={"ID":"4904c756-7ed4-4719-860f-c6f6458d002c","Type":"ContainerDied","Data":"22c5291e39f4205f234f684ec8f36d4c77c7b493206767496f3d055781bc956c"} Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.706555 4902 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ce98c51532e074824b12143ed31d35add4f88838720d8f3c29c10fb3ab78623e"} Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.706563 4902 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d3246e7d90602adff59c9ed6d6697c6dcc044b4d6f83f510f3b7271aa3ef4f18"} Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.706569 4902 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"05742a0a14e857f7e8a36d15f52011fc9a003130fec36d8aa5fff760e195baef"} Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.706575 4902 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"331340a1821401cb77b1cb646d09fbe6e4a533e124150119def24d4b49b43548"} Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.706582 4902 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"37a6498689268b5f29b030534ff3a670c9c396c8eaf55d5cb578c88de813cad0"} Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.706589 4902 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"496d93d979f54ca0f058a4b462e45adda124206a2f9fd0a72a86953ac090e8e4"} Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.706602 4902 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c84c0afc9b8d3feb6cc4f449be2123c3de22a1c354c27ada7f8d6989ddff448c"} Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.706613 4902 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e7dceac177fd0f1f23655a025c88a72848f277f404a1ac7ffce5c8dbdc922de1"} Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.706620 4902 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"db95b45e4b1291b7db458562623863cad83f96f0dcdc3d2fc71ef5c2a79b1de9"} Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.716618 4902 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-fcz75_593745e8-10e2-486a-8a32-9e2dc766bc55/kube-multus/0.log" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.716676 4902 generic.go:334] "Generic (PLEG): container finished" podID="593745e8-10e2-486a-8a32-9e2dc766bc55" containerID="cf52d4820c410545c40099536788fd8a2655cb2aa1a2fbbb0d38c9afc19eb7b8" exitCode=2 Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.716757 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-fcz75" event={"ID":"593745e8-10e2-486a-8a32-9e2dc766bc55","Type":"ContainerDied","Data":"cf52d4820c410545c40099536788fd8a2655cb2aa1a2fbbb0d38c9afc19eb7b8"} Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.718774 4902 scope.go:117] "RemoveContainer" containerID="cf52d4820c410545c40099536788fd8a2655cb2aa1a2fbbb0d38c9afc19eb7b8" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.719700 4902 generic.go:334] "Generic (PLEG): container finished" podID="89d08e85-df18-4c18-9816-986e948d319c" containerID="017eefacc18dda10f26b46346d649be6c9e33bddaf78ec481aa8578d66c4cf22" exitCode=0 Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.719744 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8lmns" event={"ID":"89d08e85-df18-4c18-9816-986e948d319c","Type":"ContainerDied","Data":"017eefacc18dda10f26b46346d649be6c9e33bddaf78ec481aa8578d66c4cf22"} Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.719780 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8lmns" event={"ID":"89d08e85-df18-4c18-9816-986e948d319c","Type":"ContainerStarted","Data":"2dce53dec3862b6e51bc5915fd9117a8265ca4734f495e862018861cd0d11d18"} Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.730635 4902 scope.go:117] "RemoveContainer" containerID="d3246e7d90602adff59c9ed6d6697c6dcc044b4d6f83f510f3b7271aa3ef4f18" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.733720 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-jh6wc"] Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.736464 4902 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-jh6wc"] Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.776210 4902 scope.go:117] "RemoveContainer" containerID="05742a0a14e857f7e8a36d15f52011fc9a003130fec36d8aa5fff760e195baef" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.810887 4902 scope.go:117] "RemoveContainer" containerID="331340a1821401cb77b1cb646d09fbe6e4a533e124150119def24d4b49b43548" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.830478 4902 scope.go:117] "RemoveContainer" containerID="37a6498689268b5f29b030534ff3a670c9c396c8eaf55d5cb578c88de813cad0" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.856779 4902 scope.go:117] "RemoveContainer" containerID="496d93d979f54ca0f058a4b462e45adda124206a2f9fd0a72a86953ac090e8e4" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.878712 4902 scope.go:117] "RemoveContainer" containerID="c84c0afc9b8d3feb6cc4f449be2123c3de22a1c354c27ada7f8d6989ddff448c" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.899032 4902 scope.go:117] "RemoveContainer" containerID="e7dceac177fd0f1f23655a025c88a72848f277f404a1ac7ffce5c8dbdc922de1" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.925497 4902 scope.go:117] "RemoveContainer" containerID="db95b45e4b1291b7db458562623863cad83f96f0dcdc3d2fc71ef5c2a79b1de9" Oct 09 
13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.956490 4902 scope.go:117] "RemoveContainer" containerID="ce98c51532e074824b12143ed31d35add4f88838720d8f3c29c10fb3ab78623e" Oct 09 13:59:55 crc kubenswrapper[4902]: E1009 13:59:55.957040 4902 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ce98c51532e074824b12143ed31d35add4f88838720d8f3c29c10fb3ab78623e\": container with ID starting with ce98c51532e074824b12143ed31d35add4f88838720d8f3c29c10fb3ab78623e not found: ID does not exist" containerID="ce98c51532e074824b12143ed31d35add4f88838720d8f3c29c10fb3ab78623e" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.957110 4902 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce98c51532e074824b12143ed31d35add4f88838720d8f3c29c10fb3ab78623e"} err="failed to get container status \"ce98c51532e074824b12143ed31d35add4f88838720d8f3c29c10fb3ab78623e\": rpc error: code = NotFound desc = could not find container \"ce98c51532e074824b12143ed31d35add4f88838720d8f3c29c10fb3ab78623e\": container with ID starting with ce98c51532e074824b12143ed31d35add4f88838720d8f3c29c10fb3ab78623e not found: ID does not exist" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.957152 4902 scope.go:117] "RemoveContainer" containerID="d3246e7d90602adff59c9ed6d6697c6dcc044b4d6f83f510f3b7271aa3ef4f18" Oct 09 13:59:55 crc kubenswrapper[4902]: E1009 13:59:55.957997 4902 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d3246e7d90602adff59c9ed6d6697c6dcc044b4d6f83f510f3b7271aa3ef4f18\": container with ID starting with d3246e7d90602adff59c9ed6d6697c6dcc044b4d6f83f510f3b7271aa3ef4f18 not found: ID does not exist" containerID="d3246e7d90602adff59c9ed6d6697c6dcc044b4d6f83f510f3b7271aa3ef4f18" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.958090 4902 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d3246e7d90602adff59c9ed6d6697c6dcc044b4d6f83f510f3b7271aa3ef4f18"} err="failed to get container status \"d3246e7d90602adff59c9ed6d6697c6dcc044b4d6f83f510f3b7271aa3ef4f18\": rpc error: code = NotFound desc = could not find container \"d3246e7d90602adff59c9ed6d6697c6dcc044b4d6f83f510f3b7271aa3ef4f18\": container with ID starting with d3246e7d90602adff59c9ed6d6697c6dcc044b4d6f83f510f3b7271aa3ef4f18 not found: ID does not exist" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.958153 4902 scope.go:117] "RemoveContainer" containerID="05742a0a14e857f7e8a36d15f52011fc9a003130fec36d8aa5fff760e195baef" Oct 09 13:59:55 crc kubenswrapper[4902]: E1009 13:59:55.958709 4902 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"05742a0a14e857f7e8a36d15f52011fc9a003130fec36d8aa5fff760e195baef\": container with ID starting with 05742a0a14e857f7e8a36d15f52011fc9a003130fec36d8aa5fff760e195baef not found: ID does not exist" containerID="05742a0a14e857f7e8a36d15f52011fc9a003130fec36d8aa5fff760e195baef" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.958746 4902 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"05742a0a14e857f7e8a36d15f52011fc9a003130fec36d8aa5fff760e195baef"} err="failed to get container status \"05742a0a14e857f7e8a36d15f52011fc9a003130fec36d8aa5fff760e195baef\": rpc error: code = NotFound desc = could not find container 
\"05742a0a14e857f7e8a36d15f52011fc9a003130fec36d8aa5fff760e195baef\": container with ID starting with 05742a0a14e857f7e8a36d15f52011fc9a003130fec36d8aa5fff760e195baef not found: ID does not exist" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.958771 4902 scope.go:117] "RemoveContainer" containerID="331340a1821401cb77b1cb646d09fbe6e4a533e124150119def24d4b49b43548" Oct 09 13:59:55 crc kubenswrapper[4902]: E1009 13:59:55.959093 4902 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"331340a1821401cb77b1cb646d09fbe6e4a533e124150119def24d4b49b43548\": container with ID starting with 331340a1821401cb77b1cb646d09fbe6e4a533e124150119def24d4b49b43548 not found: ID does not exist" containerID="331340a1821401cb77b1cb646d09fbe6e4a533e124150119def24d4b49b43548" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.959115 4902 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"331340a1821401cb77b1cb646d09fbe6e4a533e124150119def24d4b49b43548"} err="failed to get container status \"331340a1821401cb77b1cb646d09fbe6e4a533e124150119def24d4b49b43548\": rpc error: code = NotFound desc = could not find container \"331340a1821401cb77b1cb646d09fbe6e4a533e124150119def24d4b49b43548\": container with ID starting with 331340a1821401cb77b1cb646d09fbe6e4a533e124150119def24d4b49b43548 not found: ID does not exist" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.959132 4902 scope.go:117] "RemoveContainer" containerID="37a6498689268b5f29b030534ff3a670c9c396c8eaf55d5cb578c88de813cad0" Oct 09 13:59:55 crc kubenswrapper[4902]: E1009 13:59:55.959463 4902 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"37a6498689268b5f29b030534ff3a670c9c396c8eaf55d5cb578c88de813cad0\": container with ID starting with 37a6498689268b5f29b030534ff3a670c9c396c8eaf55d5cb578c88de813cad0 not found: ID does not exist" containerID="37a6498689268b5f29b030534ff3a670c9c396c8eaf55d5cb578c88de813cad0" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.959514 4902 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"37a6498689268b5f29b030534ff3a670c9c396c8eaf55d5cb578c88de813cad0"} err="failed to get container status \"37a6498689268b5f29b030534ff3a670c9c396c8eaf55d5cb578c88de813cad0\": rpc error: code = NotFound desc = could not find container \"37a6498689268b5f29b030534ff3a670c9c396c8eaf55d5cb578c88de813cad0\": container with ID starting with 37a6498689268b5f29b030534ff3a670c9c396c8eaf55d5cb578c88de813cad0 not found: ID does not exist" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.959536 4902 scope.go:117] "RemoveContainer" containerID="496d93d979f54ca0f058a4b462e45adda124206a2f9fd0a72a86953ac090e8e4" Oct 09 13:59:55 crc kubenswrapper[4902]: E1009 13:59:55.959814 4902 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"496d93d979f54ca0f058a4b462e45adda124206a2f9fd0a72a86953ac090e8e4\": container with ID starting with 496d93d979f54ca0f058a4b462e45adda124206a2f9fd0a72a86953ac090e8e4 not found: ID does not exist" containerID="496d93d979f54ca0f058a4b462e45adda124206a2f9fd0a72a86953ac090e8e4" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.959837 4902 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"496d93d979f54ca0f058a4b462e45adda124206a2f9fd0a72a86953ac090e8e4"} 
err="failed to get container status \"496d93d979f54ca0f058a4b462e45adda124206a2f9fd0a72a86953ac090e8e4\": rpc error: code = NotFound desc = could not find container \"496d93d979f54ca0f058a4b462e45adda124206a2f9fd0a72a86953ac090e8e4\": container with ID starting with 496d93d979f54ca0f058a4b462e45adda124206a2f9fd0a72a86953ac090e8e4 not found: ID does not exist" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.959852 4902 scope.go:117] "RemoveContainer" containerID="c84c0afc9b8d3feb6cc4f449be2123c3de22a1c354c27ada7f8d6989ddff448c" Oct 09 13:59:55 crc kubenswrapper[4902]: E1009 13:59:55.960476 4902 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c84c0afc9b8d3feb6cc4f449be2123c3de22a1c354c27ada7f8d6989ddff448c\": container with ID starting with c84c0afc9b8d3feb6cc4f449be2123c3de22a1c354c27ada7f8d6989ddff448c not found: ID does not exist" containerID="c84c0afc9b8d3feb6cc4f449be2123c3de22a1c354c27ada7f8d6989ddff448c" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.960500 4902 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c84c0afc9b8d3feb6cc4f449be2123c3de22a1c354c27ada7f8d6989ddff448c"} err="failed to get container status \"c84c0afc9b8d3feb6cc4f449be2123c3de22a1c354c27ada7f8d6989ddff448c\": rpc error: code = NotFound desc = could not find container \"c84c0afc9b8d3feb6cc4f449be2123c3de22a1c354c27ada7f8d6989ddff448c\": container with ID starting with c84c0afc9b8d3feb6cc4f449be2123c3de22a1c354c27ada7f8d6989ddff448c not found: ID does not exist" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.960514 4902 scope.go:117] "RemoveContainer" containerID="e7dceac177fd0f1f23655a025c88a72848f277f404a1ac7ffce5c8dbdc922de1" Oct 09 13:59:55 crc kubenswrapper[4902]: E1009 13:59:55.960893 4902 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e7dceac177fd0f1f23655a025c88a72848f277f404a1ac7ffce5c8dbdc922de1\": container with ID starting with e7dceac177fd0f1f23655a025c88a72848f277f404a1ac7ffce5c8dbdc922de1 not found: ID does not exist" containerID="e7dceac177fd0f1f23655a025c88a72848f277f404a1ac7ffce5c8dbdc922de1" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.960916 4902 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e7dceac177fd0f1f23655a025c88a72848f277f404a1ac7ffce5c8dbdc922de1"} err="failed to get container status \"e7dceac177fd0f1f23655a025c88a72848f277f404a1ac7ffce5c8dbdc922de1\": rpc error: code = NotFound desc = could not find container \"e7dceac177fd0f1f23655a025c88a72848f277f404a1ac7ffce5c8dbdc922de1\": container with ID starting with e7dceac177fd0f1f23655a025c88a72848f277f404a1ac7ffce5c8dbdc922de1 not found: ID does not exist" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.960930 4902 scope.go:117] "RemoveContainer" containerID="db95b45e4b1291b7db458562623863cad83f96f0dcdc3d2fc71ef5c2a79b1de9" Oct 09 13:59:55 crc kubenswrapper[4902]: E1009 13:59:55.961224 4902 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"db95b45e4b1291b7db458562623863cad83f96f0dcdc3d2fc71ef5c2a79b1de9\": container with ID starting with db95b45e4b1291b7db458562623863cad83f96f0dcdc3d2fc71ef5c2a79b1de9 not found: ID does not exist" containerID="db95b45e4b1291b7db458562623863cad83f96f0dcdc3d2fc71ef5c2a79b1de9" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.961249 4902 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db95b45e4b1291b7db458562623863cad83f96f0dcdc3d2fc71ef5c2a79b1de9"} err="failed to get container status \"db95b45e4b1291b7db458562623863cad83f96f0dcdc3d2fc71ef5c2a79b1de9\": rpc error: code = NotFound desc = could not find container \"db95b45e4b1291b7db458562623863cad83f96f0dcdc3d2fc71ef5c2a79b1de9\": container with ID starting with db95b45e4b1291b7db458562623863cad83f96f0dcdc3d2fc71ef5c2a79b1de9 not found: ID does not exist" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.961265 4902 scope.go:117] "RemoveContainer" containerID="ce98c51532e074824b12143ed31d35add4f88838720d8f3c29c10fb3ab78623e" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.961652 4902 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce98c51532e074824b12143ed31d35add4f88838720d8f3c29c10fb3ab78623e"} err="failed to get container status \"ce98c51532e074824b12143ed31d35add4f88838720d8f3c29c10fb3ab78623e\": rpc error: code = NotFound desc = could not find container \"ce98c51532e074824b12143ed31d35add4f88838720d8f3c29c10fb3ab78623e\": container with ID starting with ce98c51532e074824b12143ed31d35add4f88838720d8f3c29c10fb3ab78623e not found: ID does not exist" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.961677 4902 scope.go:117] "RemoveContainer" containerID="d3246e7d90602adff59c9ed6d6697c6dcc044b4d6f83f510f3b7271aa3ef4f18" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.961930 4902 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d3246e7d90602adff59c9ed6d6697c6dcc044b4d6f83f510f3b7271aa3ef4f18"} err="failed to get container status \"d3246e7d90602adff59c9ed6d6697c6dcc044b4d6f83f510f3b7271aa3ef4f18\": rpc error: code = NotFound desc = could not find container \"d3246e7d90602adff59c9ed6d6697c6dcc044b4d6f83f510f3b7271aa3ef4f18\": container with ID starting with d3246e7d90602adff59c9ed6d6697c6dcc044b4d6f83f510f3b7271aa3ef4f18 not found: ID does not exist" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.961949 4902 scope.go:117] "RemoveContainer" containerID="05742a0a14e857f7e8a36d15f52011fc9a003130fec36d8aa5fff760e195baef" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.962658 4902 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"05742a0a14e857f7e8a36d15f52011fc9a003130fec36d8aa5fff760e195baef"} err="failed to get container status \"05742a0a14e857f7e8a36d15f52011fc9a003130fec36d8aa5fff760e195baef\": rpc error: code = NotFound desc = could not find container \"05742a0a14e857f7e8a36d15f52011fc9a003130fec36d8aa5fff760e195baef\": container with ID starting with 05742a0a14e857f7e8a36d15f52011fc9a003130fec36d8aa5fff760e195baef not found: ID does not exist" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.962677 4902 scope.go:117] "RemoveContainer" containerID="331340a1821401cb77b1cb646d09fbe6e4a533e124150119def24d4b49b43548" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.963031 4902 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"331340a1821401cb77b1cb646d09fbe6e4a533e124150119def24d4b49b43548"} err="failed to get container status \"331340a1821401cb77b1cb646d09fbe6e4a533e124150119def24d4b49b43548\": rpc error: code = NotFound desc = could not find container \"331340a1821401cb77b1cb646d09fbe6e4a533e124150119def24d4b49b43548\": container with ID starting with 
331340a1821401cb77b1cb646d09fbe6e4a533e124150119def24d4b49b43548 not found: ID does not exist" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.963075 4902 scope.go:117] "RemoveContainer" containerID="37a6498689268b5f29b030534ff3a670c9c396c8eaf55d5cb578c88de813cad0" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.963651 4902 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"37a6498689268b5f29b030534ff3a670c9c396c8eaf55d5cb578c88de813cad0"} err="failed to get container status \"37a6498689268b5f29b030534ff3a670c9c396c8eaf55d5cb578c88de813cad0\": rpc error: code = NotFound desc = could not find container \"37a6498689268b5f29b030534ff3a670c9c396c8eaf55d5cb578c88de813cad0\": container with ID starting with 37a6498689268b5f29b030534ff3a670c9c396c8eaf55d5cb578c88de813cad0 not found: ID does not exist" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.963679 4902 scope.go:117] "RemoveContainer" containerID="496d93d979f54ca0f058a4b462e45adda124206a2f9fd0a72a86953ac090e8e4" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.964087 4902 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"496d93d979f54ca0f058a4b462e45adda124206a2f9fd0a72a86953ac090e8e4"} err="failed to get container status \"496d93d979f54ca0f058a4b462e45adda124206a2f9fd0a72a86953ac090e8e4\": rpc error: code = NotFound desc = could not find container \"496d93d979f54ca0f058a4b462e45adda124206a2f9fd0a72a86953ac090e8e4\": container with ID starting with 496d93d979f54ca0f058a4b462e45adda124206a2f9fd0a72a86953ac090e8e4 not found: ID does not exist" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.964105 4902 scope.go:117] "RemoveContainer" containerID="c84c0afc9b8d3feb6cc4f449be2123c3de22a1c354c27ada7f8d6989ddff448c" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.964687 4902 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c84c0afc9b8d3feb6cc4f449be2123c3de22a1c354c27ada7f8d6989ddff448c"} err="failed to get container status \"c84c0afc9b8d3feb6cc4f449be2123c3de22a1c354c27ada7f8d6989ddff448c\": rpc error: code = NotFound desc = could not find container \"c84c0afc9b8d3feb6cc4f449be2123c3de22a1c354c27ada7f8d6989ddff448c\": container with ID starting with c84c0afc9b8d3feb6cc4f449be2123c3de22a1c354c27ada7f8d6989ddff448c not found: ID does not exist" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.964704 4902 scope.go:117] "RemoveContainer" containerID="e7dceac177fd0f1f23655a025c88a72848f277f404a1ac7ffce5c8dbdc922de1" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.964924 4902 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e7dceac177fd0f1f23655a025c88a72848f277f404a1ac7ffce5c8dbdc922de1"} err="failed to get container status \"e7dceac177fd0f1f23655a025c88a72848f277f404a1ac7ffce5c8dbdc922de1\": rpc error: code = NotFound desc = could not find container \"e7dceac177fd0f1f23655a025c88a72848f277f404a1ac7ffce5c8dbdc922de1\": container with ID starting with e7dceac177fd0f1f23655a025c88a72848f277f404a1ac7ffce5c8dbdc922de1 not found: ID does not exist" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.964944 4902 scope.go:117] "RemoveContainer" containerID="db95b45e4b1291b7db458562623863cad83f96f0dcdc3d2fc71ef5c2a79b1de9" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.965193 4902 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"db95b45e4b1291b7db458562623863cad83f96f0dcdc3d2fc71ef5c2a79b1de9"} err="failed to get container status \"db95b45e4b1291b7db458562623863cad83f96f0dcdc3d2fc71ef5c2a79b1de9\": rpc error: code = NotFound desc = could not find container \"db95b45e4b1291b7db458562623863cad83f96f0dcdc3d2fc71ef5c2a79b1de9\": container with ID starting with db95b45e4b1291b7db458562623863cad83f96f0dcdc3d2fc71ef5c2a79b1de9 not found: ID does not exist" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.965213 4902 scope.go:117] "RemoveContainer" containerID="ce98c51532e074824b12143ed31d35add4f88838720d8f3c29c10fb3ab78623e" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.965494 4902 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce98c51532e074824b12143ed31d35add4f88838720d8f3c29c10fb3ab78623e"} err="failed to get container status \"ce98c51532e074824b12143ed31d35add4f88838720d8f3c29c10fb3ab78623e\": rpc error: code = NotFound desc = could not find container \"ce98c51532e074824b12143ed31d35add4f88838720d8f3c29c10fb3ab78623e\": container with ID starting with ce98c51532e074824b12143ed31d35add4f88838720d8f3c29c10fb3ab78623e not found: ID does not exist" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.965509 4902 scope.go:117] "RemoveContainer" containerID="d3246e7d90602adff59c9ed6d6697c6dcc044b4d6f83f510f3b7271aa3ef4f18" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.965770 4902 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d3246e7d90602adff59c9ed6d6697c6dcc044b4d6f83f510f3b7271aa3ef4f18"} err="failed to get container status \"d3246e7d90602adff59c9ed6d6697c6dcc044b4d6f83f510f3b7271aa3ef4f18\": rpc error: code = NotFound desc = could not find container \"d3246e7d90602adff59c9ed6d6697c6dcc044b4d6f83f510f3b7271aa3ef4f18\": container with ID starting with d3246e7d90602adff59c9ed6d6697c6dcc044b4d6f83f510f3b7271aa3ef4f18 not found: ID does not exist" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.965787 4902 scope.go:117] "RemoveContainer" containerID="05742a0a14e857f7e8a36d15f52011fc9a003130fec36d8aa5fff760e195baef" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.965979 4902 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"05742a0a14e857f7e8a36d15f52011fc9a003130fec36d8aa5fff760e195baef"} err="failed to get container status \"05742a0a14e857f7e8a36d15f52011fc9a003130fec36d8aa5fff760e195baef\": rpc error: code = NotFound desc = could not find container \"05742a0a14e857f7e8a36d15f52011fc9a003130fec36d8aa5fff760e195baef\": container with ID starting with 05742a0a14e857f7e8a36d15f52011fc9a003130fec36d8aa5fff760e195baef not found: ID does not exist" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.965993 4902 scope.go:117] "RemoveContainer" containerID="331340a1821401cb77b1cb646d09fbe6e4a533e124150119def24d4b49b43548" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.966196 4902 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"331340a1821401cb77b1cb646d09fbe6e4a533e124150119def24d4b49b43548"} err="failed to get container status \"331340a1821401cb77b1cb646d09fbe6e4a533e124150119def24d4b49b43548\": rpc error: code = NotFound desc = could not find container \"331340a1821401cb77b1cb646d09fbe6e4a533e124150119def24d4b49b43548\": container with ID starting with 331340a1821401cb77b1cb646d09fbe6e4a533e124150119def24d4b49b43548 not found: ID does not exist" Oct 
09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.966210 4902 scope.go:117] "RemoveContainer" containerID="37a6498689268b5f29b030534ff3a670c9c396c8eaf55d5cb578c88de813cad0" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.966390 4902 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"37a6498689268b5f29b030534ff3a670c9c396c8eaf55d5cb578c88de813cad0"} err="failed to get container status \"37a6498689268b5f29b030534ff3a670c9c396c8eaf55d5cb578c88de813cad0\": rpc error: code = NotFound desc = could not find container \"37a6498689268b5f29b030534ff3a670c9c396c8eaf55d5cb578c88de813cad0\": container with ID starting with 37a6498689268b5f29b030534ff3a670c9c396c8eaf55d5cb578c88de813cad0 not found: ID does not exist" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.966404 4902 scope.go:117] "RemoveContainer" containerID="496d93d979f54ca0f058a4b462e45adda124206a2f9fd0a72a86953ac090e8e4" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.966596 4902 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"496d93d979f54ca0f058a4b462e45adda124206a2f9fd0a72a86953ac090e8e4"} err="failed to get container status \"496d93d979f54ca0f058a4b462e45adda124206a2f9fd0a72a86953ac090e8e4\": rpc error: code = NotFound desc = could not find container \"496d93d979f54ca0f058a4b462e45adda124206a2f9fd0a72a86953ac090e8e4\": container with ID starting with 496d93d979f54ca0f058a4b462e45adda124206a2f9fd0a72a86953ac090e8e4 not found: ID does not exist" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.966611 4902 scope.go:117] "RemoveContainer" containerID="c84c0afc9b8d3feb6cc4f449be2123c3de22a1c354c27ada7f8d6989ddff448c" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.966782 4902 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c84c0afc9b8d3feb6cc4f449be2123c3de22a1c354c27ada7f8d6989ddff448c"} err="failed to get container status \"c84c0afc9b8d3feb6cc4f449be2123c3de22a1c354c27ada7f8d6989ddff448c\": rpc error: code = NotFound desc = could not find container \"c84c0afc9b8d3feb6cc4f449be2123c3de22a1c354c27ada7f8d6989ddff448c\": container with ID starting with c84c0afc9b8d3feb6cc4f449be2123c3de22a1c354c27ada7f8d6989ddff448c not found: ID does not exist" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.966797 4902 scope.go:117] "RemoveContainer" containerID="e7dceac177fd0f1f23655a025c88a72848f277f404a1ac7ffce5c8dbdc922de1" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.966960 4902 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e7dceac177fd0f1f23655a025c88a72848f277f404a1ac7ffce5c8dbdc922de1"} err="failed to get container status \"e7dceac177fd0f1f23655a025c88a72848f277f404a1ac7ffce5c8dbdc922de1\": rpc error: code = NotFound desc = could not find container \"e7dceac177fd0f1f23655a025c88a72848f277f404a1ac7ffce5c8dbdc922de1\": container with ID starting with e7dceac177fd0f1f23655a025c88a72848f277f404a1ac7ffce5c8dbdc922de1 not found: ID does not exist" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.966976 4902 scope.go:117] "RemoveContainer" containerID="db95b45e4b1291b7db458562623863cad83f96f0dcdc3d2fc71ef5c2a79b1de9" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.967153 4902 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db95b45e4b1291b7db458562623863cad83f96f0dcdc3d2fc71ef5c2a79b1de9"} err="failed to get container status 
\"db95b45e4b1291b7db458562623863cad83f96f0dcdc3d2fc71ef5c2a79b1de9\": rpc error: code = NotFound desc = could not find container \"db95b45e4b1291b7db458562623863cad83f96f0dcdc3d2fc71ef5c2a79b1de9\": container with ID starting with db95b45e4b1291b7db458562623863cad83f96f0dcdc3d2fc71ef5c2a79b1de9 not found: ID does not exist" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.967168 4902 scope.go:117] "RemoveContainer" containerID="ce98c51532e074824b12143ed31d35add4f88838720d8f3c29c10fb3ab78623e" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.967332 4902 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce98c51532e074824b12143ed31d35add4f88838720d8f3c29c10fb3ab78623e"} err="failed to get container status \"ce98c51532e074824b12143ed31d35add4f88838720d8f3c29c10fb3ab78623e\": rpc error: code = NotFound desc = could not find container \"ce98c51532e074824b12143ed31d35add4f88838720d8f3c29c10fb3ab78623e\": container with ID starting with ce98c51532e074824b12143ed31d35add4f88838720d8f3c29c10fb3ab78623e not found: ID does not exist" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.967347 4902 scope.go:117] "RemoveContainer" containerID="d3246e7d90602adff59c9ed6d6697c6dcc044b4d6f83f510f3b7271aa3ef4f18" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.967696 4902 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d3246e7d90602adff59c9ed6d6697c6dcc044b4d6f83f510f3b7271aa3ef4f18"} err="failed to get container status \"d3246e7d90602adff59c9ed6d6697c6dcc044b4d6f83f510f3b7271aa3ef4f18\": rpc error: code = NotFound desc = could not find container \"d3246e7d90602adff59c9ed6d6697c6dcc044b4d6f83f510f3b7271aa3ef4f18\": container with ID starting with d3246e7d90602adff59c9ed6d6697c6dcc044b4d6f83f510f3b7271aa3ef4f18 not found: ID does not exist" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.967711 4902 scope.go:117] "RemoveContainer" containerID="05742a0a14e857f7e8a36d15f52011fc9a003130fec36d8aa5fff760e195baef" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.967902 4902 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"05742a0a14e857f7e8a36d15f52011fc9a003130fec36d8aa5fff760e195baef"} err="failed to get container status \"05742a0a14e857f7e8a36d15f52011fc9a003130fec36d8aa5fff760e195baef\": rpc error: code = NotFound desc = could not find container \"05742a0a14e857f7e8a36d15f52011fc9a003130fec36d8aa5fff760e195baef\": container with ID starting with 05742a0a14e857f7e8a36d15f52011fc9a003130fec36d8aa5fff760e195baef not found: ID does not exist" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.967917 4902 scope.go:117] "RemoveContainer" containerID="331340a1821401cb77b1cb646d09fbe6e4a533e124150119def24d4b49b43548" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.968082 4902 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"331340a1821401cb77b1cb646d09fbe6e4a533e124150119def24d4b49b43548"} err="failed to get container status \"331340a1821401cb77b1cb646d09fbe6e4a533e124150119def24d4b49b43548\": rpc error: code = NotFound desc = could not find container \"331340a1821401cb77b1cb646d09fbe6e4a533e124150119def24d4b49b43548\": container with ID starting with 331340a1821401cb77b1cb646d09fbe6e4a533e124150119def24d4b49b43548 not found: ID does not exist" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.968098 4902 scope.go:117] "RemoveContainer" 
containerID="37a6498689268b5f29b030534ff3a670c9c396c8eaf55d5cb578c88de813cad0" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.968284 4902 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"37a6498689268b5f29b030534ff3a670c9c396c8eaf55d5cb578c88de813cad0"} err="failed to get container status \"37a6498689268b5f29b030534ff3a670c9c396c8eaf55d5cb578c88de813cad0\": rpc error: code = NotFound desc = could not find container \"37a6498689268b5f29b030534ff3a670c9c396c8eaf55d5cb578c88de813cad0\": container with ID starting with 37a6498689268b5f29b030534ff3a670c9c396c8eaf55d5cb578c88de813cad0 not found: ID does not exist" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.968299 4902 scope.go:117] "RemoveContainer" containerID="496d93d979f54ca0f058a4b462e45adda124206a2f9fd0a72a86953ac090e8e4" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.968467 4902 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"496d93d979f54ca0f058a4b462e45adda124206a2f9fd0a72a86953ac090e8e4"} err="failed to get container status \"496d93d979f54ca0f058a4b462e45adda124206a2f9fd0a72a86953ac090e8e4\": rpc error: code = NotFound desc = could not find container \"496d93d979f54ca0f058a4b462e45adda124206a2f9fd0a72a86953ac090e8e4\": container with ID starting with 496d93d979f54ca0f058a4b462e45adda124206a2f9fd0a72a86953ac090e8e4 not found: ID does not exist" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.968481 4902 scope.go:117] "RemoveContainer" containerID="c84c0afc9b8d3feb6cc4f449be2123c3de22a1c354c27ada7f8d6989ddff448c" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.968698 4902 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c84c0afc9b8d3feb6cc4f449be2123c3de22a1c354c27ada7f8d6989ddff448c"} err="failed to get container status \"c84c0afc9b8d3feb6cc4f449be2123c3de22a1c354c27ada7f8d6989ddff448c\": rpc error: code = NotFound desc = could not find container \"c84c0afc9b8d3feb6cc4f449be2123c3de22a1c354c27ada7f8d6989ddff448c\": container with ID starting with c84c0afc9b8d3feb6cc4f449be2123c3de22a1c354c27ada7f8d6989ddff448c not found: ID does not exist" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.968718 4902 scope.go:117] "RemoveContainer" containerID="e7dceac177fd0f1f23655a025c88a72848f277f404a1ac7ffce5c8dbdc922de1" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.968978 4902 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e7dceac177fd0f1f23655a025c88a72848f277f404a1ac7ffce5c8dbdc922de1"} err="failed to get container status \"e7dceac177fd0f1f23655a025c88a72848f277f404a1ac7ffce5c8dbdc922de1\": rpc error: code = NotFound desc = could not find container \"e7dceac177fd0f1f23655a025c88a72848f277f404a1ac7ffce5c8dbdc922de1\": container with ID starting with e7dceac177fd0f1f23655a025c88a72848f277f404a1ac7ffce5c8dbdc922de1 not found: ID does not exist" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.968995 4902 scope.go:117] "RemoveContainer" containerID="db95b45e4b1291b7db458562623863cad83f96f0dcdc3d2fc71ef5c2a79b1de9" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.969233 4902 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db95b45e4b1291b7db458562623863cad83f96f0dcdc3d2fc71ef5c2a79b1de9"} err="failed to get container status \"db95b45e4b1291b7db458562623863cad83f96f0dcdc3d2fc71ef5c2a79b1de9\": rpc error: code = NotFound desc = could not find 
container \"db95b45e4b1291b7db458562623863cad83f96f0dcdc3d2fc71ef5c2a79b1de9\": container with ID starting with db95b45e4b1291b7db458562623863cad83f96f0dcdc3d2fc71ef5c2a79b1de9 not found: ID does not exist" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.969251 4902 scope.go:117] "RemoveContainer" containerID="ce98c51532e074824b12143ed31d35add4f88838720d8f3c29c10fb3ab78623e" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.969525 4902 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce98c51532e074824b12143ed31d35add4f88838720d8f3c29c10fb3ab78623e"} err="failed to get container status \"ce98c51532e074824b12143ed31d35add4f88838720d8f3c29c10fb3ab78623e\": rpc error: code = NotFound desc = could not find container \"ce98c51532e074824b12143ed31d35add4f88838720d8f3c29c10fb3ab78623e\": container with ID starting with ce98c51532e074824b12143ed31d35add4f88838720d8f3c29c10fb3ab78623e not found: ID does not exist" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.969540 4902 scope.go:117] "RemoveContainer" containerID="d3246e7d90602adff59c9ed6d6697c6dcc044b4d6f83f510f3b7271aa3ef4f18" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.969809 4902 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d3246e7d90602adff59c9ed6d6697c6dcc044b4d6f83f510f3b7271aa3ef4f18"} err="failed to get container status \"d3246e7d90602adff59c9ed6d6697c6dcc044b4d6f83f510f3b7271aa3ef4f18\": rpc error: code = NotFound desc = could not find container \"d3246e7d90602adff59c9ed6d6697c6dcc044b4d6f83f510f3b7271aa3ef4f18\": container with ID starting with d3246e7d90602adff59c9ed6d6697c6dcc044b4d6f83f510f3b7271aa3ef4f18 not found: ID does not exist" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.969831 4902 scope.go:117] "RemoveContainer" containerID="05742a0a14e857f7e8a36d15f52011fc9a003130fec36d8aa5fff760e195baef" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.970097 4902 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"05742a0a14e857f7e8a36d15f52011fc9a003130fec36d8aa5fff760e195baef"} err="failed to get container status \"05742a0a14e857f7e8a36d15f52011fc9a003130fec36d8aa5fff760e195baef\": rpc error: code = NotFound desc = could not find container \"05742a0a14e857f7e8a36d15f52011fc9a003130fec36d8aa5fff760e195baef\": container with ID starting with 05742a0a14e857f7e8a36d15f52011fc9a003130fec36d8aa5fff760e195baef not found: ID does not exist" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.970118 4902 scope.go:117] "RemoveContainer" containerID="331340a1821401cb77b1cb646d09fbe6e4a533e124150119def24d4b49b43548" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.970304 4902 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"331340a1821401cb77b1cb646d09fbe6e4a533e124150119def24d4b49b43548"} err="failed to get container status \"331340a1821401cb77b1cb646d09fbe6e4a533e124150119def24d4b49b43548\": rpc error: code = NotFound desc = could not find container \"331340a1821401cb77b1cb646d09fbe6e4a533e124150119def24d4b49b43548\": container with ID starting with 331340a1821401cb77b1cb646d09fbe6e4a533e124150119def24d4b49b43548 not found: ID does not exist" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.970325 4902 scope.go:117] "RemoveContainer" containerID="37a6498689268b5f29b030534ff3a670c9c396c8eaf55d5cb578c88de813cad0" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.970559 4902 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"37a6498689268b5f29b030534ff3a670c9c396c8eaf55d5cb578c88de813cad0"} err="failed to get container status \"37a6498689268b5f29b030534ff3a670c9c396c8eaf55d5cb578c88de813cad0\": rpc error: code = NotFound desc = could not find container \"37a6498689268b5f29b030534ff3a670c9c396c8eaf55d5cb578c88de813cad0\": container with ID starting with 37a6498689268b5f29b030534ff3a670c9c396c8eaf55d5cb578c88de813cad0 not found: ID does not exist" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.970580 4902 scope.go:117] "RemoveContainer" containerID="496d93d979f54ca0f058a4b462e45adda124206a2f9fd0a72a86953ac090e8e4" Oct 09 13:59:55 crc kubenswrapper[4902]: I1009 13:59:55.970837 4902 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"496d93d979f54ca0f058a4b462e45adda124206a2f9fd0a72a86953ac090e8e4"} err="failed to get container status \"496d93d979f54ca0f058a4b462e45adda124206a2f9fd0a72a86953ac090e8e4\": rpc error: code = NotFound desc = could not find container \"496d93d979f54ca0f058a4b462e45adda124206a2f9fd0a72a86953ac090e8e4\": container with ID starting with 496d93d979f54ca0f058a4b462e45adda124206a2f9fd0a72a86953ac090e8e4 not found: ID does not exist" Oct 09 13:59:56 crc kubenswrapper[4902]: I1009 13:59:56.729110 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-fcz75_593745e8-10e2-486a-8a32-9e2dc766bc55/kube-multus/0.log" Oct 09 13:59:56 crc kubenswrapper[4902]: I1009 13:59:56.729495 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-fcz75" event={"ID":"593745e8-10e2-486a-8a32-9e2dc766bc55","Type":"ContainerStarted","Data":"cbd3296ddcfcb3d386f8a515ac09e2e9288f2f26cfd7307bccde9ea44e4e2367"} Oct 09 13:59:56 crc kubenswrapper[4902]: I1009 13:59:56.733258 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8lmns" event={"ID":"89d08e85-df18-4c18-9816-986e948d319c","Type":"ContainerStarted","Data":"11d548e0b90ca984f4d6b2dad99e93b5dda1f486a5aa3ceea07958e6120b8b0b"} Oct 09 13:59:56 crc kubenswrapper[4902]: I1009 13:59:56.733302 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8lmns" event={"ID":"89d08e85-df18-4c18-9816-986e948d319c","Type":"ContainerStarted","Data":"48f7cbbce4cf7f56100f6f3fb95767e4863bd5a806431a272627dd96c6e7af6a"} Oct 09 13:59:56 crc kubenswrapper[4902]: I1009 13:59:56.733312 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8lmns" event={"ID":"89d08e85-df18-4c18-9816-986e948d319c","Type":"ContainerStarted","Data":"87fdb7d5946f506d576e091a22edd1326460b96357b09135ab274cb9508e996f"} Oct 09 13:59:56 crc kubenswrapper[4902]: I1009 13:59:56.733321 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8lmns" event={"ID":"89d08e85-df18-4c18-9816-986e948d319c","Type":"ContainerStarted","Data":"f9c498e4ecda2776fdb3b0c098ffb03ee86c8f1a34e66a89bc314275db4fe755"} Oct 09 13:59:56 crc kubenswrapper[4902]: I1009 13:59:56.733329 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8lmns" event={"ID":"89d08e85-df18-4c18-9816-986e948d319c","Type":"ContainerStarted","Data":"b5444495a4a80ad8f6211f9da4833647a1e6c7345102c83c108ee58c01b346e0"} Oct 09 13:59:56 crc kubenswrapper[4902]: I1009 13:59:56.733338 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
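The run of "RemoveContainer" / "DeleteContainer returned error" entries above is the kubelet retrying deletion of container records that CRI-O has already dropped; a NotFound reply means there is nothing left to remove, so the errors are benign. A minimal sketch of that idempotent handling, with a stand-in function in place of the real CRI RemoveContainer call (not the kubelet's actual code):

```go
package main

import (
	"fmt"

	"google.golang.org/grpc/codes"
	"google.golang.org/grpc/status"
)

// removeContainer treats a NotFound response from the runtime as success:
// if the runtime has no record of the container, the desired state
// (container removed) already holds. remove stands in for the CRI RPC.
func removeContainer(id string, remove func(string) error) error {
	if err := remove(id); err != nil {
		if status.Code(err) == codes.NotFound {
			return nil // already gone, nothing to do
		}
		return fmt.Errorf("remove container %s: %w", id, err)
	}
	return nil
}

func main() {
	// Simulate the runtime answering NotFound, as in the entries above.
	fake := func(id string) error {
		return status.Errorf(codes.NotFound, "could not find container %q", id)
	}
	fmt.Println(removeContainer("db95b45e4b12", fake)) // prints <nil>
}
```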
pod="openshift-ovn-kubernetes/ovnkube-node-8lmns" event={"ID":"89d08e85-df18-4c18-9816-986e948d319c","Type":"ContainerStarted","Data":"54aa58246ff168ddd26c74f74466fda19e297283474ebcef2c1e95082def4139"} Oct 09 13:59:57 crc kubenswrapper[4902]: I1009 13:59:57.522256 4902 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4904c756-7ed4-4719-860f-c6f6458d002c" path="/var/lib/kubelet/pods/4904c756-7ed4-4719-860f-c6f6458d002c/volumes" Oct 09 13:59:58 crc kubenswrapper[4902]: I1009 13:59:58.748263 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8lmns" event={"ID":"89d08e85-df18-4c18-9816-986e948d319c","Type":"ContainerStarted","Data":"8a83c129d99228c59b547d402dfc4a9e35e61f904802ff8967756806d3ba0466"} Oct 09 14:00:00 crc kubenswrapper[4902]: I1009 14:00:00.131950 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29333640-dlvs7"] Oct 09 14:00:00 crc kubenswrapper[4902]: I1009 14:00:00.133478 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29333640-dlvs7" Oct 09 14:00:00 crc kubenswrapper[4902]: I1009 14:00:00.136487 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 09 14:00:00 crc kubenswrapper[4902]: I1009 14:00:00.138026 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 09 14:00:00 crc kubenswrapper[4902]: I1009 14:00:00.279326 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j4rtc\" (UniqueName: \"kubernetes.io/projected/10411e0c-6c14-4ede-9c44-e252a84a39cb-kube-api-access-j4rtc\") pod \"collect-profiles-29333640-dlvs7\" (UID: \"10411e0c-6c14-4ede-9c44-e252a84a39cb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333640-dlvs7" Oct 09 14:00:00 crc kubenswrapper[4902]: I1009 14:00:00.279561 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/10411e0c-6c14-4ede-9c44-e252a84a39cb-secret-volume\") pod \"collect-profiles-29333640-dlvs7\" (UID: \"10411e0c-6c14-4ede-9c44-e252a84a39cb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333640-dlvs7" Oct 09 14:00:00 crc kubenswrapper[4902]: I1009 14:00:00.279610 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/10411e0c-6c14-4ede-9c44-e252a84a39cb-config-volume\") pod \"collect-profiles-29333640-dlvs7\" (UID: \"10411e0c-6c14-4ede-9c44-e252a84a39cb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333640-dlvs7" Oct 09 14:00:00 crc kubenswrapper[4902]: I1009 14:00:00.380828 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/10411e0c-6c14-4ede-9c44-e252a84a39cb-secret-volume\") pod \"collect-profiles-29333640-dlvs7\" (UID: \"10411e0c-6c14-4ede-9c44-e252a84a39cb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333640-dlvs7" Oct 09 14:00:00 crc kubenswrapper[4902]: I1009 14:00:00.380903 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/10411e0c-6c14-4ede-9c44-e252a84a39cb-config-volume\") pod \"collect-profiles-29333640-dlvs7\" (UID: \"10411e0c-6c14-4ede-9c44-e252a84a39cb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333640-dlvs7" Oct 09 14:00:00 crc kubenswrapper[4902]: I1009 14:00:00.380981 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j4rtc\" (UniqueName: \"kubernetes.io/projected/10411e0c-6c14-4ede-9c44-e252a84a39cb-kube-api-access-j4rtc\") pod \"collect-profiles-29333640-dlvs7\" (UID: \"10411e0c-6c14-4ede-9c44-e252a84a39cb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333640-dlvs7" Oct 09 14:00:00 crc kubenswrapper[4902]: I1009 14:00:00.382156 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/10411e0c-6c14-4ede-9c44-e252a84a39cb-config-volume\") pod \"collect-profiles-29333640-dlvs7\" (UID: \"10411e0c-6c14-4ede-9c44-e252a84a39cb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333640-dlvs7" Oct 09 14:00:00 crc kubenswrapper[4902]: I1009 14:00:00.393613 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/10411e0c-6c14-4ede-9c44-e252a84a39cb-secret-volume\") pod \"collect-profiles-29333640-dlvs7\" (UID: \"10411e0c-6c14-4ede-9c44-e252a84a39cb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333640-dlvs7" Oct 09 14:00:00 crc kubenswrapper[4902]: I1009 14:00:00.400054 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j4rtc\" (UniqueName: \"kubernetes.io/projected/10411e0c-6c14-4ede-9c44-e252a84a39cb-kube-api-access-j4rtc\") pod \"collect-profiles-29333640-dlvs7\" (UID: \"10411e0c-6c14-4ede-9c44-e252a84a39cb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333640-dlvs7" Oct 09 14:00:00 crc kubenswrapper[4902]: I1009 14:00:00.450616 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29333640-dlvs7" Oct 09 14:00:00 crc kubenswrapper[4902]: E1009 14:00:00.471357 4902 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_collect-profiles-29333640-dlvs7_openshift-operator-lifecycle-manager_10411e0c-6c14-4ede-9c44-e252a84a39cb_0(d22b72d203530ea240adcb22cc8cee0ab37a218dbf21edfad82730d5727a7d01): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 09 14:00:00 crc kubenswrapper[4902]: E1009 14:00:00.471500 4902 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_collect-profiles-29333640-dlvs7_openshift-operator-lifecycle-manager_10411e0c-6c14-4ede-9c44-e252a84a39cb_0(d22b72d203530ea240adcb22cc8cee0ab37a218dbf21edfad82730d5727a7d01): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29333640-dlvs7" Oct 09 14:00:00 crc kubenswrapper[4902]: E1009 14:00:00.471535 4902 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_collect-profiles-29333640-dlvs7_openshift-operator-lifecycle-manager_10411e0c-6c14-4ede-9c44-e252a84a39cb_0(d22b72d203530ea240adcb22cc8cee0ab37a218dbf21edfad82730d5727a7d01): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operator-lifecycle-manager/collect-profiles-29333640-dlvs7" Oct 09 14:00:00 crc kubenswrapper[4902]: E1009 14:00:00.471658 4902 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"collect-profiles-29333640-dlvs7_openshift-operator-lifecycle-manager(10411e0c-6c14-4ede-9c44-e252a84a39cb)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"collect-profiles-29333640-dlvs7_openshift-operator-lifecycle-manager(10411e0c-6c14-4ede-9c44-e252a84a39cb)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_collect-profiles-29333640-dlvs7_openshift-operator-lifecycle-manager_10411e0c-6c14-4ede-9c44-e252a84a39cb_0(d22b72d203530ea240adcb22cc8cee0ab37a218dbf21edfad82730d5727a7d01): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operator-lifecycle-manager/collect-profiles-29333640-dlvs7" podUID="10411e0c-6c14-4ede-9c44-e252a84a39cb" Oct 09 14:00:01 crc kubenswrapper[4902]: I1009 14:00:01.659885 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29333640-dlvs7"] Oct 09 14:00:01 crc kubenswrapper[4902]: I1009 14:00:01.660581 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29333640-dlvs7" Oct 09 14:00:01 crc kubenswrapper[4902]: I1009 14:00:01.661146 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29333640-dlvs7" Oct 09 14:00:01 crc kubenswrapper[4902]: E1009 14:00:01.689803 4902 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_collect-profiles-29333640-dlvs7_openshift-operator-lifecycle-manager_10411e0c-6c14-4ede-9c44-e252a84a39cb_0(926171d005d11bb23131985513e2563dfe22892b52578494f3c07c9eb1d4c00e): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 09 14:00:01 crc kubenswrapper[4902]: E1009 14:00:01.689877 4902 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_collect-profiles-29333640-dlvs7_openshift-operator-lifecycle-manager_10411e0c-6c14-4ede-9c44-e252a84a39cb_0(926171d005d11bb23131985513e2563dfe22892b52578494f3c07c9eb1d4c00e): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29333640-dlvs7" Oct 09 14:00:01 crc kubenswrapper[4902]: E1009 14:00:01.689903 4902 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_collect-profiles-29333640-dlvs7_openshift-operator-lifecycle-manager_10411e0c-6c14-4ede-9c44-e252a84a39cb_0(926171d005d11bb23131985513e2563dfe22892b52578494f3c07c9eb1d4c00e): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operator-lifecycle-manager/collect-profiles-29333640-dlvs7" Oct 09 14:00:01 crc kubenswrapper[4902]: E1009 14:00:01.689957 4902 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"collect-profiles-29333640-dlvs7_openshift-operator-lifecycle-manager(10411e0c-6c14-4ede-9c44-e252a84a39cb)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"collect-profiles-29333640-dlvs7_openshift-operator-lifecycle-manager(10411e0c-6c14-4ede-9c44-e252a84a39cb)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_collect-profiles-29333640-dlvs7_openshift-operator-lifecycle-manager_10411e0c-6c14-4ede-9c44-e252a84a39cb_0(926171d005d11bb23131985513e2563dfe22892b52578494f3c07c9eb1d4c00e): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operator-lifecycle-manager/collect-profiles-29333640-dlvs7" podUID="10411e0c-6c14-4ede-9c44-e252a84a39cb" Oct 09 14:00:01 crc kubenswrapper[4902]: I1009 14:00:01.773445 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8lmns" event={"ID":"89d08e85-df18-4c18-9816-986e948d319c","Type":"ContainerStarted","Data":"8bccbad4bc5432037029d84d8bcc1215178504d66bec43b687fa1a5969853624"} Oct 09 14:00:01 crc kubenswrapper[4902]: I1009 14:00:01.773682 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-8lmns" Oct 09 14:00:01 crc kubenswrapper[4902]: I1009 14:00:01.773749 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-8lmns" Oct 09 14:00:01 crc kubenswrapper[4902]: I1009 14:00:01.773926 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-8lmns" Oct 09 14:00:01 crc kubenswrapper[4902]: I1009 14:00:01.803277 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-8lmns" podStartSLOduration=6.803259207 podStartE2EDuration="6.803259207s" podCreationTimestamp="2025-10-09 13:59:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 14:00:01.800747854 +0000 UTC m=+548.998606928" watchObservedRunningTime="2025-10-09 14:00:01.803259207 +0000 UTC m=+549.001118271" Oct 09 14:00:01 crc kubenswrapper[4902]: I1009 14:00:01.804138 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-8lmns" Oct 09 14:00:01 crc kubenswrapper[4902]: I1009 14:00:01.805502 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-8lmns" Oct 09 14:00:13 crc kubenswrapper[4902]: I1009 14:00:13.513036 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29333640-dlvs7" Oct 09 14:00:13 crc kubenswrapper[4902]: I1009 14:00:13.518490 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29333640-dlvs7" Oct 09 14:00:13 crc kubenswrapper[4902]: I1009 14:00:13.728897 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29333640-dlvs7"] Oct 09 14:00:13 crc kubenswrapper[4902]: W1009 14:00:13.734454 4902 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod10411e0c_6c14_4ede_9c44_e252a84a39cb.slice/crio-f7d108d17ee516640132f3a74bc9763b358ef6d20e774798b1e7e645f10d8cb9 WatchSource:0}: Error finding container f7d108d17ee516640132f3a74bc9763b358ef6d20e774798b1e7e645f10d8cb9: Status 404 returned error can't find the container with id f7d108d17ee516640132f3a74bc9763b358ef6d20e774798b1e7e645f10d8cb9 Oct 09 14:00:13 crc kubenswrapper[4902]: I1009 14:00:13.841072 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29333640-dlvs7" event={"ID":"10411e0c-6c14-4ede-9c44-e252a84a39cb","Type":"ContainerStarted","Data":"f7d108d17ee516640132f3a74bc9763b358ef6d20e774798b1e7e645f10d8cb9"} Oct 09 14:00:14 crc kubenswrapper[4902]: I1009 14:00:14.850254 4902 generic.go:334] "Generic (PLEG): container finished" podID="10411e0c-6c14-4ede-9c44-e252a84a39cb" containerID="5e16d5daf85e63114be5f50fdb22be47d97d269e7269eff41d9878902d78f21f" exitCode=0 Oct 09 14:00:14 crc kubenswrapper[4902]: I1009 14:00:14.850559 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29333640-dlvs7" event={"ID":"10411e0c-6c14-4ede-9c44-e252a84a39cb","Type":"ContainerDied","Data":"5e16d5daf85e63114be5f50fdb22be47d97d269e7269eff41d9878902d78f21f"} Oct 09 14:00:16 crc kubenswrapper[4902]: I1009 14:00:16.097047 4902 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29333640-dlvs7" Oct 09 14:00:16 crc kubenswrapper[4902]: I1009 14:00:16.293712 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j4rtc\" (UniqueName: \"kubernetes.io/projected/10411e0c-6c14-4ede-9c44-e252a84a39cb-kube-api-access-j4rtc\") pod \"10411e0c-6c14-4ede-9c44-e252a84a39cb\" (UID: \"10411e0c-6c14-4ede-9c44-e252a84a39cb\") " Oct 09 14:00:16 crc kubenswrapper[4902]: I1009 14:00:16.293795 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/10411e0c-6c14-4ede-9c44-e252a84a39cb-secret-volume\") pod \"10411e0c-6c14-4ede-9c44-e252a84a39cb\" (UID: \"10411e0c-6c14-4ede-9c44-e252a84a39cb\") " Oct 09 14:00:16 crc kubenswrapper[4902]: I1009 14:00:16.293862 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/10411e0c-6c14-4ede-9c44-e252a84a39cb-config-volume\") pod \"10411e0c-6c14-4ede-9c44-e252a84a39cb\" (UID: \"10411e0c-6c14-4ede-9c44-e252a84a39cb\") " Oct 09 14:00:16 crc kubenswrapper[4902]: I1009 14:00:16.294696 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/10411e0c-6c14-4ede-9c44-e252a84a39cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "10411e0c-6c14-4ede-9c44-e252a84a39cb" (UID: "10411e0c-6c14-4ede-9c44-e252a84a39cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 14:00:16 crc kubenswrapper[4902]: I1009 14:00:16.299059 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10411e0c-6c14-4ede-9c44-e252a84a39cb-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "10411e0c-6c14-4ede-9c44-e252a84a39cb" (UID: "10411e0c-6c14-4ede-9c44-e252a84a39cb"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:00:16 crc kubenswrapper[4902]: I1009 14:00:16.299091 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10411e0c-6c14-4ede-9c44-e252a84a39cb-kube-api-access-j4rtc" (OuterVolumeSpecName: "kube-api-access-j4rtc") pod "10411e0c-6c14-4ede-9c44-e252a84a39cb" (UID: "10411e0c-6c14-4ede-9c44-e252a84a39cb"). InnerVolumeSpecName "kube-api-access-j4rtc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 14:00:16 crc kubenswrapper[4902]: I1009 14:00:16.395545 4902 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/10411e0c-6c14-4ede-9c44-e252a84a39cb-config-volume\") on node \"crc\" DevicePath \"\"" Oct 09 14:00:16 crc kubenswrapper[4902]: I1009 14:00:16.395816 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j4rtc\" (UniqueName: \"kubernetes.io/projected/10411e0c-6c14-4ede-9c44-e252a84a39cb-kube-api-access-j4rtc\") on node \"crc\" DevicePath \"\"" Oct 09 14:00:16 crc kubenswrapper[4902]: I1009 14:00:16.395830 4902 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/10411e0c-6c14-4ede-9c44-e252a84a39cb-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 09 14:00:16 crc kubenswrapper[4902]: I1009 14:00:16.865778 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29333640-dlvs7" event={"ID":"10411e0c-6c14-4ede-9c44-e252a84a39cb","Type":"ContainerDied","Data":"f7d108d17ee516640132f3a74bc9763b358ef6d20e774798b1e7e645f10d8cb9"} Oct 09 14:00:16 crc kubenswrapper[4902]: I1009 14:00:16.865825 4902 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f7d108d17ee516640132f3a74bc9763b358ef6d20e774798b1e7e645f10d8cb9" Oct 09 14:00:16 crc kubenswrapper[4902]: I1009 14:00:16.865900 4902 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29333640-dlvs7" Oct 09 14:00:20 crc kubenswrapper[4902]: I1009 14:00:20.077748 4902 patch_prober.go:28] interesting pod/machine-config-daemon-gbt7s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 14:00:20 crc kubenswrapper[4902]: I1009 14:00:20.079062 4902 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" podUID="6cfbac91-e798-4e5e-9f3c-f454ea6f457e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 14:00:20 crc kubenswrapper[4902]: I1009 14:00:20.079230 4902 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" Oct 09 14:00:20 crc kubenswrapper[4902]: I1009 14:00:20.080098 4902 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f04c452240506df7a71fcb78dd8a43d0fd5718ad3d38cb3de0f83c0e40d74e5b"} pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 09 14:00:20 crc kubenswrapper[4902]: I1009 14:00:20.080308 4902 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" podUID="6cfbac91-e798-4e5e-9f3c-f454ea6f457e" containerName="machine-config-daemon" containerID="cri-o://f04c452240506df7a71fcb78dd8a43d0fd5718ad3d38cb3de0f83c0e40d74e5b" gracePeriod=600 Oct 09 14:00:20 crc kubenswrapper[4902]: I1009 14:00:20.889334 4902 generic.go:334] "Generic (PLEG): container finished" 
podID="6cfbac91-e798-4e5e-9f3c-f454ea6f457e" containerID="f04c452240506df7a71fcb78dd8a43d0fd5718ad3d38cb3de0f83c0e40d74e5b" exitCode=0 Oct 09 14:00:20 crc kubenswrapper[4902]: I1009 14:00:20.889398 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" event={"ID":"6cfbac91-e798-4e5e-9f3c-f454ea6f457e","Type":"ContainerDied","Data":"f04c452240506df7a71fcb78dd8a43d0fd5718ad3d38cb3de0f83c0e40d74e5b"} Oct 09 14:00:20 crc kubenswrapper[4902]: I1009 14:00:20.889453 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" event={"ID":"6cfbac91-e798-4e5e-9f3c-f454ea6f457e","Type":"ContainerStarted","Data":"e23319de8b44d2e9bc647fc2e977cf773ec98bd13c09e93b25a2e7f2c57468fd"} Oct 09 14:00:20 crc kubenswrapper[4902]: I1009 14:00:20.889474 4902 scope.go:117] "RemoveContainer" containerID="f09f58240f5e4802db6796459ff40ef0e937c5566e9e69ea5030c30651138876" Oct 09 14:00:25 crc kubenswrapper[4902]: I1009 14:00:25.469424 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-8lmns" Oct 09 14:00:35 crc kubenswrapper[4902]: I1009 14:00:35.455664 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cj5c5n"] Oct 09 14:00:35 crc kubenswrapper[4902]: E1009 14:00:35.456615 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10411e0c-6c14-4ede-9c44-e252a84a39cb" containerName="collect-profiles" Oct 09 14:00:35 crc kubenswrapper[4902]: I1009 14:00:35.456629 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="10411e0c-6c14-4ede-9c44-e252a84a39cb" containerName="collect-profiles" Oct 09 14:00:35 crc kubenswrapper[4902]: I1009 14:00:35.456727 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="10411e0c-6c14-4ede-9c44-e252a84a39cb" containerName="collect-profiles" Oct 09 14:00:35 crc kubenswrapper[4902]: I1009 14:00:35.458747 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cj5c5n" Oct 09 14:00:35 crc kubenswrapper[4902]: I1009 14:00:35.460996 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Oct 09 14:00:35 crc kubenswrapper[4902]: I1009 14:00:35.468873 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cj5c5n"] Oct 09 14:00:35 crc kubenswrapper[4902]: I1009 14:00:35.660968 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p8hx9\" (UniqueName: \"kubernetes.io/projected/3b4fc06f-f461-4486-83e7-4d153cc9ef10-kube-api-access-p8hx9\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cj5c5n\" (UID: \"3b4fc06f-f461-4486-83e7-4d153cc9ef10\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cj5c5n" Oct 09 14:00:35 crc kubenswrapper[4902]: I1009 14:00:35.661025 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3b4fc06f-f461-4486-83e7-4d153cc9ef10-util\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cj5c5n\" (UID: \"3b4fc06f-f461-4486-83e7-4d153cc9ef10\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cj5c5n" Oct 09 14:00:35 crc kubenswrapper[4902]: I1009 14:00:35.661060 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3b4fc06f-f461-4486-83e7-4d153cc9ef10-bundle\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cj5c5n\" (UID: \"3b4fc06f-f461-4486-83e7-4d153cc9ef10\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cj5c5n" Oct 09 14:00:35 crc kubenswrapper[4902]: I1009 14:00:35.762175 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p8hx9\" (UniqueName: \"kubernetes.io/projected/3b4fc06f-f461-4486-83e7-4d153cc9ef10-kube-api-access-p8hx9\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cj5c5n\" (UID: \"3b4fc06f-f461-4486-83e7-4d153cc9ef10\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cj5c5n" Oct 09 14:00:35 crc kubenswrapper[4902]: I1009 14:00:35.762223 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3b4fc06f-f461-4486-83e7-4d153cc9ef10-util\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cj5c5n\" (UID: \"3b4fc06f-f461-4486-83e7-4d153cc9ef10\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cj5c5n" Oct 09 14:00:35 crc kubenswrapper[4902]: I1009 14:00:35.762247 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3b4fc06f-f461-4486-83e7-4d153cc9ef10-bundle\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cj5c5n\" (UID: \"3b4fc06f-f461-4486-83e7-4d153cc9ef10\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cj5c5n" Oct 09 14:00:35 crc kubenswrapper[4902]: I1009 14:00:35.762795 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/3b4fc06f-f461-4486-83e7-4d153cc9ef10-bundle\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cj5c5n\" (UID: \"3b4fc06f-f461-4486-83e7-4d153cc9ef10\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cj5c5n" Oct 09 14:00:35 crc kubenswrapper[4902]: I1009 14:00:35.763045 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3b4fc06f-f461-4486-83e7-4d153cc9ef10-util\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cj5c5n\" (UID: \"3b4fc06f-f461-4486-83e7-4d153cc9ef10\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cj5c5n" Oct 09 14:00:35 crc kubenswrapper[4902]: I1009 14:00:35.786979 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p8hx9\" (UniqueName: \"kubernetes.io/projected/3b4fc06f-f461-4486-83e7-4d153cc9ef10-kube-api-access-p8hx9\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cj5c5n\" (UID: \"3b4fc06f-f461-4486-83e7-4d153cc9ef10\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cj5c5n" Oct 09 14:00:35 crc kubenswrapper[4902]: I1009 14:00:35.787491 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cj5c5n" Oct 09 14:00:35 crc kubenswrapper[4902]: I1009 14:00:35.996545 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cj5c5n"] Oct 09 14:00:36 crc kubenswrapper[4902]: I1009 14:00:36.984904 4902 generic.go:334] "Generic (PLEG): container finished" podID="3b4fc06f-f461-4486-83e7-4d153cc9ef10" containerID="e1cdd03b3cc8fd61d84e09a3a9e7b43b211a38c086116590189f4ea0d3963879" exitCode=0 Oct 09 14:00:36 crc kubenswrapper[4902]: I1009 14:00:36.984960 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cj5c5n" event={"ID":"3b4fc06f-f461-4486-83e7-4d153cc9ef10","Type":"ContainerDied","Data":"e1cdd03b3cc8fd61d84e09a3a9e7b43b211a38c086116590189f4ea0d3963879"} Oct 09 14:00:36 crc kubenswrapper[4902]: I1009 14:00:36.986264 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cj5c5n" event={"ID":"3b4fc06f-f461-4486-83e7-4d153cc9ef10","Type":"ContainerStarted","Data":"6dabb19d857144161ad663cd1f1af0156355044871208329358e0dd1b46e736b"} Oct 09 14:00:38 crc kubenswrapper[4902]: I1009 14:00:38.997866 4902 generic.go:334] "Generic (PLEG): container finished" podID="3b4fc06f-f461-4486-83e7-4d153cc9ef10" containerID="835a7cc390105311ec9070fba8b11bf665925dba8520387f2f15419377613f89" exitCode=0 Oct 09 14:00:38 crc kubenswrapper[4902]: I1009 14:00:38.997904 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cj5c5n" event={"ID":"3b4fc06f-f461-4486-83e7-4d153cc9ef10","Type":"ContainerDied","Data":"835a7cc390105311ec9070fba8b11bf665925dba8520387f2f15419377613f89"} Oct 09 14:00:40 crc kubenswrapper[4902]: I1009 14:00:40.005764 4902 generic.go:334] "Generic (PLEG): container finished" podID="3b4fc06f-f461-4486-83e7-4d153cc9ef10" containerID="7d6bcc0c28b3abcd1a167e1fc457c3c0bf8f403fe1e920c75ff2607d68e03a31" exitCode=0 Oct 09 14:00:40 crc kubenswrapper[4902]: I1009 
14:00:40.005848 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cj5c5n" event={"ID":"3b4fc06f-f461-4486-83e7-4d153cc9ef10","Type":"ContainerDied","Data":"7d6bcc0c28b3abcd1a167e1fc457c3c0bf8f403fe1e920c75ff2607d68e03a31"} Oct 09 14:00:41 crc kubenswrapper[4902]: I1009 14:00:41.274232 4902 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cj5c5n" Oct 09 14:00:41 crc kubenswrapper[4902]: I1009 14:00:41.439163 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3b4fc06f-f461-4486-83e7-4d153cc9ef10-util\") pod \"3b4fc06f-f461-4486-83e7-4d153cc9ef10\" (UID: \"3b4fc06f-f461-4486-83e7-4d153cc9ef10\") " Oct 09 14:00:41 crc kubenswrapper[4902]: I1009 14:00:41.439285 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3b4fc06f-f461-4486-83e7-4d153cc9ef10-bundle\") pod \"3b4fc06f-f461-4486-83e7-4d153cc9ef10\" (UID: \"3b4fc06f-f461-4486-83e7-4d153cc9ef10\") " Oct 09 14:00:41 crc kubenswrapper[4902]: I1009 14:00:41.439357 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p8hx9\" (UniqueName: \"kubernetes.io/projected/3b4fc06f-f461-4486-83e7-4d153cc9ef10-kube-api-access-p8hx9\") pod \"3b4fc06f-f461-4486-83e7-4d153cc9ef10\" (UID: \"3b4fc06f-f461-4486-83e7-4d153cc9ef10\") " Oct 09 14:00:41 crc kubenswrapper[4902]: I1009 14:00:41.440206 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3b4fc06f-f461-4486-83e7-4d153cc9ef10-bundle" (OuterVolumeSpecName: "bundle") pod "3b4fc06f-f461-4486-83e7-4d153cc9ef10" (UID: "3b4fc06f-f461-4486-83e7-4d153cc9ef10"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 14:00:41 crc kubenswrapper[4902]: I1009 14:00:41.447548 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b4fc06f-f461-4486-83e7-4d153cc9ef10-kube-api-access-p8hx9" (OuterVolumeSpecName: "kube-api-access-p8hx9") pod "3b4fc06f-f461-4486-83e7-4d153cc9ef10" (UID: "3b4fc06f-f461-4486-83e7-4d153cc9ef10"). InnerVolumeSpecName "kube-api-access-p8hx9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 14:00:41 crc kubenswrapper[4902]: I1009 14:00:41.463757 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3b4fc06f-f461-4486-83e7-4d153cc9ef10-util" (OuterVolumeSpecName: "util") pod "3b4fc06f-f461-4486-83e7-4d153cc9ef10" (UID: "3b4fc06f-f461-4486-83e7-4d153cc9ef10"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 14:00:41 crc kubenswrapper[4902]: I1009 14:00:41.541503 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p8hx9\" (UniqueName: \"kubernetes.io/projected/3b4fc06f-f461-4486-83e7-4d153cc9ef10-kube-api-access-p8hx9\") on node \"crc\" DevicePath \"\"" Oct 09 14:00:41 crc kubenswrapper[4902]: I1009 14:00:41.541708 4902 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3b4fc06f-f461-4486-83e7-4d153cc9ef10-util\") on node \"crc\" DevicePath \"\"" Oct 09 14:00:41 crc kubenswrapper[4902]: I1009 14:00:41.541880 4902 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3b4fc06f-f461-4486-83e7-4d153cc9ef10-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 14:00:42 crc kubenswrapper[4902]: I1009 14:00:42.017777 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cj5c5n" event={"ID":"3b4fc06f-f461-4486-83e7-4d153cc9ef10","Type":"ContainerDied","Data":"6dabb19d857144161ad663cd1f1af0156355044871208329358e0dd1b46e736b"} Oct 09 14:00:42 crc kubenswrapper[4902]: I1009 14:00:42.017812 4902 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6dabb19d857144161ad663cd1f1af0156355044871208329358e0dd1b46e736b" Oct 09 14:00:42 crc kubenswrapper[4902]: I1009 14:00:42.017857 4902 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cj5c5n" Oct 09 14:00:43 crc kubenswrapper[4902]: I1009 14:00:43.077664 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-858ddd8f98-rt479"] Oct 09 14:00:43 crc kubenswrapper[4902]: E1009 14:00:43.077878 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b4fc06f-f461-4486-83e7-4d153cc9ef10" containerName="extract" Oct 09 14:00:43 crc kubenswrapper[4902]: I1009 14:00:43.077889 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b4fc06f-f461-4486-83e7-4d153cc9ef10" containerName="extract" Oct 09 14:00:43 crc kubenswrapper[4902]: E1009 14:00:43.077906 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b4fc06f-f461-4486-83e7-4d153cc9ef10" containerName="util" Oct 09 14:00:43 crc kubenswrapper[4902]: I1009 14:00:43.077912 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b4fc06f-f461-4486-83e7-4d153cc9ef10" containerName="util" Oct 09 14:00:43 crc kubenswrapper[4902]: E1009 14:00:43.077920 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b4fc06f-f461-4486-83e7-4d153cc9ef10" containerName="pull" Oct 09 14:00:43 crc kubenswrapper[4902]: I1009 14:00:43.077927 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b4fc06f-f461-4486-83e7-4d153cc9ef10" containerName="pull" Oct 09 14:00:43 crc kubenswrapper[4902]: I1009 14:00:43.078019 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b4fc06f-f461-4486-83e7-4d153cc9ef10" containerName="extract" Oct 09 14:00:43 crc kubenswrapper[4902]: I1009 14:00:43.078400 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-858ddd8f98-rt479" Oct 09 14:00:43 crc kubenswrapper[4902]: I1009 14:00:43.080546 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Oct 09 14:00:43 crc kubenswrapper[4902]: I1009 14:00:43.080771 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-f9xl7" Oct 09 14:00:43 crc kubenswrapper[4902]: I1009 14:00:43.081349 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Oct 09 14:00:43 crc kubenswrapper[4902]: I1009 14:00:43.091037 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-858ddd8f98-rt479"] Oct 09 14:00:43 crc kubenswrapper[4902]: I1009 14:00:43.262622 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pjst6\" (UniqueName: \"kubernetes.io/projected/e458690a-7a6b-4b1f-92e3-a93667bf1d60-kube-api-access-pjst6\") pod \"nmstate-operator-858ddd8f98-rt479\" (UID: \"e458690a-7a6b-4b1f-92e3-a93667bf1d60\") " pod="openshift-nmstate/nmstate-operator-858ddd8f98-rt479" Oct 09 14:00:43 crc kubenswrapper[4902]: I1009 14:00:43.363733 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pjst6\" (UniqueName: \"kubernetes.io/projected/e458690a-7a6b-4b1f-92e3-a93667bf1d60-kube-api-access-pjst6\") pod \"nmstate-operator-858ddd8f98-rt479\" (UID: \"e458690a-7a6b-4b1f-92e3-a93667bf1d60\") " pod="openshift-nmstate/nmstate-operator-858ddd8f98-rt479" Oct 09 14:00:43 crc kubenswrapper[4902]: I1009 14:00:43.382616 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pjst6\" (UniqueName: \"kubernetes.io/projected/e458690a-7a6b-4b1f-92e3-a93667bf1d60-kube-api-access-pjst6\") pod \"nmstate-operator-858ddd8f98-rt479\" (UID: \"e458690a-7a6b-4b1f-92e3-a93667bf1d60\") " pod="openshift-nmstate/nmstate-operator-858ddd8f98-rt479" Oct 09 14:00:43 crc kubenswrapper[4902]: I1009 14:00:43.394171 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-858ddd8f98-rt479" Oct 09 14:00:43 crc kubenswrapper[4902]: I1009 14:00:43.578729 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-858ddd8f98-rt479"] Oct 09 14:00:43 crc kubenswrapper[4902]: W1009 14:00:43.584555 4902 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode458690a_7a6b_4b1f_92e3_a93667bf1d60.slice/crio-ee51315b5cc867b6188899f806b03d479d8fee58658fe44bd98308e36a7d3f95 WatchSource:0}: Error finding container ee51315b5cc867b6188899f806b03d479d8fee58658fe44bd98308e36a7d3f95: Status 404 returned error can't find the container with id ee51315b5cc867b6188899f806b03d479d8fee58658fe44bd98308e36a7d3f95 Oct 09 14:00:44 crc kubenswrapper[4902]: I1009 14:00:44.030209 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-858ddd8f98-rt479" event={"ID":"e458690a-7a6b-4b1f-92e3-a93667bf1d60","Type":"ContainerStarted","Data":"ee51315b5cc867b6188899f806b03d479d8fee58658fe44bd98308e36a7d3f95"} Oct 09 14:00:47 crc kubenswrapper[4902]: I1009 14:00:47.055366 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-858ddd8f98-rt479" event={"ID":"e458690a-7a6b-4b1f-92e3-a93667bf1d60","Type":"ContainerStarted","Data":"ebfeddba2aa49ed1d7c50a97a41f16c19b9c7b12b9a2b4003b1b072beaf464cc"} Oct 09 14:00:47 crc kubenswrapper[4902]: I1009 14:00:47.075953 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-858ddd8f98-rt479" podStartSLOduration=1.385340502 podStartE2EDuration="4.075930431s" podCreationTimestamp="2025-10-09 14:00:43 +0000 UTC" firstStartedPulling="2025-10-09 14:00:43.587346654 +0000 UTC m=+590.785205718" lastFinishedPulling="2025-10-09 14:00:46.277936583 +0000 UTC m=+593.475795647" observedRunningTime="2025-10-09 14:00:47.073256673 +0000 UTC m=+594.271115747" watchObservedRunningTime="2025-10-09 14:00:47.075930431 +0000 UTC m=+594.273789515" Oct 09 14:00:47 crc kubenswrapper[4902]: I1009 14:00:47.974034 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-fdff9cb8d-t5rg7"] Oct 09 14:00:47 crc kubenswrapper[4902]: I1009 14:00:47.975527 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-t5rg7" Oct 09 14:00:47 crc kubenswrapper[4902]: I1009 14:00:47.979469 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-z9n82" Oct 09 14:00:47 crc kubenswrapper[4902]: I1009 14:00:47.987482 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-fdff9cb8d-t5rg7"] Oct 09 14:00:47 crc kubenswrapper[4902]: I1009 14:00:47.999475 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-6cdbc54649-45tq4"] Oct 09 14:00:48 crc kubenswrapper[4902]: I1009 14:00:48.000228 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-45tq4" Oct 09 14:00:48 crc kubenswrapper[4902]: I1009 14:00:48.002185 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Oct 09 14:00:48 crc kubenswrapper[4902]: I1009 14:00:48.015747 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-jddjt"] Oct 09 14:00:48 crc kubenswrapper[4902]: I1009 14:00:48.016397 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-jddjt" Oct 09 14:00:48 crc kubenswrapper[4902]: I1009 14:00:48.033058 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-6cdbc54649-45tq4"] Oct 09 14:00:48 crc kubenswrapper[4902]: I1009 14:00:48.039800 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wg9lk\" (UniqueName: \"kubernetes.io/projected/47ba7105-e136-4d4e-8db2-5bb2edfb5a7b-kube-api-access-wg9lk\") pod \"nmstate-webhook-6cdbc54649-45tq4\" (UID: \"47ba7105-e136-4d4e-8db2-5bb2edfb5a7b\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-45tq4" Oct 09 14:00:48 crc kubenswrapper[4902]: I1009 14:00:48.039857 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/fcca1450-5178-488f-8ba6-b290ea61a2fb-dbus-socket\") pod \"nmstate-handler-jddjt\" (UID: \"fcca1450-5178-488f-8ba6-b290ea61a2fb\") " pod="openshift-nmstate/nmstate-handler-jddjt" Oct 09 14:00:48 crc kubenswrapper[4902]: I1009 14:00:48.039885 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rjqxq\" (UniqueName: \"kubernetes.io/projected/a526cd44-35b6-4800-bb53-fc7e1e6d96f8-kube-api-access-rjqxq\") pod \"nmstate-metrics-fdff9cb8d-t5rg7\" (UID: \"a526cd44-35b6-4800-bb53-fc7e1e6d96f8\") " pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-t5rg7" Oct 09 14:00:48 crc kubenswrapper[4902]: I1009 14:00:48.039922 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/fcca1450-5178-488f-8ba6-b290ea61a2fb-ovs-socket\") pod \"nmstate-handler-jddjt\" (UID: \"fcca1450-5178-488f-8ba6-b290ea61a2fb\") " pod="openshift-nmstate/nmstate-handler-jddjt" Oct 09 14:00:48 crc kubenswrapper[4902]: I1009 14:00:48.039998 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8sgqh\" (UniqueName: \"kubernetes.io/projected/fcca1450-5178-488f-8ba6-b290ea61a2fb-kube-api-access-8sgqh\") pod \"nmstate-handler-jddjt\" (UID: \"fcca1450-5178-488f-8ba6-b290ea61a2fb\") " pod="openshift-nmstate/nmstate-handler-jddjt" Oct 09 14:00:48 crc kubenswrapper[4902]: I1009 14:00:48.040025 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/fcca1450-5178-488f-8ba6-b290ea61a2fb-nmstate-lock\") pod \"nmstate-handler-jddjt\" (UID: \"fcca1450-5178-488f-8ba6-b290ea61a2fb\") " pod="openshift-nmstate/nmstate-handler-jddjt" Oct 09 14:00:48 crc kubenswrapper[4902]: I1009 14:00:48.040066 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/47ba7105-e136-4d4e-8db2-5bb2edfb5a7b-tls-key-pair\") pod 
\"nmstate-webhook-6cdbc54649-45tq4\" (UID: \"47ba7105-e136-4d4e-8db2-5bb2edfb5a7b\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-45tq4" Oct 09 14:00:48 crc kubenswrapper[4902]: I1009 14:00:48.141887 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wg9lk\" (UniqueName: \"kubernetes.io/projected/47ba7105-e136-4d4e-8db2-5bb2edfb5a7b-kube-api-access-wg9lk\") pod \"nmstate-webhook-6cdbc54649-45tq4\" (UID: \"47ba7105-e136-4d4e-8db2-5bb2edfb5a7b\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-45tq4" Oct 09 14:00:48 crc kubenswrapper[4902]: I1009 14:00:48.141942 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/fcca1450-5178-488f-8ba6-b290ea61a2fb-dbus-socket\") pod \"nmstate-handler-jddjt\" (UID: \"fcca1450-5178-488f-8ba6-b290ea61a2fb\") " pod="openshift-nmstate/nmstate-handler-jddjt" Oct 09 14:00:48 crc kubenswrapper[4902]: I1009 14:00:48.141972 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rjqxq\" (UniqueName: \"kubernetes.io/projected/a526cd44-35b6-4800-bb53-fc7e1e6d96f8-kube-api-access-rjqxq\") pod \"nmstate-metrics-fdff9cb8d-t5rg7\" (UID: \"a526cd44-35b6-4800-bb53-fc7e1e6d96f8\") " pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-t5rg7" Oct 09 14:00:48 crc kubenswrapper[4902]: I1009 14:00:48.142003 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/fcca1450-5178-488f-8ba6-b290ea61a2fb-ovs-socket\") pod \"nmstate-handler-jddjt\" (UID: \"fcca1450-5178-488f-8ba6-b290ea61a2fb\") " pod="openshift-nmstate/nmstate-handler-jddjt" Oct 09 14:00:48 crc kubenswrapper[4902]: I1009 14:00:48.142084 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8sgqh\" (UniqueName: \"kubernetes.io/projected/fcca1450-5178-488f-8ba6-b290ea61a2fb-kube-api-access-8sgqh\") pod \"nmstate-handler-jddjt\" (UID: \"fcca1450-5178-488f-8ba6-b290ea61a2fb\") " pod="openshift-nmstate/nmstate-handler-jddjt" Oct 09 14:00:48 crc kubenswrapper[4902]: I1009 14:00:48.142116 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/fcca1450-5178-488f-8ba6-b290ea61a2fb-nmstate-lock\") pod \"nmstate-handler-jddjt\" (UID: \"fcca1450-5178-488f-8ba6-b290ea61a2fb\") " pod="openshift-nmstate/nmstate-handler-jddjt" Oct 09 14:00:48 crc kubenswrapper[4902]: I1009 14:00:48.142157 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/47ba7105-e136-4d4e-8db2-5bb2edfb5a7b-tls-key-pair\") pod \"nmstate-webhook-6cdbc54649-45tq4\" (UID: \"47ba7105-e136-4d4e-8db2-5bb2edfb5a7b\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-45tq4" Oct 09 14:00:48 crc kubenswrapper[4902]: E1009 14:00:48.142304 4902 secret.go:188] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found Oct 09 14:00:48 crc kubenswrapper[4902]: I1009 14:00:48.146098 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/fcca1450-5178-488f-8ba6-b290ea61a2fb-ovs-socket\") pod \"nmstate-handler-jddjt\" (UID: \"fcca1450-5178-488f-8ba6-b290ea61a2fb\") " pod="openshift-nmstate/nmstate-handler-jddjt" Oct 09 14:00:48 crc kubenswrapper[4902]: I1009 14:00:48.146786 4902 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/fcca1450-5178-488f-8ba6-b290ea61a2fb-dbus-socket\") pod \"nmstate-handler-jddjt\" (UID: \"fcca1450-5178-488f-8ba6-b290ea61a2fb\") " pod="openshift-nmstate/nmstate-handler-jddjt" Oct 09 14:00:48 crc kubenswrapper[4902]: I1009 14:00:48.147136 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/fcca1450-5178-488f-8ba6-b290ea61a2fb-nmstate-lock\") pod \"nmstate-handler-jddjt\" (UID: \"fcca1450-5178-488f-8ba6-b290ea61a2fb\") " pod="openshift-nmstate/nmstate-handler-jddjt" Oct 09 14:00:48 crc kubenswrapper[4902]: E1009 14:00:48.151550 4902 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/47ba7105-e136-4d4e-8db2-5bb2edfb5a7b-tls-key-pair podName:47ba7105-e136-4d4e-8db2-5bb2edfb5a7b nodeName:}" failed. No retries permitted until 2025-10-09 14:00:48.642342573 +0000 UTC m=+595.840201637 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/47ba7105-e136-4d4e-8db2-5bb2edfb5a7b-tls-key-pair") pod "nmstate-webhook-6cdbc54649-45tq4" (UID: "47ba7105-e136-4d4e-8db2-5bb2edfb5a7b") : secret "openshift-nmstate-webhook" not found Oct 09 14:00:48 crc kubenswrapper[4902]: I1009 14:00:48.153176 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-6b874cbd85-whhgs"] Oct 09 14:00:48 crc kubenswrapper[4902]: I1009 14:00:48.154079 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-whhgs" Oct 09 14:00:48 crc kubenswrapper[4902]: I1009 14:00:48.156290 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Oct 09 14:00:48 crc kubenswrapper[4902]: I1009 14:00:48.156303 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Oct 09 14:00:48 crc kubenswrapper[4902]: I1009 14:00:48.156543 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-945zk" Oct 09 14:00:48 crc kubenswrapper[4902]: I1009 14:00:48.169513 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-6b874cbd85-whhgs"] Oct 09 14:00:48 crc kubenswrapper[4902]: I1009 14:00:48.189752 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wg9lk\" (UniqueName: \"kubernetes.io/projected/47ba7105-e136-4d4e-8db2-5bb2edfb5a7b-kube-api-access-wg9lk\") pod \"nmstate-webhook-6cdbc54649-45tq4\" (UID: \"47ba7105-e136-4d4e-8db2-5bb2edfb5a7b\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-45tq4" Oct 09 14:00:48 crc kubenswrapper[4902]: I1009 14:00:48.211053 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8sgqh\" (UniqueName: \"kubernetes.io/projected/fcca1450-5178-488f-8ba6-b290ea61a2fb-kube-api-access-8sgqh\") pod \"nmstate-handler-jddjt\" (UID: \"fcca1450-5178-488f-8ba6-b290ea61a2fb\") " pod="openshift-nmstate/nmstate-handler-jddjt" Oct 09 14:00:48 crc kubenswrapper[4902]: I1009 14:00:48.211761 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rjqxq\" (UniqueName: \"kubernetes.io/projected/a526cd44-35b6-4800-bb53-fc7e1e6d96f8-kube-api-access-rjqxq\") pod \"nmstate-metrics-fdff9cb8d-t5rg7\" (UID: 
\"a526cd44-35b6-4800-bb53-fc7e1e6d96f8\") " pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-t5rg7" Oct 09 14:00:48 crc kubenswrapper[4902]: I1009 14:00:48.243446 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m66xr\" (UniqueName: \"kubernetes.io/projected/b921f094-bf55-4b3e-8dd1-5f1d34a1336e-kube-api-access-m66xr\") pod \"nmstate-console-plugin-6b874cbd85-whhgs\" (UID: \"b921f094-bf55-4b3e-8dd1-5f1d34a1336e\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-whhgs" Oct 09 14:00:48 crc kubenswrapper[4902]: I1009 14:00:48.243525 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/b921f094-bf55-4b3e-8dd1-5f1d34a1336e-plugin-serving-cert\") pod \"nmstate-console-plugin-6b874cbd85-whhgs\" (UID: \"b921f094-bf55-4b3e-8dd1-5f1d34a1336e\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-whhgs" Oct 09 14:00:48 crc kubenswrapper[4902]: I1009 14:00:48.243625 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/b921f094-bf55-4b3e-8dd1-5f1d34a1336e-nginx-conf\") pod \"nmstate-console-plugin-6b874cbd85-whhgs\" (UID: \"b921f094-bf55-4b3e-8dd1-5f1d34a1336e\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-whhgs" Oct 09 14:00:48 crc kubenswrapper[4902]: I1009 14:00:48.293170 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-t5rg7" Oct 09 14:00:48 crc kubenswrapper[4902]: I1009 14:00:48.334434 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-jddjt" Oct 09 14:00:48 crc kubenswrapper[4902]: I1009 14:00:48.341449 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-744679cbdb-vxd42"] Oct 09 14:00:48 crc kubenswrapper[4902]: I1009 14:00:48.342399 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-744679cbdb-vxd42" Oct 09 14:00:48 crc kubenswrapper[4902]: I1009 14:00:48.344876 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m66xr\" (UniqueName: \"kubernetes.io/projected/b921f094-bf55-4b3e-8dd1-5f1d34a1336e-kube-api-access-m66xr\") pod \"nmstate-console-plugin-6b874cbd85-whhgs\" (UID: \"b921f094-bf55-4b3e-8dd1-5f1d34a1336e\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-whhgs" Oct 09 14:00:48 crc kubenswrapper[4902]: I1009 14:00:48.344941 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/b921f094-bf55-4b3e-8dd1-5f1d34a1336e-plugin-serving-cert\") pod \"nmstate-console-plugin-6b874cbd85-whhgs\" (UID: \"b921f094-bf55-4b3e-8dd1-5f1d34a1336e\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-whhgs" Oct 09 14:00:48 crc kubenswrapper[4902]: I1009 14:00:48.345003 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/b921f094-bf55-4b3e-8dd1-5f1d34a1336e-nginx-conf\") pod \"nmstate-console-plugin-6b874cbd85-whhgs\" (UID: \"b921f094-bf55-4b3e-8dd1-5f1d34a1336e\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-whhgs" Oct 09 14:00:48 crc kubenswrapper[4902]: E1009 14:00:48.345242 4902 secret.go:188] Couldn't get secret openshift-nmstate/plugin-serving-cert: secret "plugin-serving-cert" not found Oct 09 14:00:48 crc kubenswrapper[4902]: E1009 14:00:48.345315 4902 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b921f094-bf55-4b3e-8dd1-5f1d34a1336e-plugin-serving-cert podName:b921f094-bf55-4b3e-8dd1-5f1d34a1336e nodeName:}" failed. No retries permitted until 2025-10-09 14:00:48.8452937 +0000 UTC m=+596.043152774 (durationBeforeRetry 500ms). 
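Both MountVolume.SetUp failures above (secret "openshift-nmstate-webhook" and secret "plugin-serving-cert" not found) are retried by the kubelet after the logged 500ms backoff, and the SetUp-succeeded entries that follow show the secrets turning up in the meantime. A minimal check for such secrets from outside the node, assuming the kubernetes Python client and a reachable admin kubeconfig (the names and namespace are copied from the entries above):

from kubernetes import config, client
from kubernetes.client.rest import ApiException

config.load_kube_config()          # or config.load_incluster_config()
core = client.CoreV1Api()

# Secrets the kubelet could not find yet when it first tried to mount them.
for name in ("openshift-nmstate-webhook", "plugin-serving-cert"):
    try:
        core.read_namespaced_secret(name, "openshift-nmstate")
        print(f"secret/{name}: present")
    except ApiException as exc:
        print(f"secret/{name}: {exc.status} {exc.reason}")

A 404 here right after the operator is installed is usually transient, which matches this journal: tls-key-pair and plugin-serving-cert both mount successfully a few hundred milliseconds after the failed attempts.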
Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/b921f094-bf55-4b3e-8dd1-5f1d34a1336e-plugin-serving-cert") pod "nmstate-console-plugin-6b874cbd85-whhgs" (UID: "b921f094-bf55-4b3e-8dd1-5f1d34a1336e") : secret "plugin-serving-cert" not found Oct 09 14:00:48 crc kubenswrapper[4902]: I1009 14:00:48.346062 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/b921f094-bf55-4b3e-8dd1-5f1d34a1336e-nginx-conf\") pod \"nmstate-console-plugin-6b874cbd85-whhgs\" (UID: \"b921f094-bf55-4b3e-8dd1-5f1d34a1336e\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-whhgs" Oct 09 14:00:48 crc kubenswrapper[4902]: I1009 14:00:48.361006 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-744679cbdb-vxd42"] Oct 09 14:00:48 crc kubenswrapper[4902]: I1009 14:00:48.372946 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m66xr\" (UniqueName: \"kubernetes.io/projected/b921f094-bf55-4b3e-8dd1-5f1d34a1336e-kube-api-access-m66xr\") pod \"nmstate-console-plugin-6b874cbd85-whhgs\" (UID: \"b921f094-bf55-4b3e-8dd1-5f1d34a1336e\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-whhgs" Oct 09 14:00:48 crc kubenswrapper[4902]: I1009 14:00:48.446507 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/dcd094ea-1794-48b1-9f26-bac42e70020f-service-ca\") pod \"console-744679cbdb-vxd42\" (UID: \"dcd094ea-1794-48b1-9f26-bac42e70020f\") " pod="openshift-console/console-744679cbdb-vxd42" Oct 09 14:00:48 crc kubenswrapper[4902]: I1009 14:00:48.446561 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dcd094ea-1794-48b1-9f26-bac42e70020f-trusted-ca-bundle\") pod \"console-744679cbdb-vxd42\" (UID: \"dcd094ea-1794-48b1-9f26-bac42e70020f\") " pod="openshift-console/console-744679cbdb-vxd42" Oct 09 14:00:48 crc kubenswrapper[4902]: I1009 14:00:48.446581 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/dcd094ea-1794-48b1-9f26-bac42e70020f-oauth-serving-cert\") pod \"console-744679cbdb-vxd42\" (UID: \"dcd094ea-1794-48b1-9f26-bac42e70020f\") " pod="openshift-console/console-744679cbdb-vxd42" Oct 09 14:00:48 crc kubenswrapper[4902]: I1009 14:00:48.446607 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/dcd094ea-1794-48b1-9f26-bac42e70020f-console-serving-cert\") pod \"console-744679cbdb-vxd42\" (UID: \"dcd094ea-1794-48b1-9f26-bac42e70020f\") " pod="openshift-console/console-744679cbdb-vxd42" Oct 09 14:00:48 crc kubenswrapper[4902]: I1009 14:00:48.446663 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/dcd094ea-1794-48b1-9f26-bac42e70020f-console-config\") pod \"console-744679cbdb-vxd42\" (UID: \"dcd094ea-1794-48b1-9f26-bac42e70020f\") " pod="openshift-console/console-744679cbdb-vxd42" Oct 09 14:00:48 crc kubenswrapper[4902]: I1009 14:00:48.446698 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n7nv5\" (UniqueName: 
\"kubernetes.io/projected/dcd094ea-1794-48b1-9f26-bac42e70020f-kube-api-access-n7nv5\") pod \"console-744679cbdb-vxd42\" (UID: \"dcd094ea-1794-48b1-9f26-bac42e70020f\") " pod="openshift-console/console-744679cbdb-vxd42" Oct 09 14:00:48 crc kubenswrapper[4902]: I1009 14:00:48.447096 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/dcd094ea-1794-48b1-9f26-bac42e70020f-console-oauth-config\") pod \"console-744679cbdb-vxd42\" (UID: \"dcd094ea-1794-48b1-9f26-bac42e70020f\") " pod="openshift-console/console-744679cbdb-vxd42" Oct 09 14:00:48 crc kubenswrapper[4902]: I1009 14:00:48.548923 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/dcd094ea-1794-48b1-9f26-bac42e70020f-service-ca\") pod \"console-744679cbdb-vxd42\" (UID: \"dcd094ea-1794-48b1-9f26-bac42e70020f\") " pod="openshift-console/console-744679cbdb-vxd42" Oct 09 14:00:48 crc kubenswrapper[4902]: I1009 14:00:48.548990 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dcd094ea-1794-48b1-9f26-bac42e70020f-trusted-ca-bundle\") pod \"console-744679cbdb-vxd42\" (UID: \"dcd094ea-1794-48b1-9f26-bac42e70020f\") " pod="openshift-console/console-744679cbdb-vxd42" Oct 09 14:00:48 crc kubenswrapper[4902]: I1009 14:00:48.549022 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/dcd094ea-1794-48b1-9f26-bac42e70020f-oauth-serving-cert\") pod \"console-744679cbdb-vxd42\" (UID: \"dcd094ea-1794-48b1-9f26-bac42e70020f\") " pod="openshift-console/console-744679cbdb-vxd42" Oct 09 14:00:48 crc kubenswrapper[4902]: I1009 14:00:48.549076 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/dcd094ea-1794-48b1-9f26-bac42e70020f-console-serving-cert\") pod \"console-744679cbdb-vxd42\" (UID: \"dcd094ea-1794-48b1-9f26-bac42e70020f\") " pod="openshift-console/console-744679cbdb-vxd42" Oct 09 14:00:48 crc kubenswrapper[4902]: I1009 14:00:48.549133 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/dcd094ea-1794-48b1-9f26-bac42e70020f-console-config\") pod \"console-744679cbdb-vxd42\" (UID: \"dcd094ea-1794-48b1-9f26-bac42e70020f\") " pod="openshift-console/console-744679cbdb-vxd42" Oct 09 14:00:48 crc kubenswrapper[4902]: I1009 14:00:48.549226 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n7nv5\" (UniqueName: \"kubernetes.io/projected/dcd094ea-1794-48b1-9f26-bac42e70020f-kube-api-access-n7nv5\") pod \"console-744679cbdb-vxd42\" (UID: \"dcd094ea-1794-48b1-9f26-bac42e70020f\") " pod="openshift-console/console-744679cbdb-vxd42" Oct 09 14:00:48 crc kubenswrapper[4902]: I1009 14:00:48.549346 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/dcd094ea-1794-48b1-9f26-bac42e70020f-console-oauth-config\") pod \"console-744679cbdb-vxd42\" (UID: \"dcd094ea-1794-48b1-9f26-bac42e70020f\") " pod="openshift-console/console-744679cbdb-vxd42" Oct 09 14:00:48 crc kubenswrapper[4902]: I1009 14:00:48.549929 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" 
(UniqueName: \"kubernetes.io/configmap/dcd094ea-1794-48b1-9f26-bac42e70020f-service-ca\") pod \"console-744679cbdb-vxd42\" (UID: \"dcd094ea-1794-48b1-9f26-bac42e70020f\") " pod="openshift-console/console-744679cbdb-vxd42" Oct 09 14:00:48 crc kubenswrapper[4902]: I1009 14:00:48.550188 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dcd094ea-1794-48b1-9f26-bac42e70020f-trusted-ca-bundle\") pod \"console-744679cbdb-vxd42\" (UID: \"dcd094ea-1794-48b1-9f26-bac42e70020f\") " pod="openshift-console/console-744679cbdb-vxd42" Oct 09 14:00:48 crc kubenswrapper[4902]: I1009 14:00:48.550351 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/dcd094ea-1794-48b1-9f26-bac42e70020f-oauth-serving-cert\") pod \"console-744679cbdb-vxd42\" (UID: \"dcd094ea-1794-48b1-9f26-bac42e70020f\") " pod="openshift-console/console-744679cbdb-vxd42" Oct 09 14:00:48 crc kubenswrapper[4902]: I1009 14:00:48.551086 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/dcd094ea-1794-48b1-9f26-bac42e70020f-console-config\") pod \"console-744679cbdb-vxd42\" (UID: \"dcd094ea-1794-48b1-9f26-bac42e70020f\") " pod="openshift-console/console-744679cbdb-vxd42" Oct 09 14:00:48 crc kubenswrapper[4902]: I1009 14:00:48.554615 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/dcd094ea-1794-48b1-9f26-bac42e70020f-console-oauth-config\") pod \"console-744679cbdb-vxd42\" (UID: \"dcd094ea-1794-48b1-9f26-bac42e70020f\") " pod="openshift-console/console-744679cbdb-vxd42" Oct 09 14:00:48 crc kubenswrapper[4902]: I1009 14:00:48.554615 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/dcd094ea-1794-48b1-9f26-bac42e70020f-console-serving-cert\") pod \"console-744679cbdb-vxd42\" (UID: \"dcd094ea-1794-48b1-9f26-bac42e70020f\") " pod="openshift-console/console-744679cbdb-vxd42" Oct 09 14:00:48 crc kubenswrapper[4902]: I1009 14:00:48.576680 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n7nv5\" (UniqueName: \"kubernetes.io/projected/dcd094ea-1794-48b1-9f26-bac42e70020f-kube-api-access-n7nv5\") pod \"console-744679cbdb-vxd42\" (UID: \"dcd094ea-1794-48b1-9f26-bac42e70020f\") " pod="openshift-console/console-744679cbdb-vxd42" Oct 09 14:00:48 crc kubenswrapper[4902]: I1009 14:00:48.582737 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-fdff9cb8d-t5rg7"] Oct 09 14:00:48 crc kubenswrapper[4902]: I1009 14:00:48.650317 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/47ba7105-e136-4d4e-8db2-5bb2edfb5a7b-tls-key-pair\") pod \"nmstate-webhook-6cdbc54649-45tq4\" (UID: \"47ba7105-e136-4d4e-8db2-5bb2edfb5a7b\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-45tq4" Oct 09 14:00:48 crc kubenswrapper[4902]: I1009 14:00:48.654067 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/47ba7105-e136-4d4e-8db2-5bb2edfb5a7b-tls-key-pair\") pod \"nmstate-webhook-6cdbc54649-45tq4\" (UID: \"47ba7105-e136-4d4e-8db2-5bb2edfb5a7b\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-45tq4" Oct 09 14:00:48 crc 
kubenswrapper[4902]: I1009 14:00:48.665812 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-744679cbdb-vxd42" Oct 09 14:00:48 crc kubenswrapper[4902]: I1009 14:00:48.854277 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/b921f094-bf55-4b3e-8dd1-5f1d34a1336e-plugin-serving-cert\") pod \"nmstate-console-plugin-6b874cbd85-whhgs\" (UID: \"b921f094-bf55-4b3e-8dd1-5f1d34a1336e\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-whhgs" Oct 09 14:00:48 crc kubenswrapper[4902]: I1009 14:00:48.857336 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-744679cbdb-vxd42"] Oct 09 14:00:48 crc kubenswrapper[4902]: I1009 14:00:48.858519 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/b921f094-bf55-4b3e-8dd1-5f1d34a1336e-plugin-serving-cert\") pod \"nmstate-console-plugin-6b874cbd85-whhgs\" (UID: \"b921f094-bf55-4b3e-8dd1-5f1d34a1336e\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-whhgs" Oct 09 14:00:48 crc kubenswrapper[4902]: W1009 14:00:48.861835 4902 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddcd094ea_1794_48b1_9f26_bac42e70020f.slice/crio-dfc1771cee17f39f94562a9abfb69eefb506a23311691862331634b15cff0ebe WatchSource:0}: Error finding container dfc1771cee17f39f94562a9abfb69eefb506a23311691862331634b15cff0ebe: Status 404 returned error can't find the container with id dfc1771cee17f39f94562a9abfb69eefb506a23311691862331634b15cff0ebe Oct 09 14:00:48 crc kubenswrapper[4902]: I1009 14:00:48.915979 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-45tq4" Oct 09 14:00:49 crc kubenswrapper[4902]: I1009 14:00:49.079745 4902 util.go:30] "No sandbox for pod can be found. 
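The manager.go "Failed to process watch event ... Status 404 ... can't find the container" warnings above appear to fire when the cgroup for a brand-new container is noticed before the runtime has registered the container; in this journal each warned-about ID shows up in a ContainerStarted PLEG event within roughly a second, so they read as a startup race rather than lost containers. A rough cross-check one could run over the raw journal (hypothetical helper, not a kubelet tool; the regexes only target the exact message shapes seen here, fed for example from journalctl -u kubelet):

import re
import sys

WARN  = re.compile(r"can't find the container with id ([0-9a-f]{64})")
START = re.compile(r'"ContainerStarted","Data":"([0-9a-f]{64})"')

warned, started = set(), set()
for line in sys.stdin:
    if (m := WARN.search(line)):
        warned.add(m.group(1))
    if (m := START.search(line)):
        started.add(m.group(1))

missing = warned - started
print("warned but never started:", ", ".join(sorted(missing)) or "none")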
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-whhgs" Oct 09 14:00:49 crc kubenswrapper[4902]: I1009 14:00:49.086100 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-744679cbdb-vxd42" event={"ID":"dcd094ea-1794-48b1-9f26-bac42e70020f","Type":"ContainerStarted","Data":"96f6d4eed1075f3997c2675c1419f08680a0d3f0dad024ea7222957ea9e976de"} Oct 09 14:00:49 crc kubenswrapper[4902]: I1009 14:00:49.086148 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-744679cbdb-vxd42" event={"ID":"dcd094ea-1794-48b1-9f26-bac42e70020f","Type":"ContainerStarted","Data":"dfc1771cee17f39f94562a9abfb69eefb506a23311691862331634b15cff0ebe"} Oct 09 14:00:49 crc kubenswrapper[4902]: I1009 14:00:49.094296 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-jddjt" event={"ID":"fcca1450-5178-488f-8ba6-b290ea61a2fb","Type":"ContainerStarted","Data":"e518442587605504c8aa26ea4b15e5efaef9ac9c2159110370a9d58e480db7be"} Oct 09 14:00:49 crc kubenswrapper[4902]: I1009 14:00:49.095555 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-t5rg7" event={"ID":"a526cd44-35b6-4800-bb53-fc7e1e6d96f8","Type":"ContainerStarted","Data":"7585c36bfb2e59644a087d872a76c3162400f40c6f610f95ee1ff38fefdcadb0"} Oct 09 14:00:49 crc kubenswrapper[4902]: I1009 14:00:49.110160 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-744679cbdb-vxd42" podStartSLOduration=1.110099298 podStartE2EDuration="1.110099298s" podCreationTimestamp="2025-10-09 14:00:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 14:00:49.104536518 +0000 UTC m=+596.302395602" watchObservedRunningTime="2025-10-09 14:00:49.110099298 +0000 UTC m=+596.307958362" Oct 09 14:00:49 crc kubenswrapper[4902]: I1009 14:00:49.124145 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-6cdbc54649-45tq4"] Oct 09 14:00:49 crc kubenswrapper[4902]: W1009 14:00:49.127781 4902 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod47ba7105_e136_4d4e_8db2_5bb2edfb5a7b.slice/crio-1525616843f7948829d3741ef8ca6d66f699cec1d7ec90bedfa6e50669172136 WatchSource:0}: Error finding container 1525616843f7948829d3741ef8ca6d66f699cec1d7ec90bedfa6e50669172136: Status 404 returned error can't find the container with id 1525616843f7948829d3741ef8ca6d66f699cec1d7ec90bedfa6e50669172136 Oct 09 14:00:49 crc kubenswrapper[4902]: I1009 14:00:49.288128 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-6b874cbd85-whhgs"] Oct 09 14:00:49 crc kubenswrapper[4902]: W1009 14:00:49.313877 4902 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb921f094_bf55_4b3e_8dd1_5f1d34a1336e.slice/crio-5ef78cd83f0e200a69ff450d7c4ea26f2f7b0acccea713f7d87ac7439c23d706 WatchSource:0}: Error finding container 5ef78cd83f0e200a69ff450d7c4ea26f2f7b0acccea713f7d87ac7439c23d706: Status 404 returned error can't find the container with id 5ef78cd83f0e200a69ff450d7c4ea26f2f7b0acccea713f7d87ac7439c23d706 Oct 09 14:00:50 crc kubenswrapper[4902]: I1009 14:00:50.100983 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-whhgs" event={"ID":"b921f094-bf55-4b3e-8dd1-5f1d34a1336e","Type":"ContainerStarted","Data":"5ef78cd83f0e200a69ff450d7c4ea26f2f7b0acccea713f7d87ac7439c23d706"} Oct 09 14:00:50 crc kubenswrapper[4902]: I1009 14:00:50.101824 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-45tq4" event={"ID":"47ba7105-e136-4d4e-8db2-5bb2edfb5a7b","Type":"ContainerStarted","Data":"1525616843f7948829d3741ef8ca6d66f699cec1d7ec90bedfa6e50669172136"} Oct 09 14:00:52 crc kubenswrapper[4902]: I1009 14:00:52.114772 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-t5rg7" event={"ID":"a526cd44-35b6-4800-bb53-fc7e1e6d96f8","Type":"ContainerStarted","Data":"601ef281b932f403280a3d4a3ec6ac9a8fcef92869764f235f3ab4d83338b54f"} Oct 09 14:00:52 crc kubenswrapper[4902]: I1009 14:00:52.117623 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-45tq4" event={"ID":"47ba7105-e136-4d4e-8db2-5bb2edfb5a7b","Type":"ContainerStarted","Data":"90bcb9e4eb7c71a8d6d4648c3391ecef1a007a6005b3a72f778ffb8ea0a044e7"} Oct 09 14:00:52 crc kubenswrapper[4902]: I1009 14:00:52.117815 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-45tq4" Oct 09 14:00:52 crc kubenswrapper[4902]: I1009 14:00:52.140724 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-45tq4" podStartSLOduration=3.047611561 podStartE2EDuration="5.14069539s" podCreationTimestamp="2025-10-09 14:00:47 +0000 UTC" firstStartedPulling="2025-10-09 14:00:49.130500527 +0000 UTC m=+596.328359591" lastFinishedPulling="2025-10-09 14:00:51.223584356 +0000 UTC m=+598.421443420" observedRunningTime="2025-10-09 14:00:52.140240227 +0000 UTC m=+599.338099291" watchObservedRunningTime="2025-10-09 14:00:52.14069539 +0000 UTC m=+599.338554464" Oct 09 14:00:53 crc kubenswrapper[4902]: I1009 14:00:53.127121 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-whhgs" event={"ID":"b921f094-bf55-4b3e-8dd1-5f1d34a1336e","Type":"ContainerStarted","Data":"855d1d639aca0795bcbed44dc1664f7d504400224390da6fff67dd1c3f9fa25f"} Oct 09 14:00:53 crc kubenswrapper[4902]: I1009 14:00:53.158702 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-whhgs" podStartSLOduration=2.13348868 podStartE2EDuration="5.158484119s" podCreationTimestamp="2025-10-09 14:00:48 +0000 UTC" firstStartedPulling="2025-10-09 14:00:49.316799153 +0000 UTC m=+596.514658227" lastFinishedPulling="2025-10-09 14:00:52.341794602 +0000 UTC m=+599.539653666" observedRunningTime="2025-10-09 14:00:53.144361541 +0000 UTC m=+600.342220615" watchObservedRunningTime="2025-10-09 14:00:53.158484119 +0000 UTC m=+600.356343193" Oct 09 14:00:54 crc kubenswrapper[4902]: I1009 14:00:54.135940 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-t5rg7" event={"ID":"a526cd44-35b6-4800-bb53-fc7e1e6d96f8","Type":"ContainerStarted","Data":"0266d8cd03053bbdf9578136fd95463189f8bc541fa1185c1dad20e4b145e15c"} Oct 09 14:00:54 crc kubenswrapper[4902]: I1009 14:00:54.137477 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-jddjt" 
event={"ID":"fcca1450-5178-488f-8ba6-b290ea61a2fb","Type":"ContainerStarted","Data":"b42ca2de115c54b0d17f550de541a047364ec8322b287e051ca2833439257d49"} Oct 09 14:00:54 crc kubenswrapper[4902]: I1009 14:00:54.153344 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-t5rg7" podStartSLOduration=1.897830253 podStartE2EDuration="7.153319426s" podCreationTimestamp="2025-10-09 14:00:47 +0000 UTC" firstStartedPulling="2025-10-09 14:00:48.593152371 +0000 UTC m=+595.791011435" lastFinishedPulling="2025-10-09 14:00:53.848641544 +0000 UTC m=+601.046500608" observedRunningTime="2025-10-09 14:00:54.152509073 +0000 UTC m=+601.350368167" watchObservedRunningTime="2025-10-09 14:00:54.153319426 +0000 UTC m=+601.351178490" Oct 09 14:00:54 crc kubenswrapper[4902]: I1009 14:00:54.188786 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-jddjt" podStartSLOduration=2.581395829 podStartE2EDuration="7.188763949s" podCreationTimestamp="2025-10-09 14:00:47 +0000 UTC" firstStartedPulling="2025-10-09 14:00:48.395005684 +0000 UTC m=+595.592864748" lastFinishedPulling="2025-10-09 14:00:53.002373804 +0000 UTC m=+600.200232868" observedRunningTime="2025-10-09 14:00:54.183010413 +0000 UTC m=+601.380869487" watchObservedRunningTime="2025-10-09 14:00:54.188763949 +0000 UTC m=+601.386623023" Oct 09 14:00:55 crc kubenswrapper[4902]: I1009 14:00:55.144175 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-jddjt" Oct 09 14:00:58 crc kubenswrapper[4902]: I1009 14:00:58.358720 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-jddjt" Oct 09 14:00:58 crc kubenswrapper[4902]: I1009 14:00:58.666649 4902 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-744679cbdb-vxd42" Oct 09 14:00:58 crc kubenswrapper[4902]: I1009 14:00:58.666709 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-744679cbdb-vxd42" Oct 09 14:00:58 crc kubenswrapper[4902]: I1009 14:00:58.673133 4902 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-744679cbdb-vxd42" Oct 09 14:00:59 crc kubenswrapper[4902]: I1009 14:00:59.170781 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-744679cbdb-vxd42" Oct 09 14:00:59 crc kubenswrapper[4902]: I1009 14:00:59.220560 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-d5zks"] Oct 09 14:01:08 crc kubenswrapper[4902]: I1009 14:01:08.921337 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-45tq4" Oct 09 14:01:21 crc kubenswrapper[4902]: I1009 14:01:21.162457 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2qdm9l"] Oct 09 14:01:21 crc kubenswrapper[4902]: I1009 14:01:21.164255 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2qdm9l" Oct 09 14:01:21 crc kubenswrapper[4902]: I1009 14:01:21.166059 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Oct 09 14:01:21 crc kubenswrapper[4902]: I1009 14:01:21.207726 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2qdm9l"] Oct 09 14:01:21 crc kubenswrapper[4902]: I1009 14:01:21.316952 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/13b857de-39a5-412b-b9a6-bd26a961d189-util\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2qdm9l\" (UID: \"13b857de-39a5-412b-b9a6-bd26a961d189\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2qdm9l" Oct 09 14:01:21 crc kubenswrapper[4902]: I1009 14:01:21.317016 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sg99c\" (UniqueName: \"kubernetes.io/projected/13b857de-39a5-412b-b9a6-bd26a961d189-kube-api-access-sg99c\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2qdm9l\" (UID: \"13b857de-39a5-412b-b9a6-bd26a961d189\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2qdm9l" Oct 09 14:01:21 crc kubenswrapper[4902]: I1009 14:01:21.317192 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/13b857de-39a5-412b-b9a6-bd26a961d189-bundle\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2qdm9l\" (UID: \"13b857de-39a5-412b-b9a6-bd26a961d189\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2qdm9l" Oct 09 14:01:21 crc kubenswrapper[4902]: I1009 14:01:21.418715 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/13b857de-39a5-412b-b9a6-bd26a961d189-util\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2qdm9l\" (UID: \"13b857de-39a5-412b-b9a6-bd26a961d189\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2qdm9l" Oct 09 14:01:21 crc kubenswrapper[4902]: I1009 14:01:21.418807 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sg99c\" (UniqueName: \"kubernetes.io/projected/13b857de-39a5-412b-b9a6-bd26a961d189-kube-api-access-sg99c\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2qdm9l\" (UID: \"13b857de-39a5-412b-b9a6-bd26a961d189\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2qdm9l" Oct 09 14:01:21 crc kubenswrapper[4902]: I1009 14:01:21.418866 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/13b857de-39a5-412b-b9a6-bd26a961d189-bundle\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2qdm9l\" (UID: \"13b857de-39a5-412b-b9a6-bd26a961d189\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2qdm9l" Oct 09 14:01:21 crc kubenswrapper[4902]: I1009 14:01:21.419472 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/13b857de-39a5-412b-b9a6-bd26a961d189-bundle\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2qdm9l\" (UID: \"13b857de-39a5-412b-b9a6-bd26a961d189\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2qdm9l" Oct 09 14:01:21 crc kubenswrapper[4902]: I1009 14:01:21.419486 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/13b857de-39a5-412b-b9a6-bd26a961d189-util\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2qdm9l\" (UID: \"13b857de-39a5-412b-b9a6-bd26a961d189\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2qdm9l" Oct 09 14:01:21 crc kubenswrapper[4902]: I1009 14:01:21.439228 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sg99c\" (UniqueName: \"kubernetes.io/projected/13b857de-39a5-412b-b9a6-bd26a961d189-kube-api-access-sg99c\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2qdm9l\" (UID: \"13b857de-39a5-412b-b9a6-bd26a961d189\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2qdm9l" Oct 09 14:01:21 crc kubenswrapper[4902]: I1009 14:01:21.479908 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2qdm9l" Oct 09 14:01:21 crc kubenswrapper[4902]: I1009 14:01:21.666709 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2qdm9l"] Oct 09 14:01:22 crc kubenswrapper[4902]: I1009 14:01:22.302087 4902 generic.go:334] "Generic (PLEG): container finished" podID="13b857de-39a5-412b-b9a6-bd26a961d189" containerID="23d4ececf22cdd67ef26134eb8052d0d07da6097229b7f1dcb1c2a9f21200f27" exitCode=0 Oct 09 14:01:22 crc kubenswrapper[4902]: I1009 14:01:22.302180 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2qdm9l" event={"ID":"13b857de-39a5-412b-b9a6-bd26a961d189","Type":"ContainerDied","Data":"23d4ececf22cdd67ef26134eb8052d0d07da6097229b7f1dcb1c2a9f21200f27"} Oct 09 14:01:22 crc kubenswrapper[4902]: I1009 14:01:22.304146 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2qdm9l" event={"ID":"13b857de-39a5-412b-b9a6-bd26a961d189","Type":"ContainerStarted","Data":"8653d47a69671b521be5d980ca78b56e0ae920fc5e09360806df68e71df4dba0"} Oct 09 14:01:24 crc kubenswrapper[4902]: I1009 14:01:24.263716 4902 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-d5zks" podUID="51ad1076-0ca9-4765-bd88-98f4cba434b6" containerName="console" containerID="cri-o://f7b2863887f2c23890735d64819810c8b350e374dedc10c16a304713d0fd8e0a" gracePeriod=15 Oct 09 14:01:24 crc kubenswrapper[4902]: I1009 14:01:24.651892 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-d5zks_51ad1076-0ca9-4765-bd88-98f4cba434b6/console/0.log" Oct 09 14:01:24 crc kubenswrapper[4902]: I1009 14:01:24.652243 4902 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-d5zks" Oct 09 14:01:24 crc kubenswrapper[4902]: I1009 14:01:24.771183 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2pmxj\" (UniqueName: \"kubernetes.io/projected/51ad1076-0ca9-4765-bd88-98f4cba434b6-kube-api-access-2pmxj\") pod \"51ad1076-0ca9-4765-bd88-98f4cba434b6\" (UID: \"51ad1076-0ca9-4765-bd88-98f4cba434b6\") " Oct 09 14:01:24 crc kubenswrapper[4902]: I1009 14:01:24.771292 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/51ad1076-0ca9-4765-bd88-98f4cba434b6-oauth-serving-cert\") pod \"51ad1076-0ca9-4765-bd88-98f4cba434b6\" (UID: \"51ad1076-0ca9-4765-bd88-98f4cba434b6\") " Oct 09 14:01:24 crc kubenswrapper[4902]: I1009 14:01:24.771315 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/51ad1076-0ca9-4765-bd88-98f4cba434b6-trusted-ca-bundle\") pod \"51ad1076-0ca9-4765-bd88-98f4cba434b6\" (UID: \"51ad1076-0ca9-4765-bd88-98f4cba434b6\") " Oct 09 14:01:24 crc kubenswrapper[4902]: I1009 14:01:24.771348 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/51ad1076-0ca9-4765-bd88-98f4cba434b6-console-oauth-config\") pod \"51ad1076-0ca9-4765-bd88-98f4cba434b6\" (UID: \"51ad1076-0ca9-4765-bd88-98f4cba434b6\") " Oct 09 14:01:24 crc kubenswrapper[4902]: I1009 14:01:24.771373 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/51ad1076-0ca9-4765-bd88-98f4cba434b6-console-serving-cert\") pod \"51ad1076-0ca9-4765-bd88-98f4cba434b6\" (UID: \"51ad1076-0ca9-4765-bd88-98f4cba434b6\") " Oct 09 14:01:24 crc kubenswrapper[4902]: I1009 14:01:24.771394 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/51ad1076-0ca9-4765-bd88-98f4cba434b6-service-ca\") pod \"51ad1076-0ca9-4765-bd88-98f4cba434b6\" (UID: \"51ad1076-0ca9-4765-bd88-98f4cba434b6\") " Oct 09 14:01:24 crc kubenswrapper[4902]: I1009 14:01:24.771625 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/51ad1076-0ca9-4765-bd88-98f4cba434b6-console-config\") pod \"51ad1076-0ca9-4765-bd88-98f4cba434b6\" (UID: \"51ad1076-0ca9-4765-bd88-98f4cba434b6\") " Oct 09 14:01:24 crc kubenswrapper[4902]: I1009 14:01:24.772471 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/51ad1076-0ca9-4765-bd88-98f4cba434b6-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "51ad1076-0ca9-4765-bd88-98f4cba434b6" (UID: "51ad1076-0ca9-4765-bd88-98f4cba434b6"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 14:01:24 crc kubenswrapper[4902]: I1009 14:01:24.772482 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/51ad1076-0ca9-4765-bd88-98f4cba434b6-service-ca" (OuterVolumeSpecName: "service-ca") pod "51ad1076-0ca9-4765-bd88-98f4cba434b6" (UID: "51ad1076-0ca9-4765-bd88-98f4cba434b6"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 14:01:24 crc kubenswrapper[4902]: I1009 14:01:24.772574 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/51ad1076-0ca9-4765-bd88-98f4cba434b6-console-config" (OuterVolumeSpecName: "console-config") pod "51ad1076-0ca9-4765-bd88-98f4cba434b6" (UID: "51ad1076-0ca9-4765-bd88-98f4cba434b6"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 14:01:24 crc kubenswrapper[4902]: I1009 14:01:24.773294 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/51ad1076-0ca9-4765-bd88-98f4cba434b6-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "51ad1076-0ca9-4765-bd88-98f4cba434b6" (UID: "51ad1076-0ca9-4765-bd88-98f4cba434b6"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 14:01:24 crc kubenswrapper[4902]: I1009 14:01:24.773456 4902 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/51ad1076-0ca9-4765-bd88-98f4cba434b6-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 09 14:01:24 crc kubenswrapper[4902]: I1009 14:01:24.773478 4902 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/51ad1076-0ca9-4765-bd88-98f4cba434b6-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 14:01:24 crc kubenswrapper[4902]: I1009 14:01:24.773490 4902 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/51ad1076-0ca9-4765-bd88-98f4cba434b6-service-ca\") on node \"crc\" DevicePath \"\"" Oct 09 14:01:24 crc kubenswrapper[4902]: I1009 14:01:24.773504 4902 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/51ad1076-0ca9-4765-bd88-98f4cba434b6-console-config\") on node \"crc\" DevicePath \"\"" Oct 09 14:01:24 crc kubenswrapper[4902]: I1009 14:01:24.777969 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51ad1076-0ca9-4765-bd88-98f4cba434b6-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "51ad1076-0ca9-4765-bd88-98f4cba434b6" (UID: "51ad1076-0ca9-4765-bd88-98f4cba434b6"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:01:24 crc kubenswrapper[4902]: I1009 14:01:24.778369 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51ad1076-0ca9-4765-bd88-98f4cba434b6-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "51ad1076-0ca9-4765-bd88-98f4cba434b6" (UID: "51ad1076-0ca9-4765-bd88-98f4cba434b6"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:01:24 crc kubenswrapper[4902]: I1009 14:01:24.778381 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51ad1076-0ca9-4765-bd88-98f4cba434b6-kube-api-access-2pmxj" (OuterVolumeSpecName: "kube-api-access-2pmxj") pod "51ad1076-0ca9-4765-bd88-98f4cba434b6" (UID: "51ad1076-0ca9-4765-bd88-98f4cba434b6"). InnerVolumeSpecName "kube-api-access-2pmxj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 14:01:24 crc kubenswrapper[4902]: I1009 14:01:24.874390 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2pmxj\" (UniqueName: \"kubernetes.io/projected/51ad1076-0ca9-4765-bd88-98f4cba434b6-kube-api-access-2pmxj\") on node \"crc\" DevicePath \"\"" Oct 09 14:01:24 crc kubenswrapper[4902]: I1009 14:01:24.874475 4902 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/51ad1076-0ca9-4765-bd88-98f4cba434b6-console-oauth-config\") on node \"crc\" DevicePath \"\"" Oct 09 14:01:24 crc kubenswrapper[4902]: I1009 14:01:24.874486 4902 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/51ad1076-0ca9-4765-bd88-98f4cba434b6-console-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 09 14:01:25 crc kubenswrapper[4902]: I1009 14:01:25.321445 4902 generic.go:334] "Generic (PLEG): container finished" podID="13b857de-39a5-412b-b9a6-bd26a961d189" containerID="2a4925894dcdaa6e535236193e84ca8ac015c106a4bd626639b8fefdedf6ff2b" exitCode=0 Oct 09 14:01:25 crc kubenswrapper[4902]: I1009 14:01:25.321541 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2qdm9l" event={"ID":"13b857de-39a5-412b-b9a6-bd26a961d189","Type":"ContainerDied","Data":"2a4925894dcdaa6e535236193e84ca8ac015c106a4bd626639b8fefdedf6ff2b"} Oct 09 14:01:25 crc kubenswrapper[4902]: I1009 14:01:25.331240 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-d5zks_51ad1076-0ca9-4765-bd88-98f4cba434b6/console/0.log" Oct 09 14:01:25 crc kubenswrapper[4902]: I1009 14:01:25.331301 4902 generic.go:334] "Generic (PLEG): container finished" podID="51ad1076-0ca9-4765-bd88-98f4cba434b6" containerID="f7b2863887f2c23890735d64819810c8b350e374dedc10c16a304713d0fd8e0a" exitCode=2 Oct 09 14:01:25 crc kubenswrapper[4902]: I1009 14:01:25.331364 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-d5zks" event={"ID":"51ad1076-0ca9-4765-bd88-98f4cba434b6","Type":"ContainerDied","Data":"f7b2863887f2c23890735d64819810c8b350e374dedc10c16a304713d0fd8e0a"} Oct 09 14:01:25 crc kubenswrapper[4902]: I1009 14:01:25.331391 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-d5zks" event={"ID":"51ad1076-0ca9-4765-bd88-98f4cba434b6","Type":"ContainerDied","Data":"ecd988aa7f75df92b186274f83a5f514320bfb0e65e232fd39ab3c7a139e5d48"} Oct 09 14:01:25 crc kubenswrapper[4902]: I1009 14:01:25.331488 4902 scope.go:117] "RemoveContainer" containerID="f7b2863887f2c23890735d64819810c8b350e374dedc10c16a304713d0fd8e0a" Oct 09 14:01:25 crc kubenswrapper[4902]: I1009 14:01:25.332536 4902 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-d5zks" Oct 09 14:01:25 crc kubenswrapper[4902]: I1009 14:01:25.360044 4902 scope.go:117] "RemoveContainer" containerID="f7b2863887f2c23890735d64819810c8b350e374dedc10c16a304713d0fd8e0a" Oct 09 14:01:25 crc kubenswrapper[4902]: E1009 14:01:25.360477 4902 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f7b2863887f2c23890735d64819810c8b350e374dedc10c16a304713d0fd8e0a\": container with ID starting with f7b2863887f2c23890735d64819810c8b350e374dedc10c16a304713d0fd8e0a not found: ID does not exist" containerID="f7b2863887f2c23890735d64819810c8b350e374dedc10c16a304713d0fd8e0a" Oct 09 14:01:25 crc kubenswrapper[4902]: I1009 14:01:25.360532 4902 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f7b2863887f2c23890735d64819810c8b350e374dedc10c16a304713d0fd8e0a"} err="failed to get container status \"f7b2863887f2c23890735d64819810c8b350e374dedc10c16a304713d0fd8e0a\": rpc error: code = NotFound desc = could not find container \"f7b2863887f2c23890735d64819810c8b350e374dedc10c16a304713d0fd8e0a\": container with ID starting with f7b2863887f2c23890735d64819810c8b350e374dedc10c16a304713d0fd8e0a not found: ID does not exist" Oct 09 14:01:25 crc kubenswrapper[4902]: I1009 14:01:25.368852 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-d5zks"] Oct 09 14:01:25 crc kubenswrapper[4902]: I1009 14:01:25.371661 4902 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-d5zks"] Oct 09 14:01:25 crc kubenswrapper[4902]: I1009 14:01:25.520310 4902 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="51ad1076-0ca9-4765-bd88-98f4cba434b6" path="/var/lib/kubelet/pods/51ad1076-0ca9-4765-bd88-98f4cba434b6/volumes" Oct 09 14:01:26 crc kubenswrapper[4902]: I1009 14:01:26.339679 4902 generic.go:334] "Generic (PLEG): container finished" podID="13b857de-39a5-412b-b9a6-bd26a961d189" containerID="a9dfc3330ea6aff10685d2f098b1a87bbc800e2092c074468c6f8d89471cb262" exitCode=0 Oct 09 14:01:26 crc kubenswrapper[4902]: I1009 14:01:26.339736 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2qdm9l" event={"ID":"13b857de-39a5-412b-b9a6-bd26a961d189","Type":"ContainerDied","Data":"a9dfc3330ea6aff10685d2f098b1a87bbc800e2092c074468c6f8d89471cb262"} Oct 09 14:01:27 crc kubenswrapper[4902]: I1009 14:01:27.643014 4902 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2qdm9l" Oct 09 14:01:27 crc kubenswrapper[4902]: I1009 14:01:27.811450 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sg99c\" (UniqueName: \"kubernetes.io/projected/13b857de-39a5-412b-b9a6-bd26a961d189-kube-api-access-sg99c\") pod \"13b857de-39a5-412b-b9a6-bd26a961d189\" (UID: \"13b857de-39a5-412b-b9a6-bd26a961d189\") " Oct 09 14:01:27 crc kubenswrapper[4902]: I1009 14:01:27.811519 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/13b857de-39a5-412b-b9a6-bd26a961d189-util\") pod \"13b857de-39a5-412b-b9a6-bd26a961d189\" (UID: \"13b857de-39a5-412b-b9a6-bd26a961d189\") " Oct 09 14:01:27 crc kubenswrapper[4902]: I1009 14:01:27.811629 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/13b857de-39a5-412b-b9a6-bd26a961d189-bundle\") pod \"13b857de-39a5-412b-b9a6-bd26a961d189\" (UID: \"13b857de-39a5-412b-b9a6-bd26a961d189\") " Oct 09 14:01:27 crc kubenswrapper[4902]: I1009 14:01:27.813048 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/13b857de-39a5-412b-b9a6-bd26a961d189-bundle" (OuterVolumeSpecName: "bundle") pod "13b857de-39a5-412b-b9a6-bd26a961d189" (UID: "13b857de-39a5-412b-b9a6-bd26a961d189"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 14:01:27 crc kubenswrapper[4902]: I1009 14:01:27.818330 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13b857de-39a5-412b-b9a6-bd26a961d189-kube-api-access-sg99c" (OuterVolumeSpecName: "kube-api-access-sg99c") pod "13b857de-39a5-412b-b9a6-bd26a961d189" (UID: "13b857de-39a5-412b-b9a6-bd26a961d189"). InnerVolumeSpecName "kube-api-access-sg99c". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 14:01:27 crc kubenswrapper[4902]: I1009 14:01:27.823175 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/13b857de-39a5-412b-b9a6-bd26a961d189-util" (OuterVolumeSpecName: "util") pod "13b857de-39a5-412b-b9a6-bd26a961d189" (UID: "13b857de-39a5-412b-b9a6-bd26a961d189"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 14:01:27 crc kubenswrapper[4902]: I1009 14:01:27.913497 4902 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/13b857de-39a5-412b-b9a6-bd26a961d189-util\") on node \"crc\" DevicePath \"\"" Oct 09 14:01:27 crc kubenswrapper[4902]: I1009 14:01:27.913550 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sg99c\" (UniqueName: \"kubernetes.io/projected/13b857de-39a5-412b-b9a6-bd26a961d189-kube-api-access-sg99c\") on node \"crc\" DevicePath \"\"" Oct 09 14:01:27 crc kubenswrapper[4902]: I1009 14:01:27.913564 4902 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/13b857de-39a5-412b-b9a6-bd26a961d189-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 14:01:28 crc kubenswrapper[4902]: I1009 14:01:28.413533 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2qdm9l" event={"ID":"13b857de-39a5-412b-b9a6-bd26a961d189","Type":"ContainerDied","Data":"8653d47a69671b521be5d980ca78b56e0ae920fc5e09360806df68e71df4dba0"} Oct 09 14:01:28 crc kubenswrapper[4902]: I1009 14:01:28.413868 4902 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8653d47a69671b521be5d980ca78b56e0ae920fc5e09360806df68e71df4dba0" Oct 09 14:01:28 crc kubenswrapper[4902]: I1009 14:01:28.413689 4902 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2qdm9l" Oct 09 14:01:36 crc kubenswrapper[4902]: I1009 14:01:36.852030 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-6rzkc"] Oct 09 14:01:36 crc kubenswrapper[4902]: I1009 14:01:36.852563 4902 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-6rzkc" podUID="b43c1099-b997-4be7-8390-a379e0dc5541" containerName="controller-manager" containerID="cri-o://884a23e1af7e9f186b35ecc934e438f38ca8a4bf640c2107bd94a46f6169dd49" gracePeriod=30 Oct 09 14:01:36 crc kubenswrapper[4902]: I1009 14:01:36.950256 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-ttk5q"] Oct 09 14:01:36 crc kubenswrapper[4902]: I1009 14:01:36.950761 4902 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ttk5q" podUID="fd1d9312-7008-48ff-9437-af995ef9b88d" containerName="route-controller-manager" containerID="cri-o://8f7c06a18eb1809ce43b08167a468da978d5ba38c4227208e8096aab8f98a8f7" gracePeriod=30 Oct 09 14:01:37 crc kubenswrapper[4902]: I1009 14:01:37.316713 4902 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-6rzkc" Oct 09 14:01:37 crc kubenswrapper[4902]: I1009 14:01:37.335885 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b43c1099-b997-4be7-8390-a379e0dc5541-serving-cert\") pod \"b43c1099-b997-4be7-8390-a379e0dc5541\" (UID: \"b43c1099-b997-4be7-8390-a379e0dc5541\") " Oct 09 14:01:37 crc kubenswrapper[4902]: I1009 14:01:37.350109 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b43c1099-b997-4be7-8390-a379e0dc5541-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "b43c1099-b997-4be7-8390-a379e0dc5541" (UID: "b43c1099-b997-4be7-8390-a379e0dc5541"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:01:37 crc kubenswrapper[4902]: I1009 14:01:37.393810 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-69f9c58987-qjtns"] Oct 09 14:01:37 crc kubenswrapper[4902]: E1009 14:01:37.394101 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13b857de-39a5-412b-b9a6-bd26a961d189" containerName="util" Oct 09 14:01:37 crc kubenswrapper[4902]: I1009 14:01:37.394122 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="13b857de-39a5-412b-b9a6-bd26a961d189" containerName="util" Oct 09 14:01:37 crc kubenswrapper[4902]: E1009 14:01:37.394136 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13b857de-39a5-412b-b9a6-bd26a961d189" containerName="pull" Oct 09 14:01:37 crc kubenswrapper[4902]: I1009 14:01:37.394144 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="13b857de-39a5-412b-b9a6-bd26a961d189" containerName="pull" Oct 09 14:01:37 crc kubenswrapper[4902]: E1009 14:01:37.394157 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13b857de-39a5-412b-b9a6-bd26a961d189" containerName="extract" Oct 09 14:01:37 crc kubenswrapper[4902]: I1009 14:01:37.394165 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="13b857de-39a5-412b-b9a6-bd26a961d189" containerName="extract" Oct 09 14:01:37 crc kubenswrapper[4902]: E1009 14:01:37.394177 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b43c1099-b997-4be7-8390-a379e0dc5541" containerName="controller-manager" Oct 09 14:01:37 crc kubenswrapper[4902]: I1009 14:01:37.394184 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="b43c1099-b997-4be7-8390-a379e0dc5541" containerName="controller-manager" Oct 09 14:01:37 crc kubenswrapper[4902]: E1009 14:01:37.394195 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51ad1076-0ca9-4765-bd88-98f4cba434b6" containerName="console" Oct 09 14:01:37 crc kubenswrapper[4902]: I1009 14:01:37.394208 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="51ad1076-0ca9-4765-bd88-98f4cba434b6" containerName="console" Oct 09 14:01:37 crc kubenswrapper[4902]: I1009 14:01:37.394339 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="13b857de-39a5-412b-b9a6-bd26a961d189" containerName="extract" Oct 09 14:01:37 crc kubenswrapper[4902]: I1009 14:01:37.394354 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="51ad1076-0ca9-4765-bd88-98f4cba434b6" containerName="console" Oct 09 14:01:37 crc kubenswrapper[4902]: I1009 14:01:37.394368 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="b43c1099-b997-4be7-8390-a379e0dc5541" 
containerName="controller-manager" Oct 09 14:01:37 crc kubenswrapper[4902]: I1009 14:01:37.394868 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-69f9c58987-qjtns" Oct 09 14:01:37 crc kubenswrapper[4902]: I1009 14:01:37.397989 4902 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Oct 09 14:01:37 crc kubenswrapper[4902]: I1009 14:01:37.398059 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Oct 09 14:01:37 crc kubenswrapper[4902]: I1009 14:01:37.398164 4902 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Oct 09 14:01:37 crc kubenswrapper[4902]: I1009 14:01:37.400746 4902 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-55mn9" Oct 09 14:01:37 crc kubenswrapper[4902]: I1009 14:01:37.400915 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Oct 09 14:01:37 crc kubenswrapper[4902]: I1009 14:01:37.410833 4902 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ttk5q" Oct 09 14:01:37 crc kubenswrapper[4902]: I1009 14:01:37.437267 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fd1d9312-7008-48ff-9437-af995ef9b88d-client-ca\") pod \"fd1d9312-7008-48ff-9437-af995ef9b88d\" (UID: \"fd1d9312-7008-48ff-9437-af995ef9b88d\") " Oct 09 14:01:37 crc kubenswrapper[4902]: I1009 14:01:37.437335 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fd1d9312-7008-48ff-9437-af995ef9b88d-serving-cert\") pod \"fd1d9312-7008-48ff-9437-af995ef9b88d\" (UID: \"fd1d9312-7008-48ff-9437-af995ef9b88d\") " Oct 09 14:01:37 crc kubenswrapper[4902]: I1009 14:01:37.437360 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b43c1099-b997-4be7-8390-a379e0dc5541-client-ca\") pod \"b43c1099-b997-4be7-8390-a379e0dc5541\" (UID: \"b43c1099-b997-4be7-8390-a379e0dc5541\") " Oct 09 14:01:37 crc kubenswrapper[4902]: I1009 14:01:37.437392 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dglzr\" (UniqueName: \"kubernetes.io/projected/fd1d9312-7008-48ff-9437-af995ef9b88d-kube-api-access-dglzr\") pod \"fd1d9312-7008-48ff-9437-af995ef9b88d\" (UID: \"fd1d9312-7008-48ff-9437-af995ef9b88d\") " Oct 09 14:01:37 crc kubenswrapper[4902]: I1009 14:01:37.437434 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fd1d9312-7008-48ff-9437-af995ef9b88d-config\") pod \"fd1d9312-7008-48ff-9437-af995ef9b88d\" (UID: \"fd1d9312-7008-48ff-9437-af995ef9b88d\") " Oct 09 14:01:37 crc kubenswrapper[4902]: I1009 14:01:37.437472 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b43c1099-b997-4be7-8390-a379e0dc5541-proxy-ca-bundles\") pod \"b43c1099-b997-4be7-8390-a379e0dc5541\" (UID: \"b43c1099-b997-4be7-8390-a379e0dc5541\") " Oct 09 14:01:37 crc kubenswrapper[4902]: I1009 14:01:37.437501 4902 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7445\" (UniqueName: \"kubernetes.io/projected/b43c1099-b997-4be7-8390-a379e0dc5541-kube-api-access-x7445\") pod \"b43c1099-b997-4be7-8390-a379e0dc5541\" (UID: \"b43c1099-b997-4be7-8390-a379e0dc5541\") " Oct 09 14:01:37 crc kubenswrapper[4902]: I1009 14:01:37.437536 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b43c1099-b997-4be7-8390-a379e0dc5541-config\") pod \"b43c1099-b997-4be7-8390-a379e0dc5541\" (UID: \"b43c1099-b997-4be7-8390-a379e0dc5541\") " Oct 09 14:01:37 crc kubenswrapper[4902]: I1009 14:01:37.437630 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6hfnv\" (UniqueName: \"kubernetes.io/projected/04adaa94-05f3-4989-b5fa-a057f556aa56-kube-api-access-6hfnv\") pod \"metallb-operator-controller-manager-69f9c58987-qjtns\" (UID: \"04adaa94-05f3-4989-b5fa-a057f556aa56\") " pod="metallb-system/metallb-operator-controller-manager-69f9c58987-qjtns" Oct 09 14:01:37 crc kubenswrapper[4902]: I1009 14:01:37.437678 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/04adaa94-05f3-4989-b5fa-a057f556aa56-webhook-cert\") pod \"metallb-operator-controller-manager-69f9c58987-qjtns\" (UID: \"04adaa94-05f3-4989-b5fa-a057f556aa56\") " pod="metallb-system/metallb-operator-controller-manager-69f9c58987-qjtns" Oct 09 14:01:37 crc kubenswrapper[4902]: I1009 14:01:37.437735 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/04adaa94-05f3-4989-b5fa-a057f556aa56-apiservice-cert\") pod \"metallb-operator-controller-manager-69f9c58987-qjtns\" (UID: \"04adaa94-05f3-4989-b5fa-a057f556aa56\") " pod="metallb-system/metallb-operator-controller-manager-69f9c58987-qjtns" Oct 09 14:01:37 crc kubenswrapper[4902]: I1009 14:01:37.437800 4902 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b43c1099-b997-4be7-8390-a379e0dc5541-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 09 14:01:37 crc kubenswrapper[4902]: I1009 14:01:37.439036 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fd1d9312-7008-48ff-9437-af995ef9b88d-config" (OuterVolumeSpecName: "config") pod "fd1d9312-7008-48ff-9437-af995ef9b88d" (UID: "fd1d9312-7008-48ff-9437-af995ef9b88d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 14:01:37 crc kubenswrapper[4902]: I1009 14:01:37.439294 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b43c1099-b997-4be7-8390-a379e0dc5541-config" (OuterVolumeSpecName: "config") pod "b43c1099-b997-4be7-8390-a379e0dc5541" (UID: "b43c1099-b997-4be7-8390-a379e0dc5541"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 14:01:37 crc kubenswrapper[4902]: I1009 14:01:37.439438 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b43c1099-b997-4be7-8390-a379e0dc5541-client-ca" (OuterVolumeSpecName: "client-ca") pod "b43c1099-b997-4be7-8390-a379e0dc5541" (UID: "b43c1099-b997-4be7-8390-a379e0dc5541"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 14:01:37 crc kubenswrapper[4902]: I1009 14:01:37.439501 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b43c1099-b997-4be7-8390-a379e0dc5541-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "b43c1099-b997-4be7-8390-a379e0dc5541" (UID: "b43c1099-b997-4be7-8390-a379e0dc5541"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 14:01:37 crc kubenswrapper[4902]: I1009 14:01:37.439886 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fd1d9312-7008-48ff-9437-af995ef9b88d-client-ca" (OuterVolumeSpecName: "client-ca") pod "fd1d9312-7008-48ff-9437-af995ef9b88d" (UID: "fd1d9312-7008-48ff-9437-af995ef9b88d"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 14:01:37 crc kubenswrapper[4902]: I1009 14:01:37.452373 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd1d9312-7008-48ff-9437-af995ef9b88d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "fd1d9312-7008-48ff-9437-af995ef9b88d" (UID: "fd1d9312-7008-48ff-9437-af995ef9b88d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:01:37 crc kubenswrapper[4902]: I1009 14:01:37.452451 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd1d9312-7008-48ff-9437-af995ef9b88d-kube-api-access-dglzr" (OuterVolumeSpecName: "kube-api-access-dglzr") pod "fd1d9312-7008-48ff-9437-af995ef9b88d" (UID: "fd1d9312-7008-48ff-9437-af995ef9b88d"). InnerVolumeSpecName "kube-api-access-dglzr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 14:01:37 crc kubenswrapper[4902]: I1009 14:01:37.452537 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b43c1099-b997-4be7-8390-a379e0dc5541-kube-api-access-x7445" (OuterVolumeSpecName: "kube-api-access-x7445") pod "b43c1099-b997-4be7-8390-a379e0dc5541" (UID: "b43c1099-b997-4be7-8390-a379e0dc5541"). InnerVolumeSpecName "kube-api-access-x7445". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 14:01:37 crc kubenswrapper[4902]: I1009 14:01:37.468511 4902 generic.go:334] "Generic (PLEG): container finished" podID="fd1d9312-7008-48ff-9437-af995ef9b88d" containerID="8f7c06a18eb1809ce43b08167a468da978d5ba38c4227208e8096aab8f98a8f7" exitCode=0 Oct 09 14:01:37 crc kubenswrapper[4902]: I1009 14:01:37.468586 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ttk5q" event={"ID":"fd1d9312-7008-48ff-9437-af995ef9b88d","Type":"ContainerDied","Data":"8f7c06a18eb1809ce43b08167a468da978d5ba38c4227208e8096aab8f98a8f7"} Oct 09 14:01:37 crc kubenswrapper[4902]: I1009 14:01:37.468617 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ttk5q" event={"ID":"fd1d9312-7008-48ff-9437-af995ef9b88d","Type":"ContainerDied","Data":"e29a77e9c3f9c038472bd75fbbb248c31c1f4ed1dee2f5114ea639bc3f6353e0"} Oct 09 14:01:37 crc kubenswrapper[4902]: I1009 14:01:37.468635 4902 scope.go:117] "RemoveContainer" containerID="8f7c06a18eb1809ce43b08167a468da978d5ba38c4227208e8096aab8f98a8f7" Oct 09 14:01:37 crc kubenswrapper[4902]: I1009 14:01:37.468760 4902 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ttk5q" Oct 09 14:01:37 crc kubenswrapper[4902]: I1009 14:01:37.474699 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-69f9c58987-qjtns"] Oct 09 14:01:37 crc kubenswrapper[4902]: I1009 14:01:37.489289 4902 generic.go:334] "Generic (PLEG): container finished" podID="b43c1099-b997-4be7-8390-a379e0dc5541" containerID="884a23e1af7e9f186b35ecc934e438f38ca8a4bf640c2107bd94a46f6169dd49" exitCode=0 Oct 09 14:01:37 crc kubenswrapper[4902]: I1009 14:01:37.489342 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-6rzkc" event={"ID":"b43c1099-b997-4be7-8390-a379e0dc5541","Type":"ContainerDied","Data":"884a23e1af7e9f186b35ecc934e438f38ca8a4bf640c2107bd94a46f6169dd49"} Oct 09 14:01:37 crc kubenswrapper[4902]: I1009 14:01:37.489373 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-6rzkc" event={"ID":"b43c1099-b997-4be7-8390-a379e0dc5541","Type":"ContainerDied","Data":"0e96b9083f5147e79fa7f1872f3a9277e8731145d68738a744a6b4e4f50cbc02"} Oct 09 14:01:37 crc kubenswrapper[4902]: I1009 14:01:37.489473 4902 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-6rzkc" Oct 09 14:01:37 crc kubenswrapper[4902]: I1009 14:01:37.531738 4902 scope.go:117] "RemoveContainer" containerID="8f7c06a18eb1809ce43b08167a468da978d5ba38c4227208e8096aab8f98a8f7" Oct 09 14:01:37 crc kubenswrapper[4902]: E1009 14:01:37.534374 4902 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8f7c06a18eb1809ce43b08167a468da978d5ba38c4227208e8096aab8f98a8f7\": container with ID starting with 8f7c06a18eb1809ce43b08167a468da978d5ba38c4227208e8096aab8f98a8f7 not found: ID does not exist" containerID="8f7c06a18eb1809ce43b08167a468da978d5ba38c4227208e8096aab8f98a8f7" Oct 09 14:01:37 crc kubenswrapper[4902]: I1009 14:01:37.534443 4902 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f7c06a18eb1809ce43b08167a468da978d5ba38c4227208e8096aab8f98a8f7"} err="failed to get container status \"8f7c06a18eb1809ce43b08167a468da978d5ba38c4227208e8096aab8f98a8f7\": rpc error: code = NotFound desc = could not find container \"8f7c06a18eb1809ce43b08167a468da978d5ba38c4227208e8096aab8f98a8f7\": container with ID starting with 8f7c06a18eb1809ce43b08167a468da978d5ba38c4227208e8096aab8f98a8f7 not found: ID does not exist" Oct 09 14:01:37 crc kubenswrapper[4902]: I1009 14:01:37.534476 4902 scope.go:117] "RemoveContainer" containerID="884a23e1af7e9f186b35ecc934e438f38ca8a4bf640c2107bd94a46f6169dd49" Oct 09 14:01:37 crc kubenswrapper[4902]: I1009 14:01:37.538535 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6hfnv\" (UniqueName: \"kubernetes.io/projected/04adaa94-05f3-4989-b5fa-a057f556aa56-kube-api-access-6hfnv\") pod \"metallb-operator-controller-manager-69f9c58987-qjtns\" (UID: \"04adaa94-05f3-4989-b5fa-a057f556aa56\") " pod="metallb-system/metallb-operator-controller-manager-69f9c58987-qjtns" Oct 09 14:01:37 crc kubenswrapper[4902]: I1009 14:01:37.538583 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/04adaa94-05f3-4989-b5fa-a057f556aa56-webhook-cert\") pod \"metallb-operator-controller-manager-69f9c58987-qjtns\" (UID: \"04adaa94-05f3-4989-b5fa-a057f556aa56\") " pod="metallb-system/metallb-operator-controller-manager-69f9c58987-qjtns" Oct 09 14:01:37 crc kubenswrapper[4902]: I1009 14:01:37.538648 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/04adaa94-05f3-4989-b5fa-a057f556aa56-apiservice-cert\") pod \"metallb-operator-controller-manager-69f9c58987-qjtns\" (UID: \"04adaa94-05f3-4989-b5fa-a057f556aa56\") " pod="metallb-system/metallb-operator-controller-manager-69f9c58987-qjtns" Oct 09 14:01:37 crc kubenswrapper[4902]: I1009 14:01:37.538717 4902 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fd1d9312-7008-48ff-9437-af995ef9b88d-client-ca\") on node \"crc\" DevicePath \"\"" Oct 09 14:01:37 crc kubenswrapper[4902]: I1009 14:01:37.538732 4902 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fd1d9312-7008-48ff-9437-af995ef9b88d-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 09 14:01:37 crc kubenswrapper[4902]: I1009 14:01:37.538743 4902 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b43c1099-b997-4be7-8390-a379e0dc5541-client-ca\") on node \"crc\" DevicePath \"\"" Oct 09 14:01:37 crc kubenswrapper[4902]: I1009 14:01:37.538755 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dglzr\" (UniqueName: \"kubernetes.io/projected/fd1d9312-7008-48ff-9437-af995ef9b88d-kube-api-access-dglzr\") on node \"crc\" DevicePath \"\"" Oct 09 14:01:37 crc kubenswrapper[4902]: I1009 14:01:37.538771 4902 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fd1d9312-7008-48ff-9437-af995ef9b88d-config\") on node \"crc\" DevicePath \"\"" Oct 09 14:01:37 crc kubenswrapper[4902]: I1009 14:01:37.538782 4902 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b43c1099-b997-4be7-8390-a379e0dc5541-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Oct 09 14:01:37 crc kubenswrapper[4902]: I1009 14:01:37.538795 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7445\" (UniqueName: \"kubernetes.io/projected/b43c1099-b997-4be7-8390-a379e0dc5541-kube-api-access-x7445\") on node \"crc\" DevicePath \"\"" Oct 09 14:01:37 crc kubenswrapper[4902]: I1009 14:01:37.538811 4902 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b43c1099-b997-4be7-8390-a379e0dc5541-config\") on node \"crc\" DevicePath \"\"" Oct 09 14:01:37 crc kubenswrapper[4902]: I1009 14:01:37.547934 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/04adaa94-05f3-4989-b5fa-a057f556aa56-apiservice-cert\") pod \"metallb-operator-controller-manager-69f9c58987-qjtns\" (UID: \"04adaa94-05f3-4989-b5fa-a057f556aa56\") " pod="metallb-system/metallb-operator-controller-manager-69f9c58987-qjtns" Oct 09 14:01:37 crc kubenswrapper[4902]: I1009 14:01:37.550458 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/04adaa94-05f3-4989-b5fa-a057f556aa56-webhook-cert\") pod \"metallb-operator-controller-manager-69f9c58987-qjtns\" 
(UID: \"04adaa94-05f3-4989-b5fa-a057f556aa56\") " pod="metallb-system/metallb-operator-controller-manager-69f9c58987-qjtns" Oct 09 14:01:37 crc kubenswrapper[4902]: I1009 14:01:37.551726 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-ttk5q"] Oct 09 14:01:37 crc kubenswrapper[4902]: I1009 14:01:37.556811 4902 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-ttk5q"] Oct 09 14:01:37 crc kubenswrapper[4902]: I1009 14:01:37.563071 4902 scope.go:117] "RemoveContainer" containerID="884a23e1af7e9f186b35ecc934e438f38ca8a4bf640c2107bd94a46f6169dd49" Oct 09 14:01:37 crc kubenswrapper[4902]: E1009 14:01:37.563712 4902 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"884a23e1af7e9f186b35ecc934e438f38ca8a4bf640c2107bd94a46f6169dd49\": container with ID starting with 884a23e1af7e9f186b35ecc934e438f38ca8a4bf640c2107bd94a46f6169dd49 not found: ID does not exist" containerID="884a23e1af7e9f186b35ecc934e438f38ca8a4bf640c2107bd94a46f6169dd49" Oct 09 14:01:37 crc kubenswrapper[4902]: I1009 14:01:37.563763 4902 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"884a23e1af7e9f186b35ecc934e438f38ca8a4bf640c2107bd94a46f6169dd49"} err="failed to get container status \"884a23e1af7e9f186b35ecc934e438f38ca8a4bf640c2107bd94a46f6169dd49\": rpc error: code = NotFound desc = could not find container \"884a23e1af7e9f186b35ecc934e438f38ca8a4bf640c2107bd94a46f6169dd49\": container with ID starting with 884a23e1af7e9f186b35ecc934e438f38ca8a4bf640c2107bd94a46f6169dd49 not found: ID does not exist" Oct 09 14:01:37 crc kubenswrapper[4902]: I1009 14:01:37.568200 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6hfnv\" (UniqueName: \"kubernetes.io/projected/04adaa94-05f3-4989-b5fa-a057f556aa56-kube-api-access-6hfnv\") pod \"metallb-operator-controller-manager-69f9c58987-qjtns\" (UID: \"04adaa94-05f3-4989-b5fa-a057f556aa56\") " pod="metallb-system/metallb-operator-controller-manager-69f9c58987-qjtns" Oct 09 14:01:37 crc kubenswrapper[4902]: I1009 14:01:37.573292 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-6rzkc"] Oct 09 14:01:37 crc kubenswrapper[4902]: I1009 14:01:37.577690 4902 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-6rzkc"] Oct 09 14:01:37 crc kubenswrapper[4902]: I1009 14:01:37.720947 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-69f9c58987-qjtns" Oct 09 14:01:37 crc kubenswrapper[4902]: I1009 14:01:37.862833 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-56b4cd547-vqwzj"] Oct 09 14:01:37 crc kubenswrapper[4902]: E1009 14:01:37.863049 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd1d9312-7008-48ff-9437-af995ef9b88d" containerName="route-controller-manager" Oct 09 14:01:37 crc kubenswrapper[4902]: I1009 14:01:37.863060 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd1d9312-7008-48ff-9437-af995ef9b88d" containerName="route-controller-manager" Oct 09 14:01:37 crc kubenswrapper[4902]: I1009 14:01:37.863174 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd1d9312-7008-48ff-9437-af995ef9b88d" containerName="route-controller-manager" Oct 09 14:01:37 crc kubenswrapper[4902]: I1009 14:01:37.863585 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-56b4cd547-vqwzj" Oct 09 14:01:37 crc kubenswrapper[4902]: I1009 14:01:37.865755 4902 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-x67b8" Oct 09 14:01:37 crc kubenswrapper[4902]: I1009 14:01:37.866033 4902 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Oct 09 14:01:37 crc kubenswrapper[4902]: I1009 14:01:37.866152 4902 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Oct 09 14:01:37 crc kubenswrapper[4902]: I1009 14:01:37.902468 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-56b4cd547-vqwzj"] Oct 09 14:01:37 crc kubenswrapper[4902]: I1009 14:01:37.947313 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7jb7w\" (UniqueName: \"kubernetes.io/projected/17bd8034-bc7c-4eaa-9f47-74ca097940bd-kube-api-access-7jb7w\") pod \"metallb-operator-webhook-server-56b4cd547-vqwzj\" (UID: \"17bd8034-bc7c-4eaa-9f47-74ca097940bd\") " pod="metallb-system/metallb-operator-webhook-server-56b4cd547-vqwzj" Oct 09 14:01:37 crc kubenswrapper[4902]: I1009 14:01:37.947366 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/17bd8034-bc7c-4eaa-9f47-74ca097940bd-apiservice-cert\") pod \"metallb-operator-webhook-server-56b4cd547-vqwzj\" (UID: \"17bd8034-bc7c-4eaa-9f47-74ca097940bd\") " pod="metallb-system/metallb-operator-webhook-server-56b4cd547-vqwzj" Oct 09 14:01:37 crc kubenswrapper[4902]: I1009 14:01:37.947433 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/17bd8034-bc7c-4eaa-9f47-74ca097940bd-webhook-cert\") pod \"metallb-operator-webhook-server-56b4cd547-vqwzj\" (UID: \"17bd8034-bc7c-4eaa-9f47-74ca097940bd\") " pod="metallb-system/metallb-operator-webhook-server-56b4cd547-vqwzj" Oct 09 14:01:38 crc kubenswrapper[4902]: I1009 14:01:38.048324 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7jb7w\" (UniqueName: \"kubernetes.io/projected/17bd8034-bc7c-4eaa-9f47-74ca097940bd-kube-api-access-7jb7w\") pod \"metallb-operator-webhook-server-56b4cd547-vqwzj\" (UID: 
\"17bd8034-bc7c-4eaa-9f47-74ca097940bd\") " pod="metallb-system/metallb-operator-webhook-server-56b4cd547-vqwzj" Oct 09 14:01:38 crc kubenswrapper[4902]: I1009 14:01:38.048396 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/17bd8034-bc7c-4eaa-9f47-74ca097940bd-apiservice-cert\") pod \"metallb-operator-webhook-server-56b4cd547-vqwzj\" (UID: \"17bd8034-bc7c-4eaa-9f47-74ca097940bd\") " pod="metallb-system/metallb-operator-webhook-server-56b4cd547-vqwzj" Oct 09 14:01:38 crc kubenswrapper[4902]: I1009 14:01:38.048478 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/17bd8034-bc7c-4eaa-9f47-74ca097940bd-webhook-cert\") pod \"metallb-operator-webhook-server-56b4cd547-vqwzj\" (UID: \"17bd8034-bc7c-4eaa-9f47-74ca097940bd\") " pod="metallb-system/metallb-operator-webhook-server-56b4cd547-vqwzj" Oct 09 14:01:38 crc kubenswrapper[4902]: I1009 14:01:38.056160 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/17bd8034-bc7c-4eaa-9f47-74ca097940bd-webhook-cert\") pod \"metallb-operator-webhook-server-56b4cd547-vqwzj\" (UID: \"17bd8034-bc7c-4eaa-9f47-74ca097940bd\") " pod="metallb-system/metallb-operator-webhook-server-56b4cd547-vqwzj" Oct 09 14:01:38 crc kubenswrapper[4902]: I1009 14:01:38.056654 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/17bd8034-bc7c-4eaa-9f47-74ca097940bd-apiservice-cert\") pod \"metallb-operator-webhook-server-56b4cd547-vqwzj\" (UID: \"17bd8034-bc7c-4eaa-9f47-74ca097940bd\") " pod="metallb-system/metallb-operator-webhook-server-56b4cd547-vqwzj" Oct 09 14:01:38 crc kubenswrapper[4902]: I1009 14:01:38.068605 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7jb7w\" (UniqueName: \"kubernetes.io/projected/17bd8034-bc7c-4eaa-9f47-74ca097940bd-kube-api-access-7jb7w\") pod \"metallb-operator-webhook-server-56b4cd547-vqwzj\" (UID: \"17bd8034-bc7c-4eaa-9f47-74ca097940bd\") " pod="metallb-system/metallb-operator-webhook-server-56b4cd547-vqwzj" Oct 09 14:01:38 crc kubenswrapper[4902]: I1009 14:01:38.091510 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-68645bd78b-6c2v6"] Oct 09 14:01:38 crc kubenswrapper[4902]: I1009 14:01:38.092252 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-68645bd78b-6c2v6" Oct 09 14:01:38 crc kubenswrapper[4902]: I1009 14:01:38.096615 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Oct 09 14:01:38 crc kubenswrapper[4902]: I1009 14:01:38.097044 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Oct 09 14:01:38 crc kubenswrapper[4902]: I1009 14:01:38.097287 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Oct 09 14:01:38 crc kubenswrapper[4902]: I1009 14:01:38.097445 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Oct 09 14:01:38 crc kubenswrapper[4902]: I1009 14:01:38.100182 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Oct 09 14:01:38 crc kubenswrapper[4902]: I1009 14:01:38.100742 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Oct 09 14:01:38 crc kubenswrapper[4902]: I1009 14:01:38.105263 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Oct 09 14:01:38 crc kubenswrapper[4902]: I1009 14:01:38.116385 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-68645bd78b-6c2v6"] Oct 09 14:01:38 crc kubenswrapper[4902]: I1009 14:01:38.149770 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5d4d44d456-jnhdr"] Oct 09 14:01:38 crc kubenswrapper[4902]: I1009 14:01:38.150683 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c5ed40fb-7451-4cf2-8c84-00f2cce67b1f-proxy-ca-bundles\") pod \"controller-manager-68645bd78b-6c2v6\" (UID: \"c5ed40fb-7451-4cf2-8c84-00f2cce67b1f\") " pod="openshift-controller-manager/controller-manager-68645bd78b-6c2v6" Oct 09 14:01:38 crc kubenswrapper[4902]: I1009 14:01:38.150734 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wwf2n\" (UniqueName: \"kubernetes.io/projected/c5ed40fb-7451-4cf2-8c84-00f2cce67b1f-kube-api-access-wwf2n\") pod \"controller-manager-68645bd78b-6c2v6\" (UID: \"c5ed40fb-7451-4cf2-8c84-00f2cce67b1f\") " pod="openshift-controller-manager/controller-manager-68645bd78b-6c2v6" Oct 09 14:01:38 crc kubenswrapper[4902]: I1009 14:01:38.150791 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c5ed40fb-7451-4cf2-8c84-00f2cce67b1f-config\") pod \"controller-manager-68645bd78b-6c2v6\" (UID: \"c5ed40fb-7451-4cf2-8c84-00f2cce67b1f\") " pod="openshift-controller-manager/controller-manager-68645bd78b-6c2v6" Oct 09 14:01:38 crc kubenswrapper[4902]: I1009 14:01:38.150820 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c5ed40fb-7451-4cf2-8c84-00f2cce67b1f-serving-cert\") pod \"controller-manager-68645bd78b-6c2v6\" (UID: \"c5ed40fb-7451-4cf2-8c84-00f2cce67b1f\") " pod="openshift-controller-manager/controller-manager-68645bd78b-6c2v6" Oct 09 
14:01:38 crc kubenswrapper[4902]: I1009 14:01:38.150877 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c5ed40fb-7451-4cf2-8c84-00f2cce67b1f-client-ca\") pod \"controller-manager-68645bd78b-6c2v6\" (UID: \"c5ed40fb-7451-4cf2-8c84-00f2cce67b1f\") " pod="openshift-controller-manager/controller-manager-68645bd78b-6c2v6" Oct 09 14:01:38 crc kubenswrapper[4902]: I1009 14:01:38.151063 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5d4d44d456-jnhdr" Oct 09 14:01:38 crc kubenswrapper[4902]: I1009 14:01:38.158080 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Oct 09 14:01:38 crc kubenswrapper[4902]: I1009 14:01:38.158107 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Oct 09 14:01:38 crc kubenswrapper[4902]: I1009 14:01:38.158268 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Oct 09 14:01:38 crc kubenswrapper[4902]: I1009 14:01:38.158676 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Oct 09 14:01:38 crc kubenswrapper[4902]: I1009 14:01:38.158827 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Oct 09 14:01:38 crc kubenswrapper[4902]: I1009 14:01:38.159117 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Oct 09 14:01:38 crc kubenswrapper[4902]: I1009 14:01:38.160116 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5d4d44d456-jnhdr"] Oct 09 14:01:38 crc kubenswrapper[4902]: I1009 14:01:38.248965 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-56b4cd547-vqwzj" Oct 09 14:01:38 crc kubenswrapper[4902]: I1009 14:01:38.252460 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c5ed40fb-7451-4cf2-8c84-00f2cce67b1f-config\") pod \"controller-manager-68645bd78b-6c2v6\" (UID: \"c5ed40fb-7451-4cf2-8c84-00f2cce67b1f\") " pod="openshift-controller-manager/controller-manager-68645bd78b-6c2v6" Oct 09 14:01:38 crc kubenswrapper[4902]: I1009 14:01:38.252518 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e098dc0b-3baf-4fc8-a84e-5e1dd8237dd4-config\") pod \"route-controller-manager-5d4d44d456-jnhdr\" (UID: \"e098dc0b-3baf-4fc8-a84e-5e1dd8237dd4\") " pod="openshift-route-controller-manager/route-controller-manager-5d4d44d456-jnhdr" Oct 09 14:01:38 crc kubenswrapper[4902]: I1009 14:01:38.252557 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c5ed40fb-7451-4cf2-8c84-00f2cce67b1f-serving-cert\") pod \"controller-manager-68645bd78b-6c2v6\" (UID: \"c5ed40fb-7451-4cf2-8c84-00f2cce67b1f\") " pod="openshift-controller-manager/controller-manager-68645bd78b-6c2v6" Oct 09 14:01:38 crc kubenswrapper[4902]: I1009 14:01:38.252613 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e098dc0b-3baf-4fc8-a84e-5e1dd8237dd4-client-ca\") pod \"route-controller-manager-5d4d44d456-jnhdr\" (UID: \"e098dc0b-3baf-4fc8-a84e-5e1dd8237dd4\") " pod="openshift-route-controller-manager/route-controller-manager-5d4d44d456-jnhdr" Oct 09 14:01:38 crc kubenswrapper[4902]: I1009 14:01:38.252677 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fbtkh\" (UniqueName: \"kubernetes.io/projected/e098dc0b-3baf-4fc8-a84e-5e1dd8237dd4-kube-api-access-fbtkh\") pod \"route-controller-manager-5d4d44d456-jnhdr\" (UID: \"e098dc0b-3baf-4fc8-a84e-5e1dd8237dd4\") " pod="openshift-route-controller-manager/route-controller-manager-5d4d44d456-jnhdr" Oct 09 14:01:38 crc kubenswrapper[4902]: I1009 14:01:38.252708 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c5ed40fb-7451-4cf2-8c84-00f2cce67b1f-client-ca\") pod \"controller-manager-68645bd78b-6c2v6\" (UID: \"c5ed40fb-7451-4cf2-8c84-00f2cce67b1f\") " pod="openshift-controller-manager/controller-manager-68645bd78b-6c2v6" Oct 09 14:01:38 crc kubenswrapper[4902]: I1009 14:01:38.252759 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c5ed40fb-7451-4cf2-8c84-00f2cce67b1f-proxy-ca-bundles\") pod \"controller-manager-68645bd78b-6c2v6\" (UID: \"c5ed40fb-7451-4cf2-8c84-00f2cce67b1f\") " pod="openshift-controller-manager/controller-manager-68645bd78b-6c2v6" Oct 09 14:01:38 crc kubenswrapper[4902]: I1009 14:01:38.252783 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e098dc0b-3baf-4fc8-a84e-5e1dd8237dd4-serving-cert\") pod \"route-controller-manager-5d4d44d456-jnhdr\" (UID: \"e098dc0b-3baf-4fc8-a84e-5e1dd8237dd4\") " 
pod="openshift-route-controller-manager/route-controller-manager-5d4d44d456-jnhdr" Oct 09 14:01:38 crc kubenswrapper[4902]: I1009 14:01:38.252819 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wwf2n\" (UniqueName: \"kubernetes.io/projected/c5ed40fb-7451-4cf2-8c84-00f2cce67b1f-kube-api-access-wwf2n\") pod \"controller-manager-68645bd78b-6c2v6\" (UID: \"c5ed40fb-7451-4cf2-8c84-00f2cce67b1f\") " pod="openshift-controller-manager/controller-manager-68645bd78b-6c2v6" Oct 09 14:01:38 crc kubenswrapper[4902]: I1009 14:01:38.253699 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c5ed40fb-7451-4cf2-8c84-00f2cce67b1f-client-ca\") pod \"controller-manager-68645bd78b-6c2v6\" (UID: \"c5ed40fb-7451-4cf2-8c84-00f2cce67b1f\") " pod="openshift-controller-manager/controller-manager-68645bd78b-6c2v6" Oct 09 14:01:38 crc kubenswrapper[4902]: I1009 14:01:38.253872 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c5ed40fb-7451-4cf2-8c84-00f2cce67b1f-proxy-ca-bundles\") pod \"controller-manager-68645bd78b-6c2v6\" (UID: \"c5ed40fb-7451-4cf2-8c84-00f2cce67b1f\") " pod="openshift-controller-manager/controller-manager-68645bd78b-6c2v6" Oct 09 14:01:38 crc kubenswrapper[4902]: I1009 14:01:38.254016 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c5ed40fb-7451-4cf2-8c84-00f2cce67b1f-config\") pod \"controller-manager-68645bd78b-6c2v6\" (UID: \"c5ed40fb-7451-4cf2-8c84-00f2cce67b1f\") " pod="openshift-controller-manager/controller-manager-68645bd78b-6c2v6" Oct 09 14:01:38 crc kubenswrapper[4902]: I1009 14:01:38.257331 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c5ed40fb-7451-4cf2-8c84-00f2cce67b1f-serving-cert\") pod \"controller-manager-68645bd78b-6c2v6\" (UID: \"c5ed40fb-7451-4cf2-8c84-00f2cce67b1f\") " pod="openshift-controller-manager/controller-manager-68645bd78b-6c2v6" Oct 09 14:01:38 crc kubenswrapper[4902]: I1009 14:01:38.273378 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-69f9c58987-qjtns"] Oct 09 14:01:38 crc kubenswrapper[4902]: I1009 14:01:38.276614 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wwf2n\" (UniqueName: \"kubernetes.io/projected/c5ed40fb-7451-4cf2-8c84-00f2cce67b1f-kube-api-access-wwf2n\") pod \"controller-manager-68645bd78b-6c2v6\" (UID: \"c5ed40fb-7451-4cf2-8c84-00f2cce67b1f\") " pod="openshift-controller-manager/controller-manager-68645bd78b-6c2v6" Oct 09 14:01:38 crc kubenswrapper[4902]: I1009 14:01:38.354204 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fbtkh\" (UniqueName: \"kubernetes.io/projected/e098dc0b-3baf-4fc8-a84e-5e1dd8237dd4-kube-api-access-fbtkh\") pod \"route-controller-manager-5d4d44d456-jnhdr\" (UID: \"e098dc0b-3baf-4fc8-a84e-5e1dd8237dd4\") " pod="openshift-route-controller-manager/route-controller-manager-5d4d44d456-jnhdr" Oct 09 14:01:38 crc kubenswrapper[4902]: I1009 14:01:38.354290 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e098dc0b-3baf-4fc8-a84e-5e1dd8237dd4-serving-cert\") pod \"route-controller-manager-5d4d44d456-jnhdr\" (UID: 
\"e098dc0b-3baf-4fc8-a84e-5e1dd8237dd4\") " pod="openshift-route-controller-manager/route-controller-manager-5d4d44d456-jnhdr" Oct 09 14:01:38 crc kubenswrapper[4902]: I1009 14:01:38.354350 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e098dc0b-3baf-4fc8-a84e-5e1dd8237dd4-config\") pod \"route-controller-manager-5d4d44d456-jnhdr\" (UID: \"e098dc0b-3baf-4fc8-a84e-5e1dd8237dd4\") " pod="openshift-route-controller-manager/route-controller-manager-5d4d44d456-jnhdr" Oct 09 14:01:38 crc kubenswrapper[4902]: I1009 14:01:38.354423 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e098dc0b-3baf-4fc8-a84e-5e1dd8237dd4-client-ca\") pod \"route-controller-manager-5d4d44d456-jnhdr\" (UID: \"e098dc0b-3baf-4fc8-a84e-5e1dd8237dd4\") " pod="openshift-route-controller-manager/route-controller-manager-5d4d44d456-jnhdr" Oct 09 14:01:38 crc kubenswrapper[4902]: I1009 14:01:38.355519 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e098dc0b-3baf-4fc8-a84e-5e1dd8237dd4-client-ca\") pod \"route-controller-manager-5d4d44d456-jnhdr\" (UID: \"e098dc0b-3baf-4fc8-a84e-5e1dd8237dd4\") " pod="openshift-route-controller-manager/route-controller-manager-5d4d44d456-jnhdr" Oct 09 14:01:38 crc kubenswrapper[4902]: I1009 14:01:38.356232 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e098dc0b-3baf-4fc8-a84e-5e1dd8237dd4-config\") pod \"route-controller-manager-5d4d44d456-jnhdr\" (UID: \"e098dc0b-3baf-4fc8-a84e-5e1dd8237dd4\") " pod="openshift-route-controller-manager/route-controller-manager-5d4d44d456-jnhdr" Oct 09 14:01:38 crc kubenswrapper[4902]: I1009 14:01:38.358227 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e098dc0b-3baf-4fc8-a84e-5e1dd8237dd4-serving-cert\") pod \"route-controller-manager-5d4d44d456-jnhdr\" (UID: \"e098dc0b-3baf-4fc8-a84e-5e1dd8237dd4\") " pod="openshift-route-controller-manager/route-controller-manager-5d4d44d456-jnhdr" Oct 09 14:01:38 crc kubenswrapper[4902]: I1009 14:01:38.384564 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fbtkh\" (UniqueName: \"kubernetes.io/projected/e098dc0b-3baf-4fc8-a84e-5e1dd8237dd4-kube-api-access-fbtkh\") pod \"route-controller-manager-5d4d44d456-jnhdr\" (UID: \"e098dc0b-3baf-4fc8-a84e-5e1dd8237dd4\") " pod="openshift-route-controller-manager/route-controller-manager-5d4d44d456-jnhdr" Oct 09 14:01:38 crc kubenswrapper[4902]: I1009 14:01:38.413229 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-68645bd78b-6c2v6" Oct 09 14:01:38 crc kubenswrapper[4902]: I1009 14:01:38.475839 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5d4d44d456-jnhdr" Oct 09 14:01:38 crc kubenswrapper[4902]: I1009 14:01:38.506052 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-69f9c58987-qjtns" event={"ID":"04adaa94-05f3-4989-b5fa-a057f556aa56","Type":"ContainerStarted","Data":"caa4d8e328c75f7234d23c229a14ae2d06ad418ee11b91b738577617bd73bebf"} Oct 09 14:01:38 crc kubenswrapper[4902]: I1009 14:01:38.521123 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-56b4cd547-vqwzj"] Oct 09 14:01:38 crc kubenswrapper[4902]: W1009 14:01:38.529699 4902 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod17bd8034_bc7c_4eaa_9f47_74ca097940bd.slice/crio-d489500b3afabc31cac1269484c50555d4171eb39c9bab3c0d4ade5070ca8cd9 WatchSource:0}: Error finding container d489500b3afabc31cac1269484c50555d4171eb39c9bab3c0d4ade5070ca8cd9: Status 404 returned error can't find the container with id d489500b3afabc31cac1269484c50555d4171eb39c9bab3c0d4ade5070ca8cd9 Oct 09 14:01:38 crc kubenswrapper[4902]: I1009 14:01:38.694702 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-68645bd78b-6c2v6"] Oct 09 14:01:38 crc kubenswrapper[4902]: I1009 14:01:38.764919 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5d4d44d456-jnhdr"] Oct 09 14:01:38 crc kubenswrapper[4902]: W1009 14:01:38.772012 4902 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode098dc0b_3baf_4fc8_a84e_5e1dd8237dd4.slice/crio-8c95f2c37286413ce0d4a685c5189b7f30135e116737d3f7c2859aa8a39a1241 WatchSource:0}: Error finding container 8c95f2c37286413ce0d4a685c5189b7f30135e116737d3f7c2859aa8a39a1241: Status 404 returned error can't find the container with id 8c95f2c37286413ce0d4a685c5189b7f30135e116737d3f7c2859aa8a39a1241 Oct 09 14:01:39 crc kubenswrapper[4902]: I1009 14:01:39.522403 4902 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b43c1099-b997-4be7-8390-a379e0dc5541" path="/var/lib/kubelet/pods/b43c1099-b997-4be7-8390-a379e0dc5541/volumes" Oct 09 14:01:39 crc kubenswrapper[4902]: I1009 14:01:39.523576 4902 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fd1d9312-7008-48ff-9437-af995ef9b88d" path="/var/lib/kubelet/pods/fd1d9312-7008-48ff-9437-af995ef9b88d/volumes" Oct 09 14:01:39 crc kubenswrapper[4902]: I1009 14:01:39.524099 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-5d4d44d456-jnhdr" Oct 09 14:01:39 crc kubenswrapper[4902]: I1009 14:01:39.524134 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-56b4cd547-vqwzj" event={"ID":"17bd8034-bc7c-4eaa-9f47-74ca097940bd","Type":"ContainerStarted","Data":"d489500b3afabc31cac1269484c50555d4171eb39c9bab3c0d4ade5070ca8cd9"} Oct 09 14:01:39 crc kubenswrapper[4902]: I1009 14:01:39.524156 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-68645bd78b-6c2v6" Oct 09 14:01:39 crc kubenswrapper[4902]: I1009 14:01:39.524170 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-route-controller-manager/route-controller-manager-5d4d44d456-jnhdr" event={"ID":"e098dc0b-3baf-4fc8-a84e-5e1dd8237dd4","Type":"ContainerStarted","Data":"3b83328d568bbec65c266668dde124dc076c4520ad7b63cd5b2f2a59429ef855"} Oct 09 14:01:39 crc kubenswrapper[4902]: I1009 14:01:39.524184 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5d4d44d456-jnhdr" event={"ID":"e098dc0b-3baf-4fc8-a84e-5e1dd8237dd4","Type":"ContainerStarted","Data":"8c95f2c37286413ce0d4a685c5189b7f30135e116737d3f7c2859aa8a39a1241"} Oct 09 14:01:39 crc kubenswrapper[4902]: I1009 14:01:39.524196 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-68645bd78b-6c2v6" event={"ID":"c5ed40fb-7451-4cf2-8c84-00f2cce67b1f","Type":"ContainerStarted","Data":"6fd66974c7e2429dfb0425ce566275bca602fb2c77832f87cd6a48b1d3bafb58"} Oct 09 14:01:39 crc kubenswrapper[4902]: I1009 14:01:39.524234 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-5d4d44d456-jnhdr" Oct 09 14:01:39 crc kubenswrapper[4902]: I1009 14:01:39.524248 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-68645bd78b-6c2v6" event={"ID":"c5ed40fb-7451-4cf2-8c84-00f2cce67b1f","Type":"ContainerStarted","Data":"9ef184bf8e1dfd3fa8227b0bfc23ae4122e2358222f6d7b48d40cba6b4863a12"} Oct 09 14:01:39 crc kubenswrapper[4902]: I1009 14:01:39.525538 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-68645bd78b-6c2v6" Oct 09 14:01:39 crc kubenswrapper[4902]: I1009 14:01:39.561173 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-5d4d44d456-jnhdr" podStartSLOduration=1.56115553 podStartE2EDuration="1.56115553s" podCreationTimestamp="2025-10-09 14:01:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 14:01:39.534190832 +0000 UTC m=+646.732049916" watchObservedRunningTime="2025-10-09 14:01:39.56115553 +0000 UTC m=+646.759014594" Oct 09 14:01:39 crc kubenswrapper[4902]: I1009 14:01:39.562389 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-68645bd78b-6c2v6" podStartSLOduration=1.562379505 podStartE2EDuration="1.562379505s" podCreationTimestamp="2025-10-09 14:01:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 14:01:39.560453479 +0000 UTC m=+646.758312563" watchObservedRunningTime="2025-10-09 14:01:39.562379505 +0000 UTC m=+646.760238579" Oct 09 14:01:42 crc kubenswrapper[4902]: I1009 14:01:42.109917 4902 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Oct 09 14:01:44 crc kubenswrapper[4902]: I1009 14:01:44.552733 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-56b4cd547-vqwzj" event={"ID":"17bd8034-bc7c-4eaa-9f47-74ca097940bd","Type":"ContainerStarted","Data":"4f0d42edd9a5e66cbcb58343e3d65bc217936795a7ba73ef6d434dcc2a8ea944"} Oct 09 14:01:44 crc kubenswrapper[4902]: I1009 14:01:44.555617 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="metallb-system/metallb-operator-controller-manager-69f9c58987-qjtns" event={"ID":"04adaa94-05f3-4989-b5fa-a057f556aa56","Type":"ContainerStarted","Data":"098d004d96ada0107949ba90b545f125cdb1085d3b28afea573bd67013bcd0b1"} Oct 09 14:01:44 crc kubenswrapper[4902]: I1009 14:01:44.555736 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-56b4cd547-vqwzj" Oct 09 14:01:44 crc kubenswrapper[4902]: I1009 14:01:44.555827 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-69f9c58987-qjtns" Oct 09 14:01:44 crc kubenswrapper[4902]: I1009 14:01:44.573661 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-56b4cd547-vqwzj" podStartSLOduration=2.251611579 podStartE2EDuration="7.573637101s" podCreationTimestamp="2025-10-09 14:01:37 +0000 UTC" firstStartedPulling="2025-10-09 14:01:38.537792 +0000 UTC m=+645.735651064" lastFinishedPulling="2025-10-09 14:01:43.859817522 +0000 UTC m=+651.057676586" observedRunningTime="2025-10-09 14:01:44.570786838 +0000 UTC m=+651.768645902" watchObservedRunningTime="2025-10-09 14:01:44.573637101 +0000 UTC m=+651.771496165" Oct 09 14:01:58 crc kubenswrapper[4902]: I1009 14:01:58.254262 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-56b4cd547-vqwzj" Oct 09 14:01:58 crc kubenswrapper[4902]: I1009 14:01:58.272922 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-69f9c58987-qjtns" podStartSLOduration=15.730165824 podStartE2EDuration="21.272903795s" podCreationTimestamp="2025-10-09 14:01:37 +0000 UTC" firstStartedPulling="2025-10-09 14:01:38.282132583 +0000 UTC m=+645.479991647" lastFinishedPulling="2025-10-09 14:01:43.824870554 +0000 UTC m=+651.022729618" observedRunningTime="2025-10-09 14:01:44.60895817 +0000 UTC m=+651.806817254" watchObservedRunningTime="2025-10-09 14:01:58.272903795 +0000 UTC m=+665.470762859" Oct 09 14:02:17 crc kubenswrapper[4902]: I1009 14:02:17.724142 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-69f9c58987-qjtns" Oct 09 14:02:18 crc kubenswrapper[4902]: I1009 14:02:18.557191 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-sw9n6"] Oct 09 14:02:18 crc kubenswrapper[4902]: I1009 14:02:18.559977 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-sw9n6" Oct 09 14:02:18 crc kubenswrapper[4902]: I1009 14:02:18.562351 4902 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-tss6n" Oct 09 14:02:18 crc kubenswrapper[4902]: I1009 14:02:18.562384 4902 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Oct 09 14:02:18 crc kubenswrapper[4902]: I1009 14:02:18.563541 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Oct 09 14:02:18 crc kubenswrapper[4902]: I1009 14:02:18.564207 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-64bf5d555-v4t7k"] Oct 09 14:02:18 crc kubenswrapper[4902]: I1009 14:02:18.566067 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-v4t7k" Oct 09 14:02:18 crc kubenswrapper[4902]: I1009 14:02:18.570277 4902 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Oct 09 14:02:18 crc kubenswrapper[4902]: I1009 14:02:18.580850 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-64bf5d555-v4t7k"] Oct 09 14:02:18 crc kubenswrapper[4902]: I1009 14:02:18.673458 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-pbqrp"] Oct 09 14:02:18 crc kubenswrapper[4902]: I1009 14:02:18.674818 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-pbqrp" Oct 09 14:02:18 crc kubenswrapper[4902]: I1009 14:02:18.678035 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-68d546b9d8-m47st"] Oct 09 14:02:18 crc kubenswrapper[4902]: I1009 14:02:18.679183 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-68d546b9d8-m47st" Oct 09 14:02:18 crc kubenswrapper[4902]: I1009 14:02:18.680072 4902 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Oct 09 14:02:18 crc kubenswrapper[4902]: I1009 14:02:18.681183 4902 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Oct 09 14:02:18 crc kubenswrapper[4902]: I1009 14:02:18.681550 4902 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-xm587" Oct 09 14:02:18 crc kubenswrapper[4902]: I1009 14:02:18.681707 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Oct 09 14:02:18 crc kubenswrapper[4902]: I1009 14:02:18.681865 4902 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Oct 09 14:02:18 crc kubenswrapper[4902]: I1009 14:02:18.686042 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-68d546b9d8-m47st"] Oct 09 14:02:18 crc kubenswrapper[4902]: I1009 14:02:18.695336 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/693846cb-0606-4818-b246-e6940fa26802-frr-sockets\") pod \"frr-k8s-sw9n6\" (UID: \"693846cb-0606-4818-b246-e6940fa26802\") " pod="metallb-system/frr-k8s-sw9n6" Oct 09 14:02:18 crc kubenswrapper[4902]: I1009 14:02:18.695642 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/693846cb-0606-4818-b246-e6940fa26802-metrics\") pod \"frr-k8s-sw9n6\" (UID: \"693846cb-0606-4818-b246-e6940fa26802\") " pod="metallb-system/frr-k8s-sw9n6" Oct 09 14:02:18 crc kubenswrapper[4902]: I1009 14:02:18.695794 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/693846cb-0606-4818-b246-e6940fa26802-frr-startup\") pod \"frr-k8s-sw9n6\" (UID: \"693846cb-0606-4818-b246-e6940fa26802\") " pod="metallb-system/frr-k8s-sw9n6" Oct 09 14:02:18 crc kubenswrapper[4902]: I1009 14:02:18.695933 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/693846cb-0606-4818-b246-e6940fa26802-frr-conf\") pod \"frr-k8s-sw9n6\" 
(UID: \"693846cb-0606-4818-b246-e6940fa26802\") " pod="metallb-system/frr-k8s-sw9n6" Oct 09 14:02:18 crc kubenswrapper[4902]: I1009 14:02:18.696043 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/693846cb-0606-4818-b246-e6940fa26802-reloader\") pod \"frr-k8s-sw9n6\" (UID: \"693846cb-0606-4818-b246-e6940fa26802\") " pod="metallb-system/frr-k8s-sw9n6" Oct 09 14:02:18 crc kubenswrapper[4902]: I1009 14:02:18.696145 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ltpwx\" (UniqueName: \"kubernetes.io/projected/693846cb-0606-4818-b246-e6940fa26802-kube-api-access-ltpwx\") pod \"frr-k8s-sw9n6\" (UID: \"693846cb-0606-4818-b246-e6940fa26802\") " pod="metallb-system/frr-k8s-sw9n6" Oct 09 14:02:18 crc kubenswrapper[4902]: I1009 14:02:18.696246 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c788e9d2-cc9c-4dd8-b65d-f422358e0510-cert\") pod \"frr-k8s-webhook-server-64bf5d555-v4t7k\" (UID: \"c788e9d2-cc9c-4dd8-b65d-f422358e0510\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-v4t7k" Oct 09 14:02:18 crc kubenswrapper[4902]: I1009 14:02:18.696338 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tffpb\" (UniqueName: \"kubernetes.io/projected/c788e9d2-cc9c-4dd8-b65d-f422358e0510-kube-api-access-tffpb\") pod \"frr-k8s-webhook-server-64bf5d555-v4t7k\" (UID: \"c788e9d2-cc9c-4dd8-b65d-f422358e0510\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-v4t7k" Oct 09 14:02:18 crc kubenswrapper[4902]: I1009 14:02:18.696525 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/693846cb-0606-4818-b246-e6940fa26802-metrics-certs\") pod \"frr-k8s-sw9n6\" (UID: \"693846cb-0606-4818-b246-e6940fa26802\") " pod="metallb-system/frr-k8s-sw9n6" Oct 09 14:02:18 crc kubenswrapper[4902]: I1009 14:02:18.797721 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v5l4s\" (UniqueName: \"kubernetes.io/projected/bf00f4f5-2086-46a3-b460-f55dd00e2507-kube-api-access-v5l4s\") pod \"controller-68d546b9d8-m47st\" (UID: \"bf00f4f5-2086-46a3-b460-f55dd00e2507\") " pod="metallb-system/controller-68d546b9d8-m47st" Oct 09 14:02:18 crc kubenswrapper[4902]: I1009 14:02:18.797803 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/693846cb-0606-4818-b246-e6940fa26802-frr-startup\") pod \"frr-k8s-sw9n6\" (UID: \"693846cb-0606-4818-b246-e6940fa26802\") " pod="metallb-system/frr-k8s-sw9n6" Oct 09 14:02:18 crc kubenswrapper[4902]: I1009 14:02:18.797835 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/693846cb-0606-4818-b246-e6940fa26802-frr-conf\") pod \"frr-k8s-sw9n6\" (UID: \"693846cb-0606-4818-b246-e6940fa26802\") " pod="metallb-system/frr-k8s-sw9n6" Oct 09 14:02:18 crc kubenswrapper[4902]: I1009 14:02:18.797871 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/693846cb-0606-4818-b246-e6940fa26802-reloader\") pod \"frr-k8s-sw9n6\" (UID: \"693846cb-0606-4818-b246-e6940fa26802\") " 
pod="metallb-system/frr-k8s-sw9n6" Oct 09 14:02:18 crc kubenswrapper[4902]: I1009 14:02:18.797895 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ltpwx\" (UniqueName: \"kubernetes.io/projected/693846cb-0606-4818-b246-e6940fa26802-kube-api-access-ltpwx\") pod \"frr-k8s-sw9n6\" (UID: \"693846cb-0606-4818-b246-e6940fa26802\") " pod="metallb-system/frr-k8s-sw9n6" Oct 09 14:02:18 crc kubenswrapper[4902]: I1009 14:02:18.797936 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/b400d066-a3bb-4b85-aaa1-7ddca808de2e-memberlist\") pod \"speaker-pbqrp\" (UID: \"b400d066-a3bb-4b85-aaa1-7ddca808de2e\") " pod="metallb-system/speaker-pbqrp" Oct 09 14:02:18 crc kubenswrapper[4902]: I1009 14:02:18.797960 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c788e9d2-cc9c-4dd8-b65d-f422358e0510-cert\") pod \"frr-k8s-webhook-server-64bf5d555-v4t7k\" (UID: \"c788e9d2-cc9c-4dd8-b65d-f422358e0510\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-v4t7k" Oct 09 14:02:18 crc kubenswrapper[4902]: I1009 14:02:18.797976 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tffpb\" (UniqueName: \"kubernetes.io/projected/c788e9d2-cc9c-4dd8-b65d-f422358e0510-kube-api-access-tffpb\") pod \"frr-k8s-webhook-server-64bf5d555-v4t7k\" (UID: \"c788e9d2-cc9c-4dd8-b65d-f422358e0510\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-v4t7k" Oct 09 14:02:18 crc kubenswrapper[4902]: I1009 14:02:18.798023 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bf00f4f5-2086-46a3-b460-f55dd00e2507-cert\") pod \"controller-68d546b9d8-m47st\" (UID: \"bf00f4f5-2086-46a3-b460-f55dd00e2507\") " pod="metallb-system/controller-68d546b9d8-m47st" Oct 09 14:02:18 crc kubenswrapper[4902]: I1009 14:02:18.798061 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bf00f4f5-2086-46a3-b460-f55dd00e2507-metrics-certs\") pod \"controller-68d546b9d8-m47st\" (UID: \"bf00f4f5-2086-46a3-b460-f55dd00e2507\") " pod="metallb-system/controller-68d546b9d8-m47st" Oct 09 14:02:18 crc kubenswrapper[4902]: I1009 14:02:18.798094 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/693846cb-0606-4818-b246-e6940fa26802-metrics-certs\") pod \"frr-k8s-sw9n6\" (UID: \"693846cb-0606-4818-b246-e6940fa26802\") " pod="metallb-system/frr-k8s-sw9n6" Oct 09 14:02:18 crc kubenswrapper[4902]: I1009 14:02:18.798111 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7jvlk\" (UniqueName: \"kubernetes.io/projected/b400d066-a3bb-4b85-aaa1-7ddca808de2e-kube-api-access-7jvlk\") pod \"speaker-pbqrp\" (UID: \"b400d066-a3bb-4b85-aaa1-7ddca808de2e\") " pod="metallb-system/speaker-pbqrp" Oct 09 14:02:18 crc kubenswrapper[4902]: I1009 14:02:18.798127 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/693846cb-0606-4818-b246-e6940fa26802-frr-sockets\") pod \"frr-k8s-sw9n6\" (UID: \"693846cb-0606-4818-b246-e6940fa26802\") " pod="metallb-system/frr-k8s-sw9n6" Oct 09 14:02:18 crc kubenswrapper[4902]: 
I1009 14:02:18.798144 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/693846cb-0606-4818-b246-e6940fa26802-metrics\") pod \"frr-k8s-sw9n6\" (UID: \"693846cb-0606-4818-b246-e6940fa26802\") " pod="metallb-system/frr-k8s-sw9n6" Oct 09 14:02:18 crc kubenswrapper[4902]: I1009 14:02:18.798195 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/b400d066-a3bb-4b85-aaa1-7ddca808de2e-metallb-excludel2\") pod \"speaker-pbqrp\" (UID: \"b400d066-a3bb-4b85-aaa1-7ddca808de2e\") " pod="metallb-system/speaker-pbqrp" Oct 09 14:02:18 crc kubenswrapper[4902]: I1009 14:02:18.798216 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b400d066-a3bb-4b85-aaa1-7ddca808de2e-metrics-certs\") pod \"speaker-pbqrp\" (UID: \"b400d066-a3bb-4b85-aaa1-7ddca808de2e\") " pod="metallb-system/speaker-pbqrp" Oct 09 14:02:18 crc kubenswrapper[4902]: I1009 14:02:18.799281 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/693846cb-0606-4818-b246-e6940fa26802-frr-startup\") pod \"frr-k8s-sw9n6\" (UID: \"693846cb-0606-4818-b246-e6940fa26802\") " pod="metallb-system/frr-k8s-sw9n6" Oct 09 14:02:18 crc kubenswrapper[4902]: I1009 14:02:18.800429 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/693846cb-0606-4818-b246-e6940fa26802-frr-conf\") pod \"frr-k8s-sw9n6\" (UID: \"693846cb-0606-4818-b246-e6940fa26802\") " pod="metallb-system/frr-k8s-sw9n6" Oct 09 14:02:18 crc kubenswrapper[4902]: I1009 14:02:18.800499 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/693846cb-0606-4818-b246-e6940fa26802-reloader\") pod \"frr-k8s-sw9n6\" (UID: \"693846cb-0606-4818-b246-e6940fa26802\") " pod="metallb-system/frr-k8s-sw9n6" Oct 09 14:02:18 crc kubenswrapper[4902]: I1009 14:02:18.800581 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/693846cb-0606-4818-b246-e6940fa26802-metrics\") pod \"frr-k8s-sw9n6\" (UID: \"693846cb-0606-4818-b246-e6940fa26802\") " pod="metallb-system/frr-k8s-sw9n6" Oct 09 14:02:18 crc kubenswrapper[4902]: I1009 14:02:18.800603 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/693846cb-0606-4818-b246-e6940fa26802-frr-sockets\") pod \"frr-k8s-sw9n6\" (UID: \"693846cb-0606-4818-b246-e6940fa26802\") " pod="metallb-system/frr-k8s-sw9n6" Oct 09 14:02:18 crc kubenswrapper[4902]: I1009 14:02:18.804120 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/693846cb-0606-4818-b246-e6940fa26802-metrics-certs\") pod \"frr-k8s-sw9n6\" (UID: \"693846cb-0606-4818-b246-e6940fa26802\") " pod="metallb-system/frr-k8s-sw9n6" Oct 09 14:02:18 crc kubenswrapper[4902]: I1009 14:02:18.807019 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c788e9d2-cc9c-4dd8-b65d-f422358e0510-cert\") pod \"frr-k8s-webhook-server-64bf5d555-v4t7k\" (UID: \"c788e9d2-cc9c-4dd8-b65d-f422358e0510\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-v4t7k" Oct 09 
14:02:18 crc kubenswrapper[4902]: I1009 14:02:18.817142 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tffpb\" (UniqueName: \"kubernetes.io/projected/c788e9d2-cc9c-4dd8-b65d-f422358e0510-kube-api-access-tffpb\") pod \"frr-k8s-webhook-server-64bf5d555-v4t7k\" (UID: \"c788e9d2-cc9c-4dd8-b65d-f422358e0510\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-v4t7k" Oct 09 14:02:18 crc kubenswrapper[4902]: I1009 14:02:18.821431 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ltpwx\" (UniqueName: \"kubernetes.io/projected/693846cb-0606-4818-b246-e6940fa26802-kube-api-access-ltpwx\") pod \"frr-k8s-sw9n6\" (UID: \"693846cb-0606-4818-b246-e6940fa26802\") " pod="metallb-system/frr-k8s-sw9n6" Oct 09 14:02:18 crc kubenswrapper[4902]: I1009 14:02:18.886075 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-sw9n6" Oct 09 14:02:18 crc kubenswrapper[4902]: I1009 14:02:18.899532 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bf00f4f5-2086-46a3-b460-f55dd00e2507-metrics-certs\") pod \"controller-68d546b9d8-m47st\" (UID: \"bf00f4f5-2086-46a3-b460-f55dd00e2507\") " pod="metallb-system/controller-68d546b9d8-m47st" Oct 09 14:02:18 crc kubenswrapper[4902]: I1009 14:02:18.899591 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7jvlk\" (UniqueName: \"kubernetes.io/projected/b400d066-a3bb-4b85-aaa1-7ddca808de2e-kube-api-access-7jvlk\") pod \"speaker-pbqrp\" (UID: \"b400d066-a3bb-4b85-aaa1-7ddca808de2e\") " pod="metallb-system/speaker-pbqrp" Oct 09 14:02:18 crc kubenswrapper[4902]: I1009 14:02:18.899657 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/b400d066-a3bb-4b85-aaa1-7ddca808de2e-metallb-excludel2\") pod \"speaker-pbqrp\" (UID: \"b400d066-a3bb-4b85-aaa1-7ddca808de2e\") " pod="metallb-system/speaker-pbqrp" Oct 09 14:02:18 crc kubenswrapper[4902]: I1009 14:02:18.899678 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b400d066-a3bb-4b85-aaa1-7ddca808de2e-metrics-certs\") pod \"speaker-pbqrp\" (UID: \"b400d066-a3bb-4b85-aaa1-7ddca808de2e\") " pod="metallb-system/speaker-pbqrp" Oct 09 14:02:18 crc kubenswrapper[4902]: I1009 14:02:18.899715 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v5l4s\" (UniqueName: \"kubernetes.io/projected/bf00f4f5-2086-46a3-b460-f55dd00e2507-kube-api-access-v5l4s\") pod \"controller-68d546b9d8-m47st\" (UID: \"bf00f4f5-2086-46a3-b460-f55dd00e2507\") " pod="metallb-system/controller-68d546b9d8-m47st" Oct 09 14:02:18 crc kubenswrapper[4902]: I1009 14:02:18.899756 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/b400d066-a3bb-4b85-aaa1-7ddca808de2e-memberlist\") pod \"speaker-pbqrp\" (UID: \"b400d066-a3bb-4b85-aaa1-7ddca808de2e\") " pod="metallb-system/speaker-pbqrp" Oct 09 14:02:18 crc kubenswrapper[4902]: I1009 14:02:18.899824 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bf00f4f5-2086-46a3-b460-f55dd00e2507-cert\") pod \"controller-68d546b9d8-m47st\" (UID: \"bf00f4f5-2086-46a3-b460-f55dd00e2507\") " 
pod="metallb-system/controller-68d546b9d8-m47st" Oct 09 14:02:18 crc kubenswrapper[4902]: E1009 14:02:18.902697 4902 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Oct 09 14:02:18 crc kubenswrapper[4902]: E1009 14:02:18.902774 4902 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b400d066-a3bb-4b85-aaa1-7ddca808de2e-memberlist podName:b400d066-a3bb-4b85-aaa1-7ddca808de2e nodeName:}" failed. No retries permitted until 2025-10-09 14:02:19.402745967 +0000 UTC m=+686.600605031 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/b400d066-a3bb-4b85-aaa1-7ddca808de2e-memberlist") pod "speaker-pbqrp" (UID: "b400d066-a3bb-4b85-aaa1-7ddca808de2e") : secret "metallb-memberlist" not found Oct 09 14:02:18 crc kubenswrapper[4902]: I1009 14:02:18.902769 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-v4t7k" Oct 09 14:02:18 crc kubenswrapper[4902]: I1009 14:02:18.903478 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/b400d066-a3bb-4b85-aaa1-7ddca808de2e-metallb-excludel2\") pod \"speaker-pbqrp\" (UID: \"b400d066-a3bb-4b85-aaa1-7ddca808de2e\") " pod="metallb-system/speaker-pbqrp" Oct 09 14:02:18 crc kubenswrapper[4902]: I1009 14:02:18.904847 4902 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Oct 09 14:02:18 crc kubenswrapper[4902]: I1009 14:02:18.905092 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b400d066-a3bb-4b85-aaa1-7ddca808de2e-metrics-certs\") pod \"speaker-pbqrp\" (UID: \"b400d066-a3bb-4b85-aaa1-7ddca808de2e\") " pod="metallb-system/speaker-pbqrp" Oct 09 14:02:18 crc kubenswrapper[4902]: I1009 14:02:18.906188 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bf00f4f5-2086-46a3-b460-f55dd00e2507-metrics-certs\") pod \"controller-68d546b9d8-m47st\" (UID: \"bf00f4f5-2086-46a3-b460-f55dd00e2507\") " pod="metallb-system/controller-68d546b9d8-m47st" Oct 09 14:02:18 crc kubenswrapper[4902]: I1009 14:02:18.914061 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bf00f4f5-2086-46a3-b460-f55dd00e2507-cert\") pod \"controller-68d546b9d8-m47st\" (UID: \"bf00f4f5-2086-46a3-b460-f55dd00e2507\") " pod="metallb-system/controller-68d546b9d8-m47st" Oct 09 14:02:18 crc kubenswrapper[4902]: I1009 14:02:18.922167 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7jvlk\" (UniqueName: \"kubernetes.io/projected/b400d066-a3bb-4b85-aaa1-7ddca808de2e-kube-api-access-7jvlk\") pod \"speaker-pbqrp\" (UID: \"b400d066-a3bb-4b85-aaa1-7ddca808de2e\") " pod="metallb-system/speaker-pbqrp" Oct 09 14:02:18 crc kubenswrapper[4902]: I1009 14:02:18.922807 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v5l4s\" (UniqueName: \"kubernetes.io/projected/bf00f4f5-2086-46a3-b460-f55dd00e2507-kube-api-access-v5l4s\") pod \"controller-68d546b9d8-m47st\" (UID: \"bf00f4f5-2086-46a3-b460-f55dd00e2507\") " pod="metallb-system/controller-68d546b9d8-m47st" Oct 09 14:02:19 crc kubenswrapper[4902]: I1009 14:02:19.016129 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-68d546b9d8-m47st" Oct 09 14:02:19 crc kubenswrapper[4902]: I1009 14:02:19.342821 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-64bf5d555-v4t7k"] Oct 09 14:02:19 crc kubenswrapper[4902]: W1009 14:02:19.346023 4902 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc788e9d2_cc9c_4dd8_b65d_f422358e0510.slice/crio-0a834e0f2b6abcf057afe5383d1d4de4908006e133152dcc2415bae8d97e0a79 WatchSource:0}: Error finding container 0a834e0f2b6abcf057afe5383d1d4de4908006e133152dcc2415bae8d97e0a79: Status 404 returned error can't find the container with id 0a834e0f2b6abcf057afe5383d1d4de4908006e133152dcc2415bae8d97e0a79 Oct 09 14:02:19 crc kubenswrapper[4902]: I1009 14:02:19.407197 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/b400d066-a3bb-4b85-aaa1-7ddca808de2e-memberlist\") pod \"speaker-pbqrp\" (UID: \"b400d066-a3bb-4b85-aaa1-7ddca808de2e\") " pod="metallb-system/speaker-pbqrp" Oct 09 14:02:19 crc kubenswrapper[4902]: E1009 14:02:19.407428 4902 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Oct 09 14:02:19 crc kubenswrapper[4902]: E1009 14:02:19.407506 4902 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b400d066-a3bb-4b85-aaa1-7ddca808de2e-memberlist podName:b400d066-a3bb-4b85-aaa1-7ddca808de2e nodeName:}" failed. No retries permitted until 2025-10-09 14:02:20.407485859 +0000 UTC m=+687.605344923 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/b400d066-a3bb-4b85-aaa1-7ddca808de2e-memberlist") pod "speaker-pbqrp" (UID: "b400d066-a3bb-4b85-aaa1-7ddca808de2e") : secret "metallb-memberlist" not found Oct 09 14:02:19 crc kubenswrapper[4902]: I1009 14:02:19.424377 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-68d546b9d8-m47st"] Oct 09 14:02:19 crc kubenswrapper[4902]: W1009 14:02:19.428063 4902 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbf00f4f5_2086_46a3_b460_f55dd00e2507.slice/crio-b5fcfc1186d1f2161595fbcaee85f08a99d5d5d428613c653bbe80379697c9b6 WatchSource:0}: Error finding container b5fcfc1186d1f2161595fbcaee85f08a99d5d5d428613c653bbe80379697c9b6: Status 404 returned error can't find the container with id b5fcfc1186d1f2161595fbcaee85f08a99d5d5d428613c653bbe80379697c9b6 Oct 09 14:02:19 crc kubenswrapper[4902]: I1009 14:02:19.756450 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-sw9n6" event={"ID":"693846cb-0606-4818-b246-e6940fa26802","Type":"ContainerStarted","Data":"7eb6ad29d70c89f79500929be2562f6ba3c2f7bf8bb1bd24847fa2169dae6bd2"} Oct 09 14:02:19 crc kubenswrapper[4902]: I1009 14:02:19.759438 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-v4t7k" event={"ID":"c788e9d2-cc9c-4dd8-b65d-f422358e0510","Type":"ContainerStarted","Data":"0a834e0f2b6abcf057afe5383d1d4de4908006e133152dcc2415bae8d97e0a79"} Oct 09 14:02:19 crc kubenswrapper[4902]: I1009 14:02:19.761034 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-68d546b9d8-m47st" 
event={"ID":"bf00f4f5-2086-46a3-b460-f55dd00e2507","Type":"ContainerStarted","Data":"6b246595274d3732065d84b059ebeb28cbcff0e9c783be78541f4add39ebf9c5"} Oct 09 14:02:19 crc kubenswrapper[4902]: I1009 14:02:19.761075 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-68d546b9d8-m47st" event={"ID":"bf00f4f5-2086-46a3-b460-f55dd00e2507","Type":"ContainerStarted","Data":"a28900ba3934d4aadeff809730feac1d1e91f2ea7660c9f2cc2ea71bcd4304c2"} Oct 09 14:02:19 crc kubenswrapper[4902]: I1009 14:02:19.761091 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-68d546b9d8-m47st" event={"ID":"bf00f4f5-2086-46a3-b460-f55dd00e2507","Type":"ContainerStarted","Data":"b5fcfc1186d1f2161595fbcaee85f08a99d5d5d428613c653bbe80379697c9b6"} Oct 09 14:02:20 crc kubenswrapper[4902]: I1009 14:02:20.078129 4902 patch_prober.go:28] interesting pod/machine-config-daemon-gbt7s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 14:02:20 crc kubenswrapper[4902]: I1009 14:02:20.078201 4902 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" podUID="6cfbac91-e798-4e5e-9f3c-f454ea6f457e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 14:02:20 crc kubenswrapper[4902]: I1009 14:02:20.421539 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/b400d066-a3bb-4b85-aaa1-7ddca808de2e-memberlist\") pod \"speaker-pbqrp\" (UID: \"b400d066-a3bb-4b85-aaa1-7ddca808de2e\") " pod="metallb-system/speaker-pbqrp" Oct 09 14:02:20 crc kubenswrapper[4902]: I1009 14:02:20.427930 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/b400d066-a3bb-4b85-aaa1-7ddca808de2e-memberlist\") pod \"speaker-pbqrp\" (UID: \"b400d066-a3bb-4b85-aaa1-7ddca808de2e\") " pod="metallb-system/speaker-pbqrp" Oct 09 14:02:20 crc kubenswrapper[4902]: I1009 14:02:20.502026 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-pbqrp" Oct 09 14:02:20 crc kubenswrapper[4902]: I1009 14:02:20.584838 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-xmzjk"] Oct 09 14:02:20 crc kubenswrapper[4902]: I1009 14:02:20.592814 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-xmzjk" Oct 09 14:02:20 crc kubenswrapper[4902]: I1009 14:02:20.598776 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xmzjk"] Oct 09 14:02:20 crc kubenswrapper[4902]: I1009 14:02:20.625701 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8cdd6486-b83b-4a53-8f0c-f0d8bb84006a-catalog-content\") pod \"community-operators-xmzjk\" (UID: \"8cdd6486-b83b-4a53-8f0c-f0d8bb84006a\") " pod="openshift-marketplace/community-operators-xmzjk" Oct 09 14:02:20 crc kubenswrapper[4902]: I1009 14:02:20.625779 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8cdd6486-b83b-4a53-8f0c-f0d8bb84006a-utilities\") pod \"community-operators-xmzjk\" (UID: \"8cdd6486-b83b-4a53-8f0c-f0d8bb84006a\") " pod="openshift-marketplace/community-operators-xmzjk" Oct 09 14:02:20 crc kubenswrapper[4902]: I1009 14:02:20.625838 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n8vjs\" (UniqueName: \"kubernetes.io/projected/8cdd6486-b83b-4a53-8f0c-f0d8bb84006a-kube-api-access-n8vjs\") pod \"community-operators-xmzjk\" (UID: \"8cdd6486-b83b-4a53-8f0c-f0d8bb84006a\") " pod="openshift-marketplace/community-operators-xmzjk" Oct 09 14:02:20 crc kubenswrapper[4902]: I1009 14:02:20.726997 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8cdd6486-b83b-4a53-8f0c-f0d8bb84006a-catalog-content\") pod \"community-operators-xmzjk\" (UID: \"8cdd6486-b83b-4a53-8f0c-f0d8bb84006a\") " pod="openshift-marketplace/community-operators-xmzjk" Oct 09 14:02:20 crc kubenswrapper[4902]: I1009 14:02:20.727069 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8cdd6486-b83b-4a53-8f0c-f0d8bb84006a-utilities\") pod \"community-operators-xmzjk\" (UID: \"8cdd6486-b83b-4a53-8f0c-f0d8bb84006a\") " pod="openshift-marketplace/community-operators-xmzjk" Oct 09 14:02:20 crc kubenswrapper[4902]: I1009 14:02:20.727102 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n8vjs\" (UniqueName: \"kubernetes.io/projected/8cdd6486-b83b-4a53-8f0c-f0d8bb84006a-kube-api-access-n8vjs\") pod \"community-operators-xmzjk\" (UID: \"8cdd6486-b83b-4a53-8f0c-f0d8bb84006a\") " pod="openshift-marketplace/community-operators-xmzjk" Oct 09 14:02:20 crc kubenswrapper[4902]: I1009 14:02:20.727951 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8cdd6486-b83b-4a53-8f0c-f0d8bb84006a-catalog-content\") pod \"community-operators-xmzjk\" (UID: \"8cdd6486-b83b-4a53-8f0c-f0d8bb84006a\") " pod="openshift-marketplace/community-operators-xmzjk" Oct 09 14:02:20 crc kubenswrapper[4902]: I1009 14:02:20.728192 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8cdd6486-b83b-4a53-8f0c-f0d8bb84006a-utilities\") pod \"community-operators-xmzjk\" (UID: \"8cdd6486-b83b-4a53-8f0c-f0d8bb84006a\") " pod="openshift-marketplace/community-operators-xmzjk" Oct 09 14:02:20 crc kubenswrapper[4902]: I1009 14:02:20.750273 4902 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-n8vjs\" (UniqueName: \"kubernetes.io/projected/8cdd6486-b83b-4a53-8f0c-f0d8bb84006a-kube-api-access-n8vjs\") pod \"community-operators-xmzjk\" (UID: \"8cdd6486-b83b-4a53-8f0c-f0d8bb84006a\") " pod="openshift-marketplace/community-operators-xmzjk" Oct 09 14:02:20 crc kubenswrapper[4902]: I1009 14:02:20.769178 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-pbqrp" event={"ID":"b400d066-a3bb-4b85-aaa1-7ddca808de2e","Type":"ContainerStarted","Data":"6c097db64015c1dfd043979b5a526d8f39051ecf0bdcce5278c545da20b2c0d3"} Oct 09 14:02:20 crc kubenswrapper[4902]: I1009 14:02:20.769688 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-68d546b9d8-m47st" Oct 09 14:02:20 crc kubenswrapper[4902]: I1009 14:02:20.795360 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-68d546b9d8-m47st" podStartSLOduration=2.7953370079999997 podStartE2EDuration="2.795337008s" podCreationTimestamp="2025-10-09 14:02:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 14:02:20.793470782 +0000 UTC m=+687.991329856" watchObservedRunningTime="2025-10-09 14:02:20.795337008 +0000 UTC m=+687.993196082" Oct 09 14:02:20 crc kubenswrapper[4902]: I1009 14:02:20.916187 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xmzjk" Oct 09 14:02:21 crc kubenswrapper[4902]: I1009 14:02:21.551929 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xmzjk"] Oct 09 14:02:21 crc kubenswrapper[4902]: W1009 14:02:21.560532 4902 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8cdd6486_b83b_4a53_8f0c_f0d8bb84006a.slice/crio-d93aa681cf4300c3d5fcc248a45307fe23de7302bb3dc1d25b34a0959defc540 WatchSource:0}: Error finding container d93aa681cf4300c3d5fcc248a45307fe23de7302bb3dc1d25b34a0959defc540: Status 404 returned error can't find the container with id d93aa681cf4300c3d5fcc248a45307fe23de7302bb3dc1d25b34a0959defc540 Oct 09 14:02:21 crc kubenswrapper[4902]: I1009 14:02:21.783533 4902 generic.go:334] "Generic (PLEG): container finished" podID="8cdd6486-b83b-4a53-8f0c-f0d8bb84006a" containerID="dcfbf18821aff4d4001d843ef37d478c56590a640bd02f2bf1228790e6c3cb59" exitCode=0 Oct 09 14:02:21 crc kubenswrapper[4902]: I1009 14:02:21.783608 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xmzjk" event={"ID":"8cdd6486-b83b-4a53-8f0c-f0d8bb84006a","Type":"ContainerDied","Data":"dcfbf18821aff4d4001d843ef37d478c56590a640bd02f2bf1228790e6c3cb59"} Oct 09 14:02:21 crc kubenswrapper[4902]: I1009 14:02:21.783639 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xmzjk" event={"ID":"8cdd6486-b83b-4a53-8f0c-f0d8bb84006a","Type":"ContainerStarted","Data":"d93aa681cf4300c3d5fcc248a45307fe23de7302bb3dc1d25b34a0959defc540"} Oct 09 14:02:21 crc kubenswrapper[4902]: I1009 14:02:21.793175 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-pbqrp" event={"ID":"b400d066-a3bb-4b85-aaa1-7ddca808de2e","Type":"ContainerStarted","Data":"4adde5f58b62f1a609a54d0118f617ab416655625ba778f2d2b16147adefacb1"} Oct 09 14:02:21 crc kubenswrapper[4902]: I1009 14:02:21.793240 4902 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-pbqrp" event={"ID":"b400d066-a3bb-4b85-aaa1-7ddca808de2e","Type":"ContainerStarted","Data":"925afd9c6e049814c80cf2c865e1260ee2673d9bda1aca282c67565046c31566"} Oct 09 14:02:21 crc kubenswrapper[4902]: I1009 14:02:21.793511 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-pbqrp" Oct 09 14:02:21 crc kubenswrapper[4902]: I1009 14:02:21.847964 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-pbqrp" podStartSLOduration=3.847943424 podStartE2EDuration="3.847943424s" podCreationTimestamp="2025-10-09 14:02:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 14:02:21.845564663 +0000 UTC m=+689.043423737" watchObservedRunningTime="2025-10-09 14:02:21.847943424 +0000 UTC m=+689.045802488" Oct 09 14:02:23 crc kubenswrapper[4902]: I1009 14:02:23.814137 4902 generic.go:334] "Generic (PLEG): container finished" podID="8cdd6486-b83b-4a53-8f0c-f0d8bb84006a" containerID="009cb96be328d6834acdde82f1ee55132258233d2266c22d3f315625cf1dd48d" exitCode=0 Oct 09 14:02:23 crc kubenswrapper[4902]: I1009 14:02:23.814248 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xmzjk" event={"ID":"8cdd6486-b83b-4a53-8f0c-f0d8bb84006a","Type":"ContainerDied","Data":"009cb96be328d6834acdde82f1ee55132258233d2266c22d3f315625cf1dd48d"} Oct 09 14:02:26 crc kubenswrapper[4902]: I1009 14:02:26.833657 4902 generic.go:334] "Generic (PLEG): container finished" podID="693846cb-0606-4818-b246-e6940fa26802" containerID="c98a43fb23a88bbbe52faeb2f618320d6cd4b18b69d8be323b70525e218f7549" exitCode=0 Oct 09 14:02:26 crc kubenswrapper[4902]: I1009 14:02:26.833729 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-sw9n6" event={"ID":"693846cb-0606-4818-b246-e6940fa26802","Type":"ContainerDied","Data":"c98a43fb23a88bbbe52faeb2f618320d6cd4b18b69d8be323b70525e218f7549"} Oct 09 14:02:26 crc kubenswrapper[4902]: I1009 14:02:26.848863 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xmzjk" event={"ID":"8cdd6486-b83b-4a53-8f0c-f0d8bb84006a","Type":"ContainerStarted","Data":"e6681768b336f0e7eecbf1868957991250daefbb535d47b4b879a40a56f8f39e"} Oct 09 14:02:26 crc kubenswrapper[4902]: I1009 14:02:26.854172 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-v4t7k" event={"ID":"c788e9d2-cc9c-4dd8-b65d-f422358e0510","Type":"ContainerStarted","Data":"fa9c69c7b4ab5689292f517a1e3149c02ef1e200d3ec2fad1cfd8230a80dcb80"} Oct 09 14:02:26 crc kubenswrapper[4902]: I1009 14:02:26.854827 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-v4t7k" Oct 09 14:02:26 crc kubenswrapper[4902]: I1009 14:02:26.914970 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-v4t7k" podStartSLOduration=1.697567776 podStartE2EDuration="8.914950523s" podCreationTimestamp="2025-10-09 14:02:18 +0000 UTC" firstStartedPulling="2025-10-09 14:02:19.348346003 +0000 UTC m=+686.546205067" lastFinishedPulling="2025-10-09 14:02:26.56572875 +0000 UTC m=+693.763587814" observedRunningTime="2025-10-09 14:02:26.910664955 +0000 UTC m=+694.108524019" watchObservedRunningTime="2025-10-09 14:02:26.914950523 
+0000 UTC m=+694.112809587" Oct 09 14:02:26 crc kubenswrapper[4902]: I1009 14:02:26.931518 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-xmzjk" podStartSLOduration=2.159196577 podStartE2EDuration="6.93149699s" podCreationTimestamp="2025-10-09 14:02:20 +0000 UTC" firstStartedPulling="2025-10-09 14:02:21.792771158 +0000 UTC m=+688.990630222" lastFinishedPulling="2025-10-09 14:02:26.565071531 +0000 UTC m=+693.762930635" observedRunningTime="2025-10-09 14:02:26.929315274 +0000 UTC m=+694.127174358" watchObservedRunningTime="2025-10-09 14:02:26.93149699 +0000 UTC m=+694.129356044" Oct 09 14:02:27 crc kubenswrapper[4902]: I1009 14:02:27.861546 4902 generic.go:334] "Generic (PLEG): container finished" podID="693846cb-0606-4818-b246-e6940fa26802" containerID="1d82ed8d84c576f1955363dcb6b87fa4d1138f8e34bf696fb39aee7a9d402d13" exitCode=0 Oct 09 14:02:27 crc kubenswrapper[4902]: I1009 14:02:27.861740 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-sw9n6" event={"ID":"693846cb-0606-4818-b246-e6940fa26802","Type":"ContainerDied","Data":"1d82ed8d84c576f1955363dcb6b87fa4d1138f8e34bf696fb39aee7a9d402d13"} Oct 09 14:02:28 crc kubenswrapper[4902]: I1009 14:02:28.869034 4902 generic.go:334] "Generic (PLEG): container finished" podID="693846cb-0606-4818-b246-e6940fa26802" containerID="9529e9430630dc6d419ecf24c48ba68cb52695f7d63fd4bdbd43c3e7b624b2b8" exitCode=0 Oct 09 14:02:28 crc kubenswrapper[4902]: I1009 14:02:28.869127 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-sw9n6" event={"ID":"693846cb-0606-4818-b246-e6940fa26802","Type":"ContainerDied","Data":"9529e9430630dc6d419ecf24c48ba68cb52695f7d63fd4bdbd43c3e7b624b2b8"} Oct 09 14:02:29 crc kubenswrapper[4902]: I1009 14:02:29.020318 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-68d546b9d8-m47st" Oct 09 14:02:29 crc kubenswrapper[4902]: I1009 14:02:29.880784 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-sw9n6" event={"ID":"693846cb-0606-4818-b246-e6940fa26802","Type":"ContainerStarted","Data":"0a802356b7b75e809eff09c74c713905bc0c719d9b6373ae86918948b0a58ccf"} Oct 09 14:02:29 crc kubenswrapper[4902]: I1009 14:02:29.880828 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-sw9n6" event={"ID":"693846cb-0606-4818-b246-e6940fa26802","Type":"ContainerStarted","Data":"6d058ba99e62bf16f0baa906e949c2413d4cd59b1c96ac044cf30d03cea029da"} Oct 09 14:02:29 crc kubenswrapper[4902]: I1009 14:02:29.880837 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-sw9n6" event={"ID":"693846cb-0606-4818-b246-e6940fa26802","Type":"ContainerStarted","Data":"e2ca94ceeb7985c8c133db7fc404cf6774c7a0fecd4e2b6a6df02763a49bb5cc"} Oct 09 14:02:29 crc kubenswrapper[4902]: I1009 14:02:29.880846 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-sw9n6" event={"ID":"693846cb-0606-4818-b246-e6940fa26802","Type":"ContainerStarted","Data":"2bc2129aa96bff0b2aa583ad881a4835d46789ae371f1d039b75af513f48ff92"} Oct 09 14:02:29 crc kubenswrapper[4902]: I1009 14:02:29.880854 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-sw9n6" event={"ID":"693846cb-0606-4818-b246-e6940fa26802","Type":"ContainerStarted","Data":"ff384f9dd0089f4bdb2fd49babeb7b57c674764181371bc7904adc8258a1dab8"} Oct 09 14:02:30 crc kubenswrapper[4902]: I1009 14:02:30.505388 4902 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-pbqrp" Oct 09 14:02:30 crc kubenswrapper[4902]: I1009 14:02:30.891266 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-sw9n6" event={"ID":"693846cb-0606-4818-b246-e6940fa26802","Type":"ContainerStarted","Data":"6855ea9d05b8cf954661d185d5cf6e06cd4b2d204f0e6883f206b8154df998ae"} Oct 09 14:02:30 crc kubenswrapper[4902]: I1009 14:02:30.891530 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-sw9n6" Oct 09 14:02:30 crc kubenswrapper[4902]: I1009 14:02:30.914799 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-sw9n6" podStartSLOduration=5.427783648 podStartE2EDuration="12.914778378s" podCreationTimestamp="2025-10-09 14:02:18 +0000 UTC" firstStartedPulling="2025-10-09 14:02:19.107605087 +0000 UTC m=+686.305464151" lastFinishedPulling="2025-10-09 14:02:26.594599817 +0000 UTC m=+693.792458881" observedRunningTime="2025-10-09 14:02:30.912581752 +0000 UTC m=+698.110440846" watchObservedRunningTime="2025-10-09 14:02:30.914778378 +0000 UTC m=+698.112637442" Oct 09 14:02:30 crc kubenswrapper[4902]: I1009 14:02:30.922224 4902 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-xmzjk" Oct 09 14:02:30 crc kubenswrapper[4902]: I1009 14:02:30.922286 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-xmzjk" Oct 09 14:02:30 crc kubenswrapper[4902]: I1009 14:02:30.993399 4902 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-xmzjk" Oct 09 14:02:31 crc kubenswrapper[4902]: I1009 14:02:31.932909 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-xmzjk" Oct 09 14:02:31 crc kubenswrapper[4902]: I1009 14:02:31.973596 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xmzjk"] Oct 09 14:02:33 crc kubenswrapper[4902]: I1009 14:02:33.614459 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-m2jjv"] Oct 09 14:02:33 crc kubenswrapper[4902]: I1009 14:02:33.615789 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-m2jjv" Oct 09 14:02:33 crc kubenswrapper[4902]: I1009 14:02:33.619018 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Oct 09 14:02:33 crc kubenswrapper[4902]: I1009 14:02:33.620187 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Oct 09 14:02:33 crc kubenswrapper[4902]: I1009 14:02:33.620249 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-98gvm" Oct 09 14:02:33 crc kubenswrapper[4902]: I1009 14:02:33.623979 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-m2jjv"] Oct 09 14:02:33 crc kubenswrapper[4902]: I1009 14:02:33.721601 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-thhrz\" (UniqueName: \"kubernetes.io/projected/50b235c2-7b58-4561-8b02-89e0bc4dbda9-kube-api-access-thhrz\") pod \"openstack-operator-index-m2jjv\" (UID: \"50b235c2-7b58-4561-8b02-89e0bc4dbda9\") " pod="openstack-operators/openstack-operator-index-m2jjv" Oct 09 14:02:33 crc kubenswrapper[4902]: I1009 14:02:33.822509 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-thhrz\" (UniqueName: \"kubernetes.io/projected/50b235c2-7b58-4561-8b02-89e0bc4dbda9-kube-api-access-thhrz\") pod \"openstack-operator-index-m2jjv\" (UID: \"50b235c2-7b58-4561-8b02-89e0bc4dbda9\") " pod="openstack-operators/openstack-operator-index-m2jjv" Oct 09 14:02:33 crc kubenswrapper[4902]: I1009 14:02:33.849210 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-thhrz\" (UniqueName: \"kubernetes.io/projected/50b235c2-7b58-4561-8b02-89e0bc4dbda9-kube-api-access-thhrz\") pod \"openstack-operator-index-m2jjv\" (UID: \"50b235c2-7b58-4561-8b02-89e0bc4dbda9\") " pod="openstack-operators/openstack-operator-index-m2jjv" Oct 09 14:02:33 crc kubenswrapper[4902]: I1009 14:02:33.886792 4902 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-sw9n6" Oct 09 14:02:33 crc kubenswrapper[4902]: I1009 14:02:33.906877 4902 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-xmzjk" podUID="8cdd6486-b83b-4a53-8f0c-f0d8bb84006a" containerName="registry-server" containerID="cri-o://e6681768b336f0e7eecbf1868957991250daefbb535d47b4b879a40a56f8f39e" gracePeriod=2 Oct 09 14:02:33 crc kubenswrapper[4902]: I1009 14:02:33.925317 4902 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-sw9n6" Oct 09 14:02:33 crc kubenswrapper[4902]: I1009 14:02:33.962230 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-m2jjv" Oct 09 14:02:34 crc kubenswrapper[4902]: I1009 14:02:34.394214 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-m2jjv"] Oct 09 14:02:34 crc kubenswrapper[4902]: I1009 14:02:34.926432 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-m2jjv" event={"ID":"50b235c2-7b58-4561-8b02-89e0bc4dbda9","Type":"ContainerStarted","Data":"5319dc4ea5743c804ac84dc9695505517a73ab19ace52dc250ce8dbd730c744e"} Oct 09 14:02:34 crc kubenswrapper[4902]: I1009 14:02:34.932311 4902 generic.go:334] "Generic (PLEG): container finished" podID="8cdd6486-b83b-4a53-8f0c-f0d8bb84006a" containerID="e6681768b336f0e7eecbf1868957991250daefbb535d47b4b879a40a56f8f39e" exitCode=0 Oct 09 14:02:34 crc kubenswrapper[4902]: I1009 14:02:34.932352 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xmzjk" event={"ID":"8cdd6486-b83b-4a53-8f0c-f0d8bb84006a","Type":"ContainerDied","Data":"e6681768b336f0e7eecbf1868957991250daefbb535d47b4b879a40a56f8f39e"} Oct 09 14:02:35 crc kubenswrapper[4902]: I1009 14:02:35.313553 4902 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xmzjk" Oct 09 14:02:35 crc kubenswrapper[4902]: I1009 14:02:35.448162 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n8vjs\" (UniqueName: \"kubernetes.io/projected/8cdd6486-b83b-4a53-8f0c-f0d8bb84006a-kube-api-access-n8vjs\") pod \"8cdd6486-b83b-4a53-8f0c-f0d8bb84006a\" (UID: \"8cdd6486-b83b-4a53-8f0c-f0d8bb84006a\") " Oct 09 14:02:35 crc kubenswrapper[4902]: I1009 14:02:35.448256 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8cdd6486-b83b-4a53-8f0c-f0d8bb84006a-utilities\") pod \"8cdd6486-b83b-4a53-8f0c-f0d8bb84006a\" (UID: \"8cdd6486-b83b-4a53-8f0c-f0d8bb84006a\") " Oct 09 14:02:35 crc kubenswrapper[4902]: I1009 14:02:35.448280 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8cdd6486-b83b-4a53-8f0c-f0d8bb84006a-catalog-content\") pod \"8cdd6486-b83b-4a53-8f0c-f0d8bb84006a\" (UID: \"8cdd6486-b83b-4a53-8f0c-f0d8bb84006a\") " Oct 09 14:02:35 crc kubenswrapper[4902]: I1009 14:02:35.449765 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8cdd6486-b83b-4a53-8f0c-f0d8bb84006a-utilities" (OuterVolumeSpecName: "utilities") pod "8cdd6486-b83b-4a53-8f0c-f0d8bb84006a" (UID: "8cdd6486-b83b-4a53-8f0c-f0d8bb84006a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 14:02:35 crc kubenswrapper[4902]: I1009 14:02:35.458719 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cdd6486-b83b-4a53-8f0c-f0d8bb84006a-kube-api-access-n8vjs" (OuterVolumeSpecName: "kube-api-access-n8vjs") pod "8cdd6486-b83b-4a53-8f0c-f0d8bb84006a" (UID: "8cdd6486-b83b-4a53-8f0c-f0d8bb84006a"). InnerVolumeSpecName "kube-api-access-n8vjs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 14:02:35 crc kubenswrapper[4902]: I1009 14:02:35.501090 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8cdd6486-b83b-4a53-8f0c-f0d8bb84006a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8cdd6486-b83b-4a53-8f0c-f0d8bb84006a" (UID: "8cdd6486-b83b-4a53-8f0c-f0d8bb84006a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 14:02:35 crc kubenswrapper[4902]: I1009 14:02:35.550539 4902 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8cdd6486-b83b-4a53-8f0c-f0d8bb84006a-utilities\") on node \"crc\" DevicePath \"\"" Oct 09 14:02:35 crc kubenswrapper[4902]: I1009 14:02:35.550583 4902 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8cdd6486-b83b-4a53-8f0c-f0d8bb84006a-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 09 14:02:35 crc kubenswrapper[4902]: I1009 14:02:35.550600 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n8vjs\" (UniqueName: \"kubernetes.io/projected/8cdd6486-b83b-4a53-8f0c-f0d8bb84006a-kube-api-access-n8vjs\") on node \"crc\" DevicePath \"\"" Oct 09 14:02:35 crc kubenswrapper[4902]: I1009 14:02:35.940878 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xmzjk" event={"ID":"8cdd6486-b83b-4a53-8f0c-f0d8bb84006a","Type":"ContainerDied","Data":"d93aa681cf4300c3d5fcc248a45307fe23de7302bb3dc1d25b34a0959defc540"} Oct 09 14:02:35 crc kubenswrapper[4902]: I1009 14:02:35.941178 4902 scope.go:117] "RemoveContainer" containerID="e6681768b336f0e7eecbf1868957991250daefbb535d47b4b879a40a56f8f39e" Oct 09 14:02:35 crc kubenswrapper[4902]: I1009 14:02:35.941472 4902 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-xmzjk" Oct 09 14:02:35 crc kubenswrapper[4902]: I1009 14:02:35.961965 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xmzjk"] Oct 09 14:02:35 crc kubenswrapper[4902]: I1009 14:02:35.966549 4902 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-xmzjk"] Oct 09 14:02:36 crc kubenswrapper[4902]: I1009 14:02:36.070562 4902 scope.go:117] "RemoveContainer" containerID="009cb96be328d6834acdde82f1ee55132258233d2266c22d3f315625cf1dd48d" Oct 09 14:02:36 crc kubenswrapper[4902]: I1009 14:02:36.428976 4902 scope.go:117] "RemoveContainer" containerID="dcfbf18821aff4d4001d843ef37d478c56590a640bd02f2bf1228790e6c3cb59" Oct 09 14:02:37 crc kubenswrapper[4902]: I1009 14:02:37.524357 4902 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cdd6486-b83b-4a53-8f0c-f0d8bb84006a" path="/var/lib/kubelet/pods/8cdd6486-b83b-4a53-8f0c-f0d8bb84006a/volumes" Oct 09 14:02:37 crc kubenswrapper[4902]: I1009 14:02:37.954378 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-m2jjv" event={"ID":"50b235c2-7b58-4561-8b02-89e0bc4dbda9","Type":"ContainerStarted","Data":"7558fa57e997f11ee508f2475ef5f2a6100edf6464b51539014467b36bf1f296"} Oct 09 14:02:37 crc kubenswrapper[4902]: I1009 14:02:37.973366 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-m2jjv" podStartSLOduration=1.830774146 podStartE2EDuration="4.973349318s" podCreationTimestamp="2025-10-09 14:02:33 +0000 UTC" firstStartedPulling="2025-10-09 14:02:34.404645795 +0000 UTC m=+701.602504859" lastFinishedPulling="2025-10-09 14:02:37.547220967 +0000 UTC m=+704.745080031" observedRunningTime="2025-10-09 14:02:37.969625346 +0000 UTC m=+705.167484430" watchObservedRunningTime="2025-10-09 14:02:37.973349318 +0000 UTC m=+705.171208382" Oct 09 14:02:38 crc kubenswrapper[4902]: I1009 14:02:38.036626 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-m2jjv"] Oct 09 14:02:38 crc kubenswrapper[4902]: I1009 14:02:38.839528 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-dv7t7"] Oct 09 14:02:38 crc kubenswrapper[4902]: E1009 14:02:38.839829 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8cdd6486-b83b-4a53-8f0c-f0d8bb84006a" containerName="extract-content" Oct 09 14:02:38 crc kubenswrapper[4902]: I1009 14:02:38.839846 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="8cdd6486-b83b-4a53-8f0c-f0d8bb84006a" containerName="extract-content" Oct 09 14:02:38 crc kubenswrapper[4902]: E1009 14:02:38.839863 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8cdd6486-b83b-4a53-8f0c-f0d8bb84006a" containerName="registry-server" Oct 09 14:02:38 crc kubenswrapper[4902]: I1009 14:02:38.839872 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="8cdd6486-b83b-4a53-8f0c-f0d8bb84006a" containerName="registry-server" Oct 09 14:02:38 crc kubenswrapper[4902]: E1009 14:02:38.839887 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8cdd6486-b83b-4a53-8f0c-f0d8bb84006a" containerName="extract-utilities" Oct 09 14:02:38 crc kubenswrapper[4902]: I1009 14:02:38.839947 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="8cdd6486-b83b-4a53-8f0c-f0d8bb84006a" containerName="extract-utilities" Oct 09 14:02:38 
crc kubenswrapper[4902]: I1009 14:02:38.840079 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="8cdd6486-b83b-4a53-8f0c-f0d8bb84006a" containerName="registry-server" Oct 09 14:02:38 crc kubenswrapper[4902]: I1009 14:02:38.841950 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-dv7t7" Oct 09 14:02:38 crc kubenswrapper[4902]: I1009 14:02:38.846656 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-dv7t7"] Oct 09 14:02:38 crc kubenswrapper[4902]: I1009 14:02:38.906837 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-v4t7k" Oct 09 14:02:39 crc kubenswrapper[4902]: I1009 14:02:39.003609 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4dvc\" (UniqueName: \"kubernetes.io/projected/468c32be-1138-4600-bcd2-85aa8b02ec69-kube-api-access-k4dvc\") pod \"openstack-operator-index-dv7t7\" (UID: \"468c32be-1138-4600-bcd2-85aa8b02ec69\") " pod="openstack-operators/openstack-operator-index-dv7t7" Oct 09 14:02:39 crc kubenswrapper[4902]: I1009 14:02:39.105743 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k4dvc\" (UniqueName: \"kubernetes.io/projected/468c32be-1138-4600-bcd2-85aa8b02ec69-kube-api-access-k4dvc\") pod \"openstack-operator-index-dv7t7\" (UID: \"468c32be-1138-4600-bcd2-85aa8b02ec69\") " pod="openstack-operators/openstack-operator-index-dv7t7" Oct 09 14:02:39 crc kubenswrapper[4902]: I1009 14:02:39.133722 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k4dvc\" (UniqueName: \"kubernetes.io/projected/468c32be-1138-4600-bcd2-85aa8b02ec69-kube-api-access-k4dvc\") pod \"openstack-operator-index-dv7t7\" (UID: \"468c32be-1138-4600-bcd2-85aa8b02ec69\") " pod="openstack-operators/openstack-operator-index-dv7t7" Oct 09 14:02:39 crc kubenswrapper[4902]: I1009 14:02:39.160216 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-dv7t7" Oct 09 14:02:39 crc kubenswrapper[4902]: I1009 14:02:39.570228 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-dv7t7"] Oct 09 14:02:39 crc kubenswrapper[4902]: W1009 14:02:39.574697 4902 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod468c32be_1138_4600_bcd2_85aa8b02ec69.slice/crio-56483296f271bfbdb2b0a00f3b0118dc78924ed133ae5358cb7018ae4fc51d9f WatchSource:0}: Error finding container 56483296f271bfbdb2b0a00f3b0118dc78924ed133ae5358cb7018ae4fc51d9f: Status 404 returned error can't find the container with id 56483296f271bfbdb2b0a00f3b0118dc78924ed133ae5358cb7018ae4fc51d9f Oct 09 14:02:39 crc kubenswrapper[4902]: I1009 14:02:39.968630 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-dv7t7" event={"ID":"468c32be-1138-4600-bcd2-85aa8b02ec69","Type":"ContainerStarted","Data":"3691707b429f1d76af7eeb58582497b776b7eee3a742c0c29d69d23cba816a60"} Oct 09 14:02:39 crc kubenswrapper[4902]: I1009 14:02:39.968938 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-dv7t7" event={"ID":"468c32be-1138-4600-bcd2-85aa8b02ec69","Type":"ContainerStarted","Data":"56483296f271bfbdb2b0a00f3b0118dc78924ed133ae5358cb7018ae4fc51d9f"} Oct 09 14:02:39 crc kubenswrapper[4902]: I1009 14:02:39.968769 4902 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-m2jjv" podUID="50b235c2-7b58-4561-8b02-89e0bc4dbda9" containerName="registry-server" containerID="cri-o://7558fa57e997f11ee508f2475ef5f2a6100edf6464b51539014467b36bf1f296" gracePeriod=2 Oct 09 14:02:39 crc kubenswrapper[4902]: I1009 14:02:39.995278 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-dv7t7" podStartSLOduration=1.931777224 podStartE2EDuration="1.9952545s" podCreationTimestamp="2025-10-09 14:02:38 +0000 UTC" firstStartedPulling="2025-10-09 14:02:39.579470869 +0000 UTC m=+706.777329953" lastFinishedPulling="2025-10-09 14:02:39.642948155 +0000 UTC m=+706.840807229" observedRunningTime="2025-10-09 14:02:39.990743105 +0000 UTC m=+707.188602199" watchObservedRunningTime="2025-10-09 14:02:39.9952545 +0000 UTC m=+707.193113564" Oct 09 14:02:40 crc kubenswrapper[4902]: I1009 14:02:40.339134 4902 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-m2jjv" Oct 09 14:02:40 crc kubenswrapper[4902]: I1009 14:02:40.421947 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-thhrz\" (UniqueName: \"kubernetes.io/projected/50b235c2-7b58-4561-8b02-89e0bc4dbda9-kube-api-access-thhrz\") pod \"50b235c2-7b58-4561-8b02-89e0bc4dbda9\" (UID: \"50b235c2-7b58-4561-8b02-89e0bc4dbda9\") " Oct 09 14:02:40 crc kubenswrapper[4902]: I1009 14:02:40.440650 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/50b235c2-7b58-4561-8b02-89e0bc4dbda9-kube-api-access-thhrz" (OuterVolumeSpecName: "kube-api-access-thhrz") pod "50b235c2-7b58-4561-8b02-89e0bc4dbda9" (UID: "50b235c2-7b58-4561-8b02-89e0bc4dbda9"). InnerVolumeSpecName "kube-api-access-thhrz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 14:02:40 crc kubenswrapper[4902]: I1009 14:02:40.523696 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-thhrz\" (UniqueName: \"kubernetes.io/projected/50b235c2-7b58-4561-8b02-89e0bc4dbda9-kube-api-access-thhrz\") on node \"crc\" DevicePath \"\"" Oct 09 14:02:40 crc kubenswrapper[4902]: I1009 14:02:40.976665 4902 generic.go:334] "Generic (PLEG): container finished" podID="50b235c2-7b58-4561-8b02-89e0bc4dbda9" containerID="7558fa57e997f11ee508f2475ef5f2a6100edf6464b51539014467b36bf1f296" exitCode=0 Oct 09 14:02:40 crc kubenswrapper[4902]: I1009 14:02:40.977350 4902 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-m2jjv" Oct 09 14:02:40 crc kubenswrapper[4902]: I1009 14:02:40.977552 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-m2jjv" event={"ID":"50b235c2-7b58-4561-8b02-89e0bc4dbda9","Type":"ContainerDied","Data":"7558fa57e997f11ee508f2475ef5f2a6100edf6464b51539014467b36bf1f296"} Oct 09 14:02:40 crc kubenswrapper[4902]: I1009 14:02:40.977590 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-m2jjv" event={"ID":"50b235c2-7b58-4561-8b02-89e0bc4dbda9","Type":"ContainerDied","Data":"5319dc4ea5743c804ac84dc9695505517a73ab19ace52dc250ce8dbd730c744e"} Oct 09 14:02:40 crc kubenswrapper[4902]: I1009 14:02:40.977609 4902 scope.go:117] "RemoveContainer" containerID="7558fa57e997f11ee508f2475ef5f2a6100edf6464b51539014467b36bf1f296" Oct 09 14:02:40 crc kubenswrapper[4902]: I1009 14:02:40.999023 4902 scope.go:117] "RemoveContainer" containerID="7558fa57e997f11ee508f2475ef5f2a6100edf6464b51539014467b36bf1f296" Oct 09 14:02:41 crc kubenswrapper[4902]: E1009 14:02:41.000131 4902 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7558fa57e997f11ee508f2475ef5f2a6100edf6464b51539014467b36bf1f296\": container with ID starting with 7558fa57e997f11ee508f2475ef5f2a6100edf6464b51539014467b36bf1f296 not found: ID does not exist" containerID="7558fa57e997f11ee508f2475ef5f2a6100edf6464b51539014467b36bf1f296" Oct 09 14:02:41 crc kubenswrapper[4902]: I1009 14:02:41.000231 4902 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7558fa57e997f11ee508f2475ef5f2a6100edf6464b51539014467b36bf1f296"} err="failed to get container status \"7558fa57e997f11ee508f2475ef5f2a6100edf6464b51539014467b36bf1f296\": rpc error: code = NotFound desc = could not find container \"7558fa57e997f11ee508f2475ef5f2a6100edf6464b51539014467b36bf1f296\": container with ID starting with 7558fa57e997f11ee508f2475ef5f2a6100edf6464b51539014467b36bf1f296 not found: ID does not exist" Oct 09 14:02:41 crc kubenswrapper[4902]: I1009 14:02:41.015759 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-m2jjv"] Oct 09 14:02:41 crc kubenswrapper[4902]: I1009 14:02:41.025228 4902 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-m2jjv"] Oct 09 14:02:41 crc kubenswrapper[4902]: I1009 14:02:41.530151 4902 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="50b235c2-7b58-4561-8b02-89e0bc4dbda9" path="/var/lib/kubelet/pods/50b235c2-7b58-4561-8b02-89e0bc4dbda9/volumes" Oct 09 14:02:45 crc kubenswrapper[4902]: I1009 14:02:45.241740 4902 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-h2d56"] Oct 09 14:02:45 crc kubenswrapper[4902]: E1009 14:02:45.242343 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50b235c2-7b58-4561-8b02-89e0bc4dbda9" containerName="registry-server" Oct 09 14:02:45 crc kubenswrapper[4902]: I1009 14:02:45.242361 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="50b235c2-7b58-4561-8b02-89e0bc4dbda9" containerName="registry-server" Oct 09 14:02:45 crc kubenswrapper[4902]: I1009 14:02:45.242522 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="50b235c2-7b58-4561-8b02-89e0bc4dbda9" containerName="registry-server" Oct 09 14:02:45 crc kubenswrapper[4902]: I1009 14:02:45.243567 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-h2d56" Oct 09 14:02:45 crc kubenswrapper[4902]: I1009 14:02:45.253829 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-h2d56"] Oct 09 14:02:45 crc kubenswrapper[4902]: I1009 14:02:45.396442 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2161e0ca-6032-434f-8b57-e8d9d66399ed-catalog-content\") pod \"certified-operators-h2d56\" (UID: \"2161e0ca-6032-434f-8b57-e8d9d66399ed\") " pod="openshift-marketplace/certified-operators-h2d56" Oct 09 14:02:45 crc kubenswrapper[4902]: I1009 14:02:45.396578 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5vn5c\" (UniqueName: \"kubernetes.io/projected/2161e0ca-6032-434f-8b57-e8d9d66399ed-kube-api-access-5vn5c\") pod \"certified-operators-h2d56\" (UID: \"2161e0ca-6032-434f-8b57-e8d9d66399ed\") " pod="openshift-marketplace/certified-operators-h2d56" Oct 09 14:02:45 crc kubenswrapper[4902]: I1009 14:02:45.396645 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2161e0ca-6032-434f-8b57-e8d9d66399ed-utilities\") pod \"certified-operators-h2d56\" (UID: \"2161e0ca-6032-434f-8b57-e8d9d66399ed\") " pod="openshift-marketplace/certified-operators-h2d56" Oct 09 14:02:45 crc kubenswrapper[4902]: I1009 14:02:45.498349 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5vn5c\" (UniqueName: \"kubernetes.io/projected/2161e0ca-6032-434f-8b57-e8d9d66399ed-kube-api-access-5vn5c\") pod \"certified-operators-h2d56\" (UID: \"2161e0ca-6032-434f-8b57-e8d9d66399ed\") " pod="openshift-marketplace/certified-operators-h2d56" Oct 09 14:02:45 crc kubenswrapper[4902]: I1009 14:02:45.498463 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2161e0ca-6032-434f-8b57-e8d9d66399ed-utilities\") pod \"certified-operators-h2d56\" (UID: \"2161e0ca-6032-434f-8b57-e8d9d66399ed\") " pod="openshift-marketplace/certified-operators-h2d56" Oct 09 14:02:45 crc kubenswrapper[4902]: I1009 14:02:45.498563 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2161e0ca-6032-434f-8b57-e8d9d66399ed-catalog-content\") pod \"certified-operators-h2d56\" (UID: \"2161e0ca-6032-434f-8b57-e8d9d66399ed\") " pod="openshift-marketplace/certified-operators-h2d56" Oct 09 14:02:45 crc kubenswrapper[4902]: I1009 14:02:45.499374 
4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2161e0ca-6032-434f-8b57-e8d9d66399ed-catalog-content\") pod \"certified-operators-h2d56\" (UID: \"2161e0ca-6032-434f-8b57-e8d9d66399ed\") " pod="openshift-marketplace/certified-operators-h2d56" Oct 09 14:02:45 crc kubenswrapper[4902]: I1009 14:02:45.499424 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2161e0ca-6032-434f-8b57-e8d9d66399ed-utilities\") pod \"certified-operators-h2d56\" (UID: \"2161e0ca-6032-434f-8b57-e8d9d66399ed\") " pod="openshift-marketplace/certified-operators-h2d56" Oct 09 14:02:45 crc kubenswrapper[4902]: I1009 14:02:45.533361 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5vn5c\" (UniqueName: \"kubernetes.io/projected/2161e0ca-6032-434f-8b57-e8d9d66399ed-kube-api-access-5vn5c\") pod \"certified-operators-h2d56\" (UID: \"2161e0ca-6032-434f-8b57-e8d9d66399ed\") " pod="openshift-marketplace/certified-operators-h2d56" Oct 09 14:02:45 crc kubenswrapper[4902]: I1009 14:02:45.574133 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-h2d56" Oct 09 14:02:46 crc kubenswrapper[4902]: I1009 14:02:46.119610 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-h2d56"] Oct 09 14:02:47 crc kubenswrapper[4902]: I1009 14:02:47.017564 4902 generic.go:334] "Generic (PLEG): container finished" podID="2161e0ca-6032-434f-8b57-e8d9d66399ed" containerID="ed589e7efe024dc35c26c8f70bfef1129cababe48463a6fbfda56625ffbe02bd" exitCode=0 Oct 09 14:02:47 crc kubenswrapper[4902]: I1009 14:02:47.017677 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h2d56" event={"ID":"2161e0ca-6032-434f-8b57-e8d9d66399ed","Type":"ContainerDied","Data":"ed589e7efe024dc35c26c8f70bfef1129cababe48463a6fbfda56625ffbe02bd"} Oct 09 14:02:47 crc kubenswrapper[4902]: I1009 14:02:47.017914 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h2d56" event={"ID":"2161e0ca-6032-434f-8b57-e8d9d66399ed","Type":"ContainerStarted","Data":"ddbfaf489c3754e30936a0e28a1f7ecab5e531ad8f7ba8eafc0135c5e9fcbc5d"} Oct 09 14:02:48 crc kubenswrapper[4902]: I1009 14:02:48.888545 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-sw9n6" Oct 09 14:02:49 crc kubenswrapper[4902]: I1009 14:02:49.160786 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-dv7t7" Oct 09 14:02:49 crc kubenswrapper[4902]: I1009 14:02:49.161577 4902 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-dv7t7" Oct 09 14:02:49 crc kubenswrapper[4902]: I1009 14:02:49.189779 4902 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-dv7t7" Oct 09 14:02:50 crc kubenswrapper[4902]: I1009 14:02:50.038593 4902 generic.go:334] "Generic (PLEG): container finished" podID="2161e0ca-6032-434f-8b57-e8d9d66399ed" containerID="5ea129aefdad88e913bdf23e813e866aaf2b964bdb202a6cf7d28e3f6d618a57" exitCode=0 Oct 09 14:02:50 crc kubenswrapper[4902]: I1009 14:02:50.038697 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h2d56" 
event={"ID":"2161e0ca-6032-434f-8b57-e8d9d66399ed","Type":"ContainerDied","Data":"5ea129aefdad88e913bdf23e813e866aaf2b964bdb202a6cf7d28e3f6d618a57"} Oct 09 14:02:50 crc kubenswrapper[4902]: I1009 14:02:50.063133 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-dv7t7" Oct 09 14:02:50 crc kubenswrapper[4902]: I1009 14:02:50.078163 4902 patch_prober.go:28] interesting pod/machine-config-daemon-gbt7s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 14:02:50 crc kubenswrapper[4902]: I1009 14:02:50.078215 4902 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" podUID="6cfbac91-e798-4e5e-9f3c-f454ea6f457e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 14:02:50 crc kubenswrapper[4902]: I1009 14:02:50.244982 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-bpwgs"] Oct 09 14:02:50 crc kubenswrapper[4902]: I1009 14:02:50.248091 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bpwgs" Oct 09 14:02:50 crc kubenswrapper[4902]: I1009 14:02:50.254190 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bpwgs"] Oct 09 14:02:50 crc kubenswrapper[4902]: I1009 14:02:50.375335 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/377ef545-a42b-4815-a387-ce427c4b0d44-utilities\") pod \"redhat-operators-bpwgs\" (UID: \"377ef545-a42b-4815-a387-ce427c4b0d44\") " pod="openshift-marketplace/redhat-operators-bpwgs" Oct 09 14:02:50 crc kubenswrapper[4902]: I1009 14:02:50.375762 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/377ef545-a42b-4815-a387-ce427c4b0d44-catalog-content\") pod \"redhat-operators-bpwgs\" (UID: \"377ef545-a42b-4815-a387-ce427c4b0d44\") " pod="openshift-marketplace/redhat-operators-bpwgs" Oct 09 14:02:50 crc kubenswrapper[4902]: I1009 14:02:50.375860 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qjm9m\" (UniqueName: \"kubernetes.io/projected/377ef545-a42b-4815-a387-ce427c4b0d44-kube-api-access-qjm9m\") pod \"redhat-operators-bpwgs\" (UID: \"377ef545-a42b-4815-a387-ce427c4b0d44\") " pod="openshift-marketplace/redhat-operators-bpwgs" Oct 09 14:02:50 crc kubenswrapper[4902]: I1009 14:02:50.477369 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/377ef545-a42b-4815-a387-ce427c4b0d44-utilities\") pod \"redhat-operators-bpwgs\" (UID: \"377ef545-a42b-4815-a387-ce427c4b0d44\") " pod="openshift-marketplace/redhat-operators-bpwgs" Oct 09 14:02:50 crc kubenswrapper[4902]: I1009 14:02:50.477448 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/377ef545-a42b-4815-a387-ce427c4b0d44-catalog-content\") pod \"redhat-operators-bpwgs\" (UID: 
\"377ef545-a42b-4815-a387-ce427c4b0d44\") " pod="openshift-marketplace/redhat-operators-bpwgs" Oct 09 14:02:50 crc kubenswrapper[4902]: I1009 14:02:50.477525 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qjm9m\" (UniqueName: \"kubernetes.io/projected/377ef545-a42b-4815-a387-ce427c4b0d44-kube-api-access-qjm9m\") pod \"redhat-operators-bpwgs\" (UID: \"377ef545-a42b-4815-a387-ce427c4b0d44\") " pod="openshift-marketplace/redhat-operators-bpwgs" Oct 09 14:02:50 crc kubenswrapper[4902]: I1009 14:02:50.477984 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/377ef545-a42b-4815-a387-ce427c4b0d44-utilities\") pod \"redhat-operators-bpwgs\" (UID: \"377ef545-a42b-4815-a387-ce427c4b0d44\") " pod="openshift-marketplace/redhat-operators-bpwgs" Oct 09 14:02:50 crc kubenswrapper[4902]: I1009 14:02:50.478037 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/377ef545-a42b-4815-a387-ce427c4b0d44-catalog-content\") pod \"redhat-operators-bpwgs\" (UID: \"377ef545-a42b-4815-a387-ce427c4b0d44\") " pod="openshift-marketplace/redhat-operators-bpwgs" Oct 09 14:02:50 crc kubenswrapper[4902]: I1009 14:02:50.498786 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qjm9m\" (UniqueName: \"kubernetes.io/projected/377ef545-a42b-4815-a387-ce427c4b0d44-kube-api-access-qjm9m\") pod \"redhat-operators-bpwgs\" (UID: \"377ef545-a42b-4815-a387-ce427c4b0d44\") " pod="openshift-marketplace/redhat-operators-bpwgs" Oct 09 14:02:50 crc kubenswrapper[4902]: I1009 14:02:50.569964 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bpwgs" Oct 09 14:02:50 crc kubenswrapper[4902]: W1009 14:02:50.984587 4902 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod377ef545_a42b_4815_a387_ce427c4b0d44.slice/crio-0f43e7e71b24e26d668790981d9033498872d431c40b44322a255b63399705e2 WatchSource:0}: Error finding container 0f43e7e71b24e26d668790981d9033498872d431c40b44322a255b63399705e2: Status 404 returned error can't find the container with id 0f43e7e71b24e26d668790981d9033498872d431c40b44322a255b63399705e2 Oct 09 14:02:50 crc kubenswrapper[4902]: I1009 14:02:50.984845 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bpwgs"] Oct 09 14:02:51 crc kubenswrapper[4902]: I1009 14:02:51.050067 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h2d56" event={"ID":"2161e0ca-6032-434f-8b57-e8d9d66399ed","Type":"ContainerStarted","Data":"a152eecae4f2765f3cc3aa78af3a918b4da67d6878522dbe301fc250a8e3bdb7"} Oct 09 14:02:51 crc kubenswrapper[4902]: I1009 14:02:51.051885 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bpwgs" event={"ID":"377ef545-a42b-4815-a387-ce427c4b0d44","Type":"ContainerStarted","Data":"0f43e7e71b24e26d668790981d9033498872d431c40b44322a255b63399705e2"} Oct 09 14:02:51 crc kubenswrapper[4902]: I1009 14:02:51.070580 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-h2d56" podStartSLOduration=2.326898631 podStartE2EDuration="6.0705575s" podCreationTimestamp="2025-10-09 14:02:45 +0000 UTC" firstStartedPulling="2025-10-09 14:02:47.019520839 
+0000 UTC m=+714.217379923" lastFinishedPulling="2025-10-09 14:02:50.763179728 +0000 UTC m=+717.961038792" observedRunningTime="2025-10-09 14:02:51.068884611 +0000 UTC m=+718.266743695" watchObservedRunningTime="2025-10-09 14:02:51.0705575 +0000 UTC m=+718.268416574" Oct 09 14:02:52 crc kubenswrapper[4902]: I1009 14:02:52.058743 4902 generic.go:334] "Generic (PLEG): container finished" podID="377ef545-a42b-4815-a387-ce427c4b0d44" containerID="63f6e8be57a18274e9993aa2c5329d0dfb707c16f2e68dc6fa3fde6174a88950" exitCode=0 Oct 09 14:02:52 crc kubenswrapper[4902]: I1009 14:02:52.058804 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bpwgs" event={"ID":"377ef545-a42b-4815-a387-ce427c4b0d44","Type":"ContainerDied","Data":"63f6e8be57a18274e9993aa2c5329d0dfb707c16f2e68dc6fa3fde6174a88950"} Oct 09 14:02:53 crc kubenswrapper[4902]: I1009 14:02:53.068121 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bpwgs" event={"ID":"377ef545-a42b-4815-a387-ce427c4b0d44","Type":"ContainerStarted","Data":"b78fd39a500c8b1ce300c38661cfc437f7dedd8f096f5efbb555bb7c5d11c956"} Oct 09 14:02:54 crc kubenswrapper[4902]: I1009 14:02:54.081038 4902 generic.go:334] "Generic (PLEG): container finished" podID="377ef545-a42b-4815-a387-ce427c4b0d44" containerID="b78fd39a500c8b1ce300c38661cfc437f7dedd8f096f5efbb555bb7c5d11c956" exitCode=0 Oct 09 14:02:54 crc kubenswrapper[4902]: I1009 14:02:54.081647 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bpwgs" event={"ID":"377ef545-a42b-4815-a387-ce427c4b0d44","Type":"ContainerDied","Data":"b78fd39a500c8b1ce300c38661cfc437f7dedd8f096f5efbb555bb7c5d11c956"} Oct 09 14:02:54 crc kubenswrapper[4902]: I1009 14:02:54.440015 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-szjlw"] Oct 09 14:02:54 crc kubenswrapper[4902]: I1009 14:02:54.441423 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-szjlw" Oct 09 14:02:54 crc kubenswrapper[4902]: I1009 14:02:54.456338 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-szjlw"] Oct 09 14:02:54 crc kubenswrapper[4902]: I1009 14:02:54.534229 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5dfc96e2-757c-4c2c-aa02-ba62993dc45d-utilities\") pod \"redhat-marketplace-szjlw\" (UID: \"5dfc96e2-757c-4c2c-aa02-ba62993dc45d\") " pod="openshift-marketplace/redhat-marketplace-szjlw" Oct 09 14:02:54 crc kubenswrapper[4902]: I1009 14:02:54.534311 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h99qr\" (UniqueName: \"kubernetes.io/projected/5dfc96e2-757c-4c2c-aa02-ba62993dc45d-kube-api-access-h99qr\") pod \"redhat-marketplace-szjlw\" (UID: \"5dfc96e2-757c-4c2c-aa02-ba62993dc45d\") " pod="openshift-marketplace/redhat-marketplace-szjlw" Oct 09 14:02:54 crc kubenswrapper[4902]: I1009 14:02:54.534344 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5dfc96e2-757c-4c2c-aa02-ba62993dc45d-catalog-content\") pod \"redhat-marketplace-szjlw\" (UID: \"5dfc96e2-757c-4c2c-aa02-ba62993dc45d\") " pod="openshift-marketplace/redhat-marketplace-szjlw" Oct 09 14:02:54 crc kubenswrapper[4902]: I1009 14:02:54.643133 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5dfc96e2-757c-4c2c-aa02-ba62993dc45d-utilities\") pod \"redhat-marketplace-szjlw\" (UID: \"5dfc96e2-757c-4c2c-aa02-ba62993dc45d\") " pod="openshift-marketplace/redhat-marketplace-szjlw" Oct 09 14:02:54 crc kubenswrapper[4902]: I1009 14:02:54.643230 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5dfc96e2-757c-4c2c-aa02-ba62993dc45d-catalog-content\") pod \"redhat-marketplace-szjlw\" (UID: \"5dfc96e2-757c-4c2c-aa02-ba62993dc45d\") " pod="openshift-marketplace/redhat-marketplace-szjlw" Oct 09 14:02:54 crc kubenswrapper[4902]: I1009 14:02:54.643249 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h99qr\" (UniqueName: \"kubernetes.io/projected/5dfc96e2-757c-4c2c-aa02-ba62993dc45d-kube-api-access-h99qr\") pod \"redhat-marketplace-szjlw\" (UID: \"5dfc96e2-757c-4c2c-aa02-ba62993dc45d\") " pod="openshift-marketplace/redhat-marketplace-szjlw" Oct 09 14:02:54 crc kubenswrapper[4902]: I1009 14:02:54.643754 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5dfc96e2-757c-4c2c-aa02-ba62993dc45d-utilities\") pod \"redhat-marketplace-szjlw\" (UID: \"5dfc96e2-757c-4c2c-aa02-ba62993dc45d\") " pod="openshift-marketplace/redhat-marketplace-szjlw" Oct 09 14:02:54 crc kubenswrapper[4902]: I1009 14:02:54.643815 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5dfc96e2-757c-4c2c-aa02-ba62993dc45d-catalog-content\") pod \"redhat-marketplace-szjlw\" (UID: \"5dfc96e2-757c-4c2c-aa02-ba62993dc45d\") " pod="openshift-marketplace/redhat-marketplace-szjlw" Oct 09 14:02:54 crc kubenswrapper[4902]: I1009 14:02:54.665518 4902 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-h99qr\" (UniqueName: \"kubernetes.io/projected/5dfc96e2-757c-4c2c-aa02-ba62993dc45d-kube-api-access-h99qr\") pod \"redhat-marketplace-szjlw\" (UID: \"5dfc96e2-757c-4c2c-aa02-ba62993dc45d\") " pod="openshift-marketplace/redhat-marketplace-szjlw" Oct 09 14:02:54 crc kubenswrapper[4902]: I1009 14:02:54.759153 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-szjlw" Oct 09 14:02:55 crc kubenswrapper[4902]: I1009 14:02:55.107355 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bpwgs" event={"ID":"377ef545-a42b-4815-a387-ce427c4b0d44","Type":"ContainerStarted","Data":"86c30a3fa27e1d50a86668715e53b5f3d9291a5b1e4d6f23ca340dadf0a99c66"} Oct 09 14:02:55 crc kubenswrapper[4902]: I1009 14:02:55.133797 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-bpwgs" podStartSLOduration=2.582899012 podStartE2EDuration="5.133781344s" podCreationTimestamp="2025-10-09 14:02:50 +0000 UTC" firstStartedPulling="2025-10-09 14:02:52.060838626 +0000 UTC m=+719.258697690" lastFinishedPulling="2025-10-09 14:02:54.611720958 +0000 UTC m=+721.809580022" observedRunningTime="2025-10-09 14:02:55.131165508 +0000 UTC m=+722.329024582" watchObservedRunningTime="2025-10-09 14:02:55.133781344 +0000 UTC m=+722.331640408" Oct 09 14:02:55 crc kubenswrapper[4902]: I1009 14:02:55.210012 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-szjlw"] Oct 09 14:02:55 crc kubenswrapper[4902]: W1009 14:02:55.220713 4902 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5dfc96e2_757c_4c2c_aa02_ba62993dc45d.slice/crio-daf5f17b08752d648d8451d78a24ef9929d6502fac12f8d8db8130d472c92dcc WatchSource:0}: Error finding container daf5f17b08752d648d8451d78a24ef9929d6502fac12f8d8db8130d472c92dcc: Status 404 returned error can't find the container with id daf5f17b08752d648d8451d78a24ef9929d6502fac12f8d8db8130d472c92dcc Oct 09 14:02:55 crc kubenswrapper[4902]: I1009 14:02:55.574726 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-h2d56" Oct 09 14:02:55 crc kubenswrapper[4902]: I1009 14:02:55.575891 4902 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-h2d56" Oct 09 14:02:55 crc kubenswrapper[4902]: I1009 14:02:55.631417 4902 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-h2d56" Oct 09 14:02:56 crc kubenswrapper[4902]: I1009 14:02:56.115354 4902 generic.go:334] "Generic (PLEG): container finished" podID="5dfc96e2-757c-4c2c-aa02-ba62993dc45d" containerID="1579a6a75ca88d0063c3c53b18344ac6cc459e8840beaa584f0eda718007b646" exitCode=0 Oct 09 14:02:56 crc kubenswrapper[4902]: I1009 14:02:56.116365 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-szjlw" event={"ID":"5dfc96e2-757c-4c2c-aa02-ba62993dc45d","Type":"ContainerDied","Data":"1579a6a75ca88d0063c3c53b18344ac6cc459e8840beaa584f0eda718007b646"} Oct 09 14:02:56 crc kubenswrapper[4902]: I1009 14:02:56.116389 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-szjlw" 
event={"ID":"5dfc96e2-757c-4c2c-aa02-ba62993dc45d","Type":"ContainerStarted","Data":"daf5f17b08752d648d8451d78a24ef9929d6502fac12f8d8db8130d472c92dcc"} Oct 09 14:02:56 crc kubenswrapper[4902]: I1009 14:02:56.163119 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-h2d56" Oct 09 14:02:56 crc kubenswrapper[4902]: I1009 14:02:56.839954 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-h2d56"] Oct 09 14:02:58 crc kubenswrapper[4902]: I1009 14:02:58.136088 4902 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-h2d56" podUID="2161e0ca-6032-434f-8b57-e8d9d66399ed" containerName="registry-server" containerID="cri-o://a152eecae4f2765f3cc3aa78af3a918b4da67d6878522dbe301fc250a8e3bdb7" gracePeriod=2 Oct 09 14:02:59 crc kubenswrapper[4902]: I1009 14:02:59.144742 4902 generic.go:334] "Generic (PLEG): container finished" podID="2161e0ca-6032-434f-8b57-e8d9d66399ed" containerID="a152eecae4f2765f3cc3aa78af3a918b4da67d6878522dbe301fc250a8e3bdb7" exitCode=0 Oct 09 14:02:59 crc kubenswrapper[4902]: I1009 14:02:59.144825 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h2d56" event={"ID":"2161e0ca-6032-434f-8b57-e8d9d66399ed","Type":"ContainerDied","Data":"a152eecae4f2765f3cc3aa78af3a918b4da67d6878522dbe301fc250a8e3bdb7"} Oct 09 14:02:59 crc kubenswrapper[4902]: I1009 14:02:59.148157 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-szjlw" event={"ID":"5dfc96e2-757c-4c2c-aa02-ba62993dc45d","Type":"ContainerStarted","Data":"018b9931e07ae9b4c0a3321c8bbfefcae99e535469242897f03f4b8719242f33"} Oct 09 14:02:59 crc kubenswrapper[4902]: I1009 14:02:59.618663 4902 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-h2d56" Oct 09 14:02:59 crc kubenswrapper[4902]: I1009 14:02:59.719744 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2161e0ca-6032-434f-8b57-e8d9d66399ed-catalog-content\") pod \"2161e0ca-6032-434f-8b57-e8d9d66399ed\" (UID: \"2161e0ca-6032-434f-8b57-e8d9d66399ed\") " Oct 09 14:02:59 crc kubenswrapper[4902]: I1009 14:02:59.719959 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5vn5c\" (UniqueName: \"kubernetes.io/projected/2161e0ca-6032-434f-8b57-e8d9d66399ed-kube-api-access-5vn5c\") pod \"2161e0ca-6032-434f-8b57-e8d9d66399ed\" (UID: \"2161e0ca-6032-434f-8b57-e8d9d66399ed\") " Oct 09 14:02:59 crc kubenswrapper[4902]: I1009 14:02:59.720053 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2161e0ca-6032-434f-8b57-e8d9d66399ed-utilities\") pod \"2161e0ca-6032-434f-8b57-e8d9d66399ed\" (UID: \"2161e0ca-6032-434f-8b57-e8d9d66399ed\") " Oct 09 14:02:59 crc kubenswrapper[4902]: I1009 14:02:59.721069 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2161e0ca-6032-434f-8b57-e8d9d66399ed-utilities" (OuterVolumeSpecName: "utilities") pod "2161e0ca-6032-434f-8b57-e8d9d66399ed" (UID: "2161e0ca-6032-434f-8b57-e8d9d66399ed"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 14:02:59 crc kubenswrapper[4902]: I1009 14:02:59.725951 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2161e0ca-6032-434f-8b57-e8d9d66399ed-kube-api-access-5vn5c" (OuterVolumeSpecName: "kube-api-access-5vn5c") pod "2161e0ca-6032-434f-8b57-e8d9d66399ed" (UID: "2161e0ca-6032-434f-8b57-e8d9d66399ed"). InnerVolumeSpecName "kube-api-access-5vn5c". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 14:02:59 crc kubenswrapper[4902]: I1009 14:02:59.821589 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5vn5c\" (UniqueName: \"kubernetes.io/projected/2161e0ca-6032-434f-8b57-e8d9d66399ed-kube-api-access-5vn5c\") on node \"crc\" DevicePath \"\"" Oct 09 14:02:59 crc kubenswrapper[4902]: I1009 14:02:59.821633 4902 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2161e0ca-6032-434f-8b57-e8d9d66399ed-utilities\") on node \"crc\" DevicePath \"\"" Oct 09 14:03:00 crc kubenswrapper[4902]: I1009 14:03:00.068008 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2161e0ca-6032-434f-8b57-e8d9d66399ed-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2161e0ca-6032-434f-8b57-e8d9d66399ed" (UID: "2161e0ca-6032-434f-8b57-e8d9d66399ed"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 14:03:00 crc kubenswrapper[4902]: I1009 14:03:00.126756 4902 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2161e0ca-6032-434f-8b57-e8d9d66399ed-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 09 14:03:00 crc kubenswrapper[4902]: I1009 14:03:00.155615 4902 generic.go:334] "Generic (PLEG): container finished" podID="5dfc96e2-757c-4c2c-aa02-ba62993dc45d" containerID="018b9931e07ae9b4c0a3321c8bbfefcae99e535469242897f03f4b8719242f33" exitCode=0 Oct 09 14:03:00 crc kubenswrapper[4902]: I1009 14:03:00.155697 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-szjlw" event={"ID":"5dfc96e2-757c-4c2c-aa02-ba62993dc45d","Type":"ContainerDied","Data":"018b9931e07ae9b4c0a3321c8bbfefcae99e535469242897f03f4b8719242f33"} Oct 09 14:03:00 crc kubenswrapper[4902]: I1009 14:03:00.158190 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h2d56" event={"ID":"2161e0ca-6032-434f-8b57-e8d9d66399ed","Type":"ContainerDied","Data":"ddbfaf489c3754e30936a0e28a1f7ecab5e531ad8f7ba8eafc0135c5e9fcbc5d"} Oct 09 14:03:00 crc kubenswrapper[4902]: I1009 14:03:00.158239 4902 scope.go:117] "RemoveContainer" containerID="a152eecae4f2765f3cc3aa78af3a918b4da67d6878522dbe301fc250a8e3bdb7" Oct 09 14:03:00 crc kubenswrapper[4902]: I1009 14:03:00.158368 4902 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-h2d56" Oct 09 14:03:00 crc kubenswrapper[4902]: I1009 14:03:00.188522 4902 scope.go:117] "RemoveContainer" containerID="5ea129aefdad88e913bdf23e813e866aaf2b964bdb202a6cf7d28e3f6d618a57" Oct 09 14:03:00 crc kubenswrapper[4902]: I1009 14:03:00.204703 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-h2d56"] Oct 09 14:03:00 crc kubenswrapper[4902]: I1009 14:03:00.208078 4902 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-h2d56"] Oct 09 14:03:00 crc kubenswrapper[4902]: I1009 14:03:00.210343 4902 scope.go:117] "RemoveContainer" containerID="ed589e7efe024dc35c26c8f70bfef1129cababe48463a6fbfda56625ffbe02bd" Oct 09 14:03:00 crc kubenswrapper[4902]: I1009 14:03:00.570965 4902 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-bpwgs" Oct 09 14:03:00 crc kubenswrapper[4902]: I1009 14:03:00.571041 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-bpwgs" Oct 09 14:03:00 crc kubenswrapper[4902]: I1009 14:03:00.612641 4902 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-bpwgs" Oct 09 14:03:01 crc kubenswrapper[4902]: I1009 14:03:01.213888 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-bpwgs" Oct 09 14:03:01 crc kubenswrapper[4902]: I1009 14:03:01.520864 4902 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2161e0ca-6032-434f-8b57-e8d9d66399ed" path="/var/lib/kubelet/pods/2161e0ca-6032-434f-8b57-e8d9d66399ed/volumes" Oct 09 14:03:02 crc kubenswrapper[4902]: I1009 14:03:02.177355 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-szjlw" event={"ID":"5dfc96e2-757c-4c2c-aa02-ba62993dc45d","Type":"ContainerStarted","Data":"e31fc402005302af6d4a314fa166bd298b2fe629f7a2f5062bf24ed468f7e17f"} Oct 09 14:03:02 crc kubenswrapper[4902]: I1009 14:03:02.194899 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-szjlw" podStartSLOduration=3.473884036 podStartE2EDuration="8.194872573s" podCreationTimestamp="2025-10-09 14:02:54 +0000 UTC" firstStartedPulling="2025-10-09 14:02:57.122903616 +0000 UTC m=+724.320762680" lastFinishedPulling="2025-10-09 14:03:01.843892153 +0000 UTC m=+729.041751217" observedRunningTime="2025-10-09 14:03:02.194312157 +0000 UTC m=+729.392171221" watchObservedRunningTime="2025-10-09 14:03:02.194872573 +0000 UTC m=+729.392731637" Oct 09 14:03:03 crc kubenswrapper[4902]: I1009 14:03:03.689594 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/f3be4bfbef7f7c1d153ec042185ac5a6497a28f6f13f17f0b43e1db33258rfq"] Oct 09 14:03:03 crc kubenswrapper[4902]: E1009 14:03:03.690277 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2161e0ca-6032-434f-8b57-e8d9d66399ed" containerName="extract-utilities" Oct 09 14:03:03 crc kubenswrapper[4902]: I1009 14:03:03.690291 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="2161e0ca-6032-434f-8b57-e8d9d66399ed" containerName="extract-utilities" Oct 09 14:03:03 crc kubenswrapper[4902]: E1009 14:03:03.690305 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2161e0ca-6032-434f-8b57-e8d9d66399ed" containerName="extract-content" Oct 09 
14:03:03 crc kubenswrapper[4902]: I1009 14:03:03.690311 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="2161e0ca-6032-434f-8b57-e8d9d66399ed" containerName="extract-content" Oct 09 14:03:03 crc kubenswrapper[4902]: E1009 14:03:03.690334 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2161e0ca-6032-434f-8b57-e8d9d66399ed" containerName="registry-server" Oct 09 14:03:03 crc kubenswrapper[4902]: I1009 14:03:03.690340 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="2161e0ca-6032-434f-8b57-e8d9d66399ed" containerName="registry-server" Oct 09 14:03:03 crc kubenswrapper[4902]: I1009 14:03:03.690501 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="2161e0ca-6032-434f-8b57-e8d9d66399ed" containerName="registry-server" Oct 09 14:03:03 crc kubenswrapper[4902]: I1009 14:03:03.691442 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/f3be4bfbef7f7c1d153ec042185ac5a6497a28f6f13f17f0b43e1db33258rfq" Oct 09 14:03:03 crc kubenswrapper[4902]: I1009 14:03:03.694204 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-x5lsd" Oct 09 14:03:03 crc kubenswrapper[4902]: I1009 14:03:03.703648 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/f3be4bfbef7f7c1d153ec042185ac5a6497a28f6f13f17f0b43e1db33258rfq"] Oct 09 14:03:03 crc kubenswrapper[4902]: I1009 14:03:03.785286 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4b9e579a-5f0a-4d30-8792-fd71075c1479-util\") pod \"f3be4bfbef7f7c1d153ec042185ac5a6497a28f6f13f17f0b43e1db33258rfq\" (UID: \"4b9e579a-5f0a-4d30-8792-fd71075c1479\") " pod="openstack-operators/f3be4bfbef7f7c1d153ec042185ac5a6497a28f6f13f17f0b43e1db33258rfq" Oct 09 14:03:03 crc kubenswrapper[4902]: I1009 14:03:03.785365 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4b9e579a-5f0a-4d30-8792-fd71075c1479-bundle\") pod \"f3be4bfbef7f7c1d153ec042185ac5a6497a28f6f13f17f0b43e1db33258rfq\" (UID: \"4b9e579a-5f0a-4d30-8792-fd71075c1479\") " pod="openstack-operators/f3be4bfbef7f7c1d153ec042185ac5a6497a28f6f13f17f0b43e1db33258rfq" Oct 09 14:03:03 crc kubenswrapper[4902]: I1009 14:03:03.785398 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jvz6t\" (UniqueName: \"kubernetes.io/projected/4b9e579a-5f0a-4d30-8792-fd71075c1479-kube-api-access-jvz6t\") pod \"f3be4bfbef7f7c1d153ec042185ac5a6497a28f6f13f17f0b43e1db33258rfq\" (UID: \"4b9e579a-5f0a-4d30-8792-fd71075c1479\") " pod="openstack-operators/f3be4bfbef7f7c1d153ec042185ac5a6497a28f6f13f17f0b43e1db33258rfq" Oct 09 14:03:03 crc kubenswrapper[4902]: I1009 14:03:03.886904 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4b9e579a-5f0a-4d30-8792-fd71075c1479-util\") pod \"f3be4bfbef7f7c1d153ec042185ac5a6497a28f6f13f17f0b43e1db33258rfq\" (UID: \"4b9e579a-5f0a-4d30-8792-fd71075c1479\") " pod="openstack-operators/f3be4bfbef7f7c1d153ec042185ac5a6497a28f6f13f17f0b43e1db33258rfq" Oct 09 14:03:03 crc kubenswrapper[4902]: I1009 14:03:03.886996 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4b9e579a-5f0a-4d30-8792-fd71075c1479-bundle\") pod 
\"f3be4bfbef7f7c1d153ec042185ac5a6497a28f6f13f17f0b43e1db33258rfq\" (UID: \"4b9e579a-5f0a-4d30-8792-fd71075c1479\") " pod="openstack-operators/f3be4bfbef7f7c1d153ec042185ac5a6497a28f6f13f17f0b43e1db33258rfq" Oct 09 14:03:03 crc kubenswrapper[4902]: I1009 14:03:03.887033 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jvz6t\" (UniqueName: \"kubernetes.io/projected/4b9e579a-5f0a-4d30-8792-fd71075c1479-kube-api-access-jvz6t\") pod \"f3be4bfbef7f7c1d153ec042185ac5a6497a28f6f13f17f0b43e1db33258rfq\" (UID: \"4b9e579a-5f0a-4d30-8792-fd71075c1479\") " pod="openstack-operators/f3be4bfbef7f7c1d153ec042185ac5a6497a28f6f13f17f0b43e1db33258rfq" Oct 09 14:03:03 crc kubenswrapper[4902]: I1009 14:03:03.887597 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4b9e579a-5f0a-4d30-8792-fd71075c1479-util\") pod \"f3be4bfbef7f7c1d153ec042185ac5a6497a28f6f13f17f0b43e1db33258rfq\" (UID: \"4b9e579a-5f0a-4d30-8792-fd71075c1479\") " pod="openstack-operators/f3be4bfbef7f7c1d153ec042185ac5a6497a28f6f13f17f0b43e1db33258rfq" Oct 09 14:03:03 crc kubenswrapper[4902]: I1009 14:03:03.889270 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4b9e579a-5f0a-4d30-8792-fd71075c1479-bundle\") pod \"f3be4bfbef7f7c1d153ec042185ac5a6497a28f6f13f17f0b43e1db33258rfq\" (UID: \"4b9e579a-5f0a-4d30-8792-fd71075c1479\") " pod="openstack-operators/f3be4bfbef7f7c1d153ec042185ac5a6497a28f6f13f17f0b43e1db33258rfq" Oct 09 14:03:03 crc kubenswrapper[4902]: I1009 14:03:03.907858 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jvz6t\" (UniqueName: \"kubernetes.io/projected/4b9e579a-5f0a-4d30-8792-fd71075c1479-kube-api-access-jvz6t\") pod \"f3be4bfbef7f7c1d153ec042185ac5a6497a28f6f13f17f0b43e1db33258rfq\" (UID: \"4b9e579a-5f0a-4d30-8792-fd71075c1479\") " pod="openstack-operators/f3be4bfbef7f7c1d153ec042185ac5a6497a28f6f13f17f0b43e1db33258rfq" Oct 09 14:03:04 crc kubenswrapper[4902]: I1009 14:03:04.009075 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/f3be4bfbef7f7c1d153ec042185ac5a6497a28f6f13f17f0b43e1db33258rfq" Oct 09 14:03:04 crc kubenswrapper[4902]: I1009 14:03:04.318742 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/f3be4bfbef7f7c1d153ec042185ac5a6497a28f6f13f17f0b43e1db33258rfq"] Oct 09 14:03:04 crc kubenswrapper[4902]: W1009 14:03:04.322559 4902 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4b9e579a_5f0a_4d30_8792_fd71075c1479.slice/crio-aded701305c48c0f8b3088ff5f408db2f25770114e5b7921f362720618514f9a WatchSource:0}: Error finding container aded701305c48c0f8b3088ff5f408db2f25770114e5b7921f362720618514f9a: Status 404 returned error can't find the container with id aded701305c48c0f8b3088ff5f408db2f25770114e5b7921f362720618514f9a Oct 09 14:03:04 crc kubenswrapper[4902]: I1009 14:03:04.760437 4902 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-szjlw" Oct 09 14:03:04 crc kubenswrapper[4902]: I1009 14:03:04.760512 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-szjlw" Oct 09 14:03:04 crc kubenswrapper[4902]: I1009 14:03:04.807404 4902 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-szjlw" Oct 09 14:03:05 crc kubenswrapper[4902]: I1009 14:03:05.198976 4902 generic.go:334] "Generic (PLEG): container finished" podID="4b9e579a-5f0a-4d30-8792-fd71075c1479" containerID="37ae1d093b4c2b92bde7b68f88e1227671079cd5678ad53939d5d8982b6d7db7" exitCode=0 Oct 09 14:03:05 crc kubenswrapper[4902]: I1009 14:03:05.199103 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/f3be4bfbef7f7c1d153ec042185ac5a6497a28f6f13f17f0b43e1db33258rfq" event={"ID":"4b9e579a-5f0a-4d30-8792-fd71075c1479","Type":"ContainerDied","Data":"37ae1d093b4c2b92bde7b68f88e1227671079cd5678ad53939d5d8982b6d7db7"} Oct 09 14:03:05 crc kubenswrapper[4902]: I1009 14:03:05.211554 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/f3be4bfbef7f7c1d153ec042185ac5a6497a28f6f13f17f0b43e1db33258rfq" event={"ID":"4b9e579a-5f0a-4d30-8792-fd71075c1479","Type":"ContainerStarted","Data":"aded701305c48c0f8b3088ff5f408db2f25770114e5b7921f362720618514f9a"} Oct 09 14:03:05 crc kubenswrapper[4902]: I1009 14:03:05.249075 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-bpwgs"] Oct 09 14:03:05 crc kubenswrapper[4902]: I1009 14:03:05.249399 4902 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-bpwgs" podUID="377ef545-a42b-4815-a387-ce427c4b0d44" containerName="registry-server" containerID="cri-o://86c30a3fa27e1d50a86668715e53b5f3d9291a5b1e4d6f23ca340dadf0a99c66" gracePeriod=2 Oct 09 14:03:05 crc kubenswrapper[4902]: I1009 14:03:05.922884 4902 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-bpwgs" Oct 09 14:03:06 crc kubenswrapper[4902]: I1009 14:03:06.035308 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/377ef545-a42b-4815-a387-ce427c4b0d44-catalog-content\") pod \"377ef545-a42b-4815-a387-ce427c4b0d44\" (UID: \"377ef545-a42b-4815-a387-ce427c4b0d44\") " Oct 09 14:03:06 crc kubenswrapper[4902]: I1009 14:03:06.035371 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qjm9m\" (UniqueName: \"kubernetes.io/projected/377ef545-a42b-4815-a387-ce427c4b0d44-kube-api-access-qjm9m\") pod \"377ef545-a42b-4815-a387-ce427c4b0d44\" (UID: \"377ef545-a42b-4815-a387-ce427c4b0d44\") " Oct 09 14:03:06 crc kubenswrapper[4902]: I1009 14:03:06.035507 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/377ef545-a42b-4815-a387-ce427c4b0d44-utilities\") pod \"377ef545-a42b-4815-a387-ce427c4b0d44\" (UID: \"377ef545-a42b-4815-a387-ce427c4b0d44\") " Oct 09 14:03:06 crc kubenswrapper[4902]: I1009 14:03:06.036523 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/377ef545-a42b-4815-a387-ce427c4b0d44-utilities" (OuterVolumeSpecName: "utilities") pod "377ef545-a42b-4815-a387-ce427c4b0d44" (UID: "377ef545-a42b-4815-a387-ce427c4b0d44"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 14:03:06 crc kubenswrapper[4902]: I1009 14:03:06.044632 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/377ef545-a42b-4815-a387-ce427c4b0d44-kube-api-access-qjm9m" (OuterVolumeSpecName: "kube-api-access-qjm9m") pod "377ef545-a42b-4815-a387-ce427c4b0d44" (UID: "377ef545-a42b-4815-a387-ce427c4b0d44"). InnerVolumeSpecName "kube-api-access-qjm9m". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 14:03:06 crc kubenswrapper[4902]: I1009 14:03:06.134272 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/377ef545-a42b-4815-a387-ce427c4b0d44-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "377ef545-a42b-4815-a387-ce427c4b0d44" (UID: "377ef545-a42b-4815-a387-ce427c4b0d44"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 14:03:06 crc kubenswrapper[4902]: I1009 14:03:06.138330 4902 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/377ef545-a42b-4815-a387-ce427c4b0d44-utilities\") on node \"crc\" DevicePath \"\"" Oct 09 14:03:06 crc kubenswrapper[4902]: I1009 14:03:06.138398 4902 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/377ef545-a42b-4815-a387-ce427c4b0d44-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 09 14:03:06 crc kubenswrapper[4902]: I1009 14:03:06.138432 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qjm9m\" (UniqueName: \"kubernetes.io/projected/377ef545-a42b-4815-a387-ce427c4b0d44-kube-api-access-qjm9m\") on node \"crc\" DevicePath \"\"" Oct 09 14:03:06 crc kubenswrapper[4902]: I1009 14:03:06.210898 4902 generic.go:334] "Generic (PLEG): container finished" podID="377ef545-a42b-4815-a387-ce427c4b0d44" containerID="86c30a3fa27e1d50a86668715e53b5f3d9291a5b1e4d6f23ca340dadf0a99c66" exitCode=0 Oct 09 14:03:06 crc kubenswrapper[4902]: I1009 14:03:06.210998 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bpwgs" event={"ID":"377ef545-a42b-4815-a387-ce427c4b0d44","Type":"ContainerDied","Data":"86c30a3fa27e1d50a86668715e53b5f3d9291a5b1e4d6f23ca340dadf0a99c66"} Oct 09 14:03:06 crc kubenswrapper[4902]: I1009 14:03:06.211013 4902 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bpwgs" Oct 09 14:03:06 crc kubenswrapper[4902]: I1009 14:03:06.211042 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bpwgs" event={"ID":"377ef545-a42b-4815-a387-ce427c4b0d44","Type":"ContainerDied","Data":"0f43e7e71b24e26d668790981d9033498872d431c40b44322a255b63399705e2"} Oct 09 14:03:06 crc kubenswrapper[4902]: I1009 14:03:06.211080 4902 scope.go:117] "RemoveContainer" containerID="86c30a3fa27e1d50a86668715e53b5f3d9291a5b1e4d6f23ca340dadf0a99c66" Oct 09 14:03:06 crc kubenswrapper[4902]: I1009 14:03:06.215968 4902 generic.go:334] "Generic (PLEG): container finished" podID="4b9e579a-5f0a-4d30-8792-fd71075c1479" containerID="afc6eb76a0400a858fbcf400768254e381d40af4fa9456fb35ddff4d30bda537" exitCode=0 Oct 09 14:03:06 crc kubenswrapper[4902]: I1009 14:03:06.216020 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/f3be4bfbef7f7c1d153ec042185ac5a6497a28f6f13f17f0b43e1db33258rfq" event={"ID":"4b9e579a-5f0a-4d30-8792-fd71075c1479","Type":"ContainerDied","Data":"afc6eb76a0400a858fbcf400768254e381d40af4fa9456fb35ddff4d30bda537"} Oct 09 14:03:06 crc kubenswrapper[4902]: I1009 14:03:06.236018 4902 scope.go:117] "RemoveContainer" containerID="b78fd39a500c8b1ce300c38661cfc437f7dedd8f096f5efbb555bb7c5d11c956" Oct 09 14:03:06 crc kubenswrapper[4902]: I1009 14:03:06.272678 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-bpwgs"] Oct 09 14:03:06 crc kubenswrapper[4902]: I1009 14:03:06.276560 4902 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-bpwgs"] Oct 09 14:03:06 crc kubenswrapper[4902]: I1009 14:03:06.280125 4902 scope.go:117] "RemoveContainer" containerID="63f6e8be57a18274e9993aa2c5329d0dfb707c16f2e68dc6fa3fde6174a88950" Oct 09 14:03:06 crc kubenswrapper[4902]: I1009 14:03:06.309816 4902 scope.go:117] "RemoveContainer" 
containerID="86c30a3fa27e1d50a86668715e53b5f3d9291a5b1e4d6f23ca340dadf0a99c66" Oct 09 14:03:06 crc kubenswrapper[4902]: E1009 14:03:06.310315 4902 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"86c30a3fa27e1d50a86668715e53b5f3d9291a5b1e4d6f23ca340dadf0a99c66\": container with ID starting with 86c30a3fa27e1d50a86668715e53b5f3d9291a5b1e4d6f23ca340dadf0a99c66 not found: ID does not exist" containerID="86c30a3fa27e1d50a86668715e53b5f3d9291a5b1e4d6f23ca340dadf0a99c66" Oct 09 14:03:06 crc kubenswrapper[4902]: I1009 14:03:06.310365 4902 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"86c30a3fa27e1d50a86668715e53b5f3d9291a5b1e4d6f23ca340dadf0a99c66"} err="failed to get container status \"86c30a3fa27e1d50a86668715e53b5f3d9291a5b1e4d6f23ca340dadf0a99c66\": rpc error: code = NotFound desc = could not find container \"86c30a3fa27e1d50a86668715e53b5f3d9291a5b1e4d6f23ca340dadf0a99c66\": container with ID starting with 86c30a3fa27e1d50a86668715e53b5f3d9291a5b1e4d6f23ca340dadf0a99c66 not found: ID does not exist" Oct 09 14:03:06 crc kubenswrapper[4902]: I1009 14:03:06.310395 4902 scope.go:117] "RemoveContainer" containerID="b78fd39a500c8b1ce300c38661cfc437f7dedd8f096f5efbb555bb7c5d11c956" Oct 09 14:03:06 crc kubenswrapper[4902]: E1009 14:03:06.310716 4902 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b78fd39a500c8b1ce300c38661cfc437f7dedd8f096f5efbb555bb7c5d11c956\": container with ID starting with b78fd39a500c8b1ce300c38661cfc437f7dedd8f096f5efbb555bb7c5d11c956 not found: ID does not exist" containerID="b78fd39a500c8b1ce300c38661cfc437f7dedd8f096f5efbb555bb7c5d11c956" Oct 09 14:03:06 crc kubenswrapper[4902]: I1009 14:03:06.310753 4902 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b78fd39a500c8b1ce300c38661cfc437f7dedd8f096f5efbb555bb7c5d11c956"} err="failed to get container status \"b78fd39a500c8b1ce300c38661cfc437f7dedd8f096f5efbb555bb7c5d11c956\": rpc error: code = NotFound desc = could not find container \"b78fd39a500c8b1ce300c38661cfc437f7dedd8f096f5efbb555bb7c5d11c956\": container with ID starting with b78fd39a500c8b1ce300c38661cfc437f7dedd8f096f5efbb555bb7c5d11c956 not found: ID does not exist" Oct 09 14:03:06 crc kubenswrapper[4902]: I1009 14:03:06.310796 4902 scope.go:117] "RemoveContainer" containerID="63f6e8be57a18274e9993aa2c5329d0dfb707c16f2e68dc6fa3fde6174a88950" Oct 09 14:03:06 crc kubenswrapper[4902]: E1009 14:03:06.314627 4902 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"63f6e8be57a18274e9993aa2c5329d0dfb707c16f2e68dc6fa3fde6174a88950\": container with ID starting with 63f6e8be57a18274e9993aa2c5329d0dfb707c16f2e68dc6fa3fde6174a88950 not found: ID does not exist" containerID="63f6e8be57a18274e9993aa2c5329d0dfb707c16f2e68dc6fa3fde6174a88950" Oct 09 14:03:06 crc kubenswrapper[4902]: I1009 14:03:06.314671 4902 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"63f6e8be57a18274e9993aa2c5329d0dfb707c16f2e68dc6fa3fde6174a88950"} err="failed to get container status \"63f6e8be57a18274e9993aa2c5329d0dfb707c16f2e68dc6fa3fde6174a88950\": rpc error: code = NotFound desc = could not find container \"63f6e8be57a18274e9993aa2c5329d0dfb707c16f2e68dc6fa3fde6174a88950\": container with ID starting with 
63f6e8be57a18274e9993aa2c5329d0dfb707c16f2e68dc6fa3fde6174a88950 not found: ID does not exist" Oct 09 14:03:07 crc kubenswrapper[4902]: I1009 14:03:07.228453 4902 generic.go:334] "Generic (PLEG): container finished" podID="4b9e579a-5f0a-4d30-8792-fd71075c1479" containerID="40a5a559bddb82e057eac282d8f6132d49b501ec2a5c0a6e94fdea879f868494" exitCode=0 Oct 09 14:03:07 crc kubenswrapper[4902]: I1009 14:03:07.228513 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/f3be4bfbef7f7c1d153ec042185ac5a6497a28f6f13f17f0b43e1db33258rfq" event={"ID":"4b9e579a-5f0a-4d30-8792-fd71075c1479","Type":"ContainerDied","Data":"40a5a559bddb82e057eac282d8f6132d49b501ec2a5c0a6e94fdea879f868494"} Oct 09 14:03:07 crc kubenswrapper[4902]: I1009 14:03:07.520984 4902 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="377ef545-a42b-4815-a387-ce427c4b0d44" path="/var/lib/kubelet/pods/377ef545-a42b-4815-a387-ce427c4b0d44/volumes" Oct 09 14:03:08 crc kubenswrapper[4902]: I1009 14:03:08.523227 4902 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/f3be4bfbef7f7c1d153ec042185ac5a6497a28f6f13f17f0b43e1db33258rfq" Oct 09 14:03:08 crc kubenswrapper[4902]: I1009 14:03:08.676827 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4b9e579a-5f0a-4d30-8792-fd71075c1479-bundle\") pod \"4b9e579a-5f0a-4d30-8792-fd71075c1479\" (UID: \"4b9e579a-5f0a-4d30-8792-fd71075c1479\") " Oct 09 14:03:08 crc kubenswrapper[4902]: I1009 14:03:08.676993 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jvz6t\" (UniqueName: \"kubernetes.io/projected/4b9e579a-5f0a-4d30-8792-fd71075c1479-kube-api-access-jvz6t\") pod \"4b9e579a-5f0a-4d30-8792-fd71075c1479\" (UID: \"4b9e579a-5f0a-4d30-8792-fd71075c1479\") " Oct 09 14:03:08 crc kubenswrapper[4902]: I1009 14:03:08.677053 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4b9e579a-5f0a-4d30-8792-fd71075c1479-util\") pod \"4b9e579a-5f0a-4d30-8792-fd71075c1479\" (UID: \"4b9e579a-5f0a-4d30-8792-fd71075c1479\") " Oct 09 14:03:08 crc kubenswrapper[4902]: I1009 14:03:08.678560 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b9e579a-5f0a-4d30-8792-fd71075c1479-bundle" (OuterVolumeSpecName: "bundle") pod "4b9e579a-5f0a-4d30-8792-fd71075c1479" (UID: "4b9e579a-5f0a-4d30-8792-fd71075c1479"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 14:03:08 crc kubenswrapper[4902]: I1009 14:03:08.683112 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b9e579a-5f0a-4d30-8792-fd71075c1479-kube-api-access-jvz6t" (OuterVolumeSpecName: "kube-api-access-jvz6t") pod "4b9e579a-5f0a-4d30-8792-fd71075c1479" (UID: "4b9e579a-5f0a-4d30-8792-fd71075c1479"). InnerVolumeSpecName "kube-api-access-jvz6t". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 14:03:08 crc kubenswrapper[4902]: I1009 14:03:08.693229 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b9e579a-5f0a-4d30-8792-fd71075c1479-util" (OuterVolumeSpecName: "util") pod "4b9e579a-5f0a-4d30-8792-fd71075c1479" (UID: "4b9e579a-5f0a-4d30-8792-fd71075c1479"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 14:03:08 crc kubenswrapper[4902]: I1009 14:03:08.778636 4902 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4b9e579a-5f0a-4d30-8792-fd71075c1479-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 14:03:08 crc kubenswrapper[4902]: I1009 14:03:08.779533 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jvz6t\" (UniqueName: \"kubernetes.io/projected/4b9e579a-5f0a-4d30-8792-fd71075c1479-kube-api-access-jvz6t\") on node \"crc\" DevicePath \"\"" Oct 09 14:03:08 crc kubenswrapper[4902]: I1009 14:03:08.779565 4902 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4b9e579a-5f0a-4d30-8792-fd71075c1479-util\") on node \"crc\" DevicePath \"\"" Oct 09 14:03:09 crc kubenswrapper[4902]: I1009 14:03:09.252197 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/f3be4bfbef7f7c1d153ec042185ac5a6497a28f6f13f17f0b43e1db33258rfq" event={"ID":"4b9e579a-5f0a-4d30-8792-fd71075c1479","Type":"ContainerDied","Data":"aded701305c48c0f8b3088ff5f408db2f25770114e5b7921f362720618514f9a"} Oct 09 14:03:09 crc kubenswrapper[4902]: I1009 14:03:09.252238 4902 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aded701305c48c0f8b3088ff5f408db2f25770114e5b7921f362720618514f9a" Oct 09 14:03:09 crc kubenswrapper[4902]: I1009 14:03:09.252275 4902 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/f3be4bfbef7f7c1d153ec042185ac5a6497a28f6f13f17f0b43e1db33258rfq" Oct 09 14:03:14 crc kubenswrapper[4902]: I1009 14:03:14.824051 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-szjlw" Oct 09 14:03:14 crc kubenswrapper[4902]: I1009 14:03:14.876364 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-operator-647744f6c-bzqqk"] Oct 09 14:03:14 crc kubenswrapper[4902]: E1009 14:03:14.876733 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="377ef545-a42b-4815-a387-ce427c4b0d44" containerName="extract-utilities" Oct 09 14:03:14 crc kubenswrapper[4902]: I1009 14:03:14.876755 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="377ef545-a42b-4815-a387-ce427c4b0d44" containerName="extract-utilities" Oct 09 14:03:14 crc kubenswrapper[4902]: E1009 14:03:14.876775 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b9e579a-5f0a-4d30-8792-fd71075c1479" containerName="pull" Oct 09 14:03:14 crc kubenswrapper[4902]: I1009 14:03:14.876783 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b9e579a-5f0a-4d30-8792-fd71075c1479" containerName="pull" Oct 09 14:03:14 crc kubenswrapper[4902]: E1009 14:03:14.876799 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b9e579a-5f0a-4d30-8792-fd71075c1479" containerName="extract" Oct 09 14:03:14 crc kubenswrapper[4902]: I1009 14:03:14.876807 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b9e579a-5f0a-4d30-8792-fd71075c1479" containerName="extract" Oct 09 14:03:14 crc kubenswrapper[4902]: E1009 14:03:14.876818 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="377ef545-a42b-4815-a387-ce427c4b0d44" containerName="registry-server" Oct 09 14:03:14 crc kubenswrapper[4902]: I1009 14:03:14.876827 4902 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="377ef545-a42b-4815-a387-ce427c4b0d44" containerName="registry-server" Oct 09 14:03:14 crc kubenswrapper[4902]: E1009 14:03:14.876839 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="377ef545-a42b-4815-a387-ce427c4b0d44" containerName="extract-content" Oct 09 14:03:14 crc kubenswrapper[4902]: I1009 14:03:14.876847 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="377ef545-a42b-4815-a387-ce427c4b0d44" containerName="extract-content" Oct 09 14:03:14 crc kubenswrapper[4902]: E1009 14:03:14.876862 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b9e579a-5f0a-4d30-8792-fd71075c1479" containerName="util" Oct 09 14:03:14 crc kubenswrapper[4902]: I1009 14:03:14.876870 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b9e579a-5f0a-4d30-8792-fd71075c1479" containerName="util" Oct 09 14:03:14 crc kubenswrapper[4902]: I1009 14:03:14.877021 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="377ef545-a42b-4815-a387-ce427c4b0d44" containerName="registry-server" Oct 09 14:03:14 crc kubenswrapper[4902]: I1009 14:03:14.877038 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b9e579a-5f0a-4d30-8792-fd71075c1479" containerName="extract" Oct 09 14:03:14 crc kubenswrapper[4902]: I1009 14:03:14.877961 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-647744f6c-bzqqk" Oct 09 14:03:14 crc kubenswrapper[4902]: I1009 14:03:14.881848 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-operator-dockercfg-cr9rn" Oct 09 14:03:14 crc kubenswrapper[4902]: I1009 14:03:14.900264 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ckn8d\" (UniqueName: \"kubernetes.io/projected/994ae404-6c3b-499c-b51d-e5a0eea83756-kube-api-access-ckn8d\") pod \"openstack-operator-controller-operator-647744f6c-bzqqk\" (UID: \"994ae404-6c3b-499c-b51d-e5a0eea83756\") " pod="openstack-operators/openstack-operator-controller-operator-647744f6c-bzqqk" Oct 09 14:03:14 crc kubenswrapper[4902]: I1009 14:03:14.905781 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-647744f6c-bzqqk"] Oct 09 14:03:15 crc kubenswrapper[4902]: I1009 14:03:15.001660 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ckn8d\" (UniqueName: \"kubernetes.io/projected/994ae404-6c3b-499c-b51d-e5a0eea83756-kube-api-access-ckn8d\") pod \"openstack-operator-controller-operator-647744f6c-bzqqk\" (UID: \"994ae404-6c3b-499c-b51d-e5a0eea83756\") " pod="openstack-operators/openstack-operator-controller-operator-647744f6c-bzqqk" Oct 09 14:03:15 crc kubenswrapper[4902]: I1009 14:03:15.022061 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ckn8d\" (UniqueName: \"kubernetes.io/projected/994ae404-6c3b-499c-b51d-e5a0eea83756-kube-api-access-ckn8d\") pod \"openstack-operator-controller-operator-647744f6c-bzqqk\" (UID: \"994ae404-6c3b-499c-b51d-e5a0eea83756\") " pod="openstack-operators/openstack-operator-controller-operator-647744f6c-bzqqk" Oct 09 14:03:15 crc kubenswrapper[4902]: I1009 14:03:15.201950 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-647744f6c-bzqqk" Oct 09 14:03:15 crc kubenswrapper[4902]: I1009 14:03:15.448732 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-647744f6c-bzqqk"] Oct 09 14:03:16 crc kubenswrapper[4902]: I1009 14:03:16.307166 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-647744f6c-bzqqk" event={"ID":"994ae404-6c3b-499c-b51d-e5a0eea83756","Type":"ContainerStarted","Data":"74f1984d389bce434cc87a80e64a5f0edd507f569270ec787953b6c138f5beaf"} Oct 09 14:03:16 crc kubenswrapper[4902]: I1009 14:03:16.835327 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-szjlw"] Oct 09 14:03:16 crc kubenswrapper[4902]: I1009 14:03:16.835622 4902 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-szjlw" podUID="5dfc96e2-757c-4c2c-aa02-ba62993dc45d" containerName="registry-server" containerID="cri-o://e31fc402005302af6d4a314fa166bd298b2fe629f7a2f5062bf24ed468f7e17f" gracePeriod=2 Oct 09 14:03:17 crc kubenswrapper[4902]: I1009 14:03:17.317842 4902 generic.go:334] "Generic (PLEG): container finished" podID="5dfc96e2-757c-4c2c-aa02-ba62993dc45d" containerID="e31fc402005302af6d4a314fa166bd298b2fe629f7a2f5062bf24ed468f7e17f" exitCode=0 Oct 09 14:03:17 crc kubenswrapper[4902]: I1009 14:03:17.317927 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-szjlw" event={"ID":"5dfc96e2-757c-4c2c-aa02-ba62993dc45d","Type":"ContainerDied","Data":"e31fc402005302af6d4a314fa166bd298b2fe629f7a2f5062bf24ed468f7e17f"} Oct 09 14:03:20 crc kubenswrapper[4902]: I1009 14:03:20.078702 4902 patch_prober.go:28] interesting pod/machine-config-daemon-gbt7s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 14:03:20 crc kubenswrapper[4902]: I1009 14:03:20.079212 4902 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" podUID="6cfbac91-e798-4e5e-9f3c-f454ea6f457e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 14:03:20 crc kubenswrapper[4902]: I1009 14:03:20.079300 4902 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" Oct 09 14:03:20 crc kubenswrapper[4902]: I1009 14:03:20.079999 4902 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e23319de8b44d2e9bc647fc2e977cf773ec98bd13c09e93b25a2e7f2c57468fd"} pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 09 14:03:20 crc kubenswrapper[4902]: I1009 14:03:20.080078 4902 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" podUID="6cfbac91-e798-4e5e-9f3c-f454ea6f457e" containerName="machine-config-daemon" containerID="cri-o://e23319de8b44d2e9bc647fc2e977cf773ec98bd13c09e93b25a2e7f2c57468fd" 
gracePeriod=600 Oct 09 14:03:20 crc kubenswrapper[4902]: I1009 14:03:20.338302 4902 generic.go:334] "Generic (PLEG): container finished" podID="6cfbac91-e798-4e5e-9f3c-f454ea6f457e" containerID="e23319de8b44d2e9bc647fc2e977cf773ec98bd13c09e93b25a2e7f2c57468fd" exitCode=0 Oct 09 14:03:20 crc kubenswrapper[4902]: I1009 14:03:20.338346 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" event={"ID":"6cfbac91-e798-4e5e-9f3c-f454ea6f457e","Type":"ContainerDied","Data":"e23319de8b44d2e9bc647fc2e977cf773ec98bd13c09e93b25a2e7f2c57468fd"} Oct 09 14:03:20 crc kubenswrapper[4902]: I1009 14:03:20.338378 4902 scope.go:117] "RemoveContainer" containerID="f04c452240506df7a71fcb78dd8a43d0fd5718ad3d38cb3de0f83c0e40d74e5b" Oct 09 14:03:20 crc kubenswrapper[4902]: I1009 14:03:20.969516 4902 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-szjlw" Oct 09 14:03:21 crc kubenswrapper[4902]: I1009 14:03:21.001768 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5dfc96e2-757c-4c2c-aa02-ba62993dc45d-utilities\") pod \"5dfc96e2-757c-4c2c-aa02-ba62993dc45d\" (UID: \"5dfc96e2-757c-4c2c-aa02-ba62993dc45d\") " Oct 09 14:03:21 crc kubenswrapper[4902]: I1009 14:03:21.001980 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h99qr\" (UniqueName: \"kubernetes.io/projected/5dfc96e2-757c-4c2c-aa02-ba62993dc45d-kube-api-access-h99qr\") pod \"5dfc96e2-757c-4c2c-aa02-ba62993dc45d\" (UID: \"5dfc96e2-757c-4c2c-aa02-ba62993dc45d\") " Oct 09 14:03:21 crc kubenswrapper[4902]: I1009 14:03:21.002031 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5dfc96e2-757c-4c2c-aa02-ba62993dc45d-catalog-content\") pod \"5dfc96e2-757c-4c2c-aa02-ba62993dc45d\" (UID: \"5dfc96e2-757c-4c2c-aa02-ba62993dc45d\") " Oct 09 14:03:21 crc kubenswrapper[4902]: I1009 14:03:21.003687 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5dfc96e2-757c-4c2c-aa02-ba62993dc45d-utilities" (OuterVolumeSpecName: "utilities") pod "5dfc96e2-757c-4c2c-aa02-ba62993dc45d" (UID: "5dfc96e2-757c-4c2c-aa02-ba62993dc45d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 14:03:21 crc kubenswrapper[4902]: I1009 14:03:21.008934 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5dfc96e2-757c-4c2c-aa02-ba62993dc45d-kube-api-access-h99qr" (OuterVolumeSpecName: "kube-api-access-h99qr") pod "5dfc96e2-757c-4c2c-aa02-ba62993dc45d" (UID: "5dfc96e2-757c-4c2c-aa02-ba62993dc45d"). InnerVolumeSpecName "kube-api-access-h99qr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 14:03:21 crc kubenswrapper[4902]: I1009 14:03:21.017782 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5dfc96e2-757c-4c2c-aa02-ba62993dc45d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5dfc96e2-757c-4c2c-aa02-ba62993dc45d" (UID: "5dfc96e2-757c-4c2c-aa02-ba62993dc45d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 14:03:21 crc kubenswrapper[4902]: I1009 14:03:21.103107 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h99qr\" (UniqueName: \"kubernetes.io/projected/5dfc96e2-757c-4c2c-aa02-ba62993dc45d-kube-api-access-h99qr\") on node \"crc\" DevicePath \"\"" Oct 09 14:03:21 crc kubenswrapper[4902]: I1009 14:03:21.103155 4902 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5dfc96e2-757c-4c2c-aa02-ba62993dc45d-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 09 14:03:21 crc kubenswrapper[4902]: I1009 14:03:21.103168 4902 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5dfc96e2-757c-4c2c-aa02-ba62993dc45d-utilities\") on node \"crc\" DevicePath \"\"" Oct 09 14:03:21 crc kubenswrapper[4902]: I1009 14:03:21.351996 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-szjlw" event={"ID":"5dfc96e2-757c-4c2c-aa02-ba62993dc45d","Type":"ContainerDied","Data":"daf5f17b08752d648d8451d78a24ef9929d6502fac12f8d8db8130d472c92dcc"} Oct 09 14:03:21 crc kubenswrapper[4902]: I1009 14:03:21.352072 4902 scope.go:117] "RemoveContainer" containerID="e31fc402005302af6d4a314fa166bd298b2fe629f7a2f5062bf24ed468f7e17f" Oct 09 14:03:21 crc kubenswrapper[4902]: I1009 14:03:21.352092 4902 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-szjlw" Oct 09 14:03:21 crc kubenswrapper[4902]: I1009 14:03:21.380193 4902 scope.go:117] "RemoveContainer" containerID="018b9931e07ae9b4c0a3321c8bbfefcae99e535469242897f03f4b8719242f33" Oct 09 14:03:21 crc kubenswrapper[4902]: I1009 14:03:21.396364 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-szjlw"] Oct 09 14:03:21 crc kubenswrapper[4902]: I1009 14:03:21.398906 4902 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-szjlw"] Oct 09 14:03:21 crc kubenswrapper[4902]: I1009 14:03:21.430561 4902 scope.go:117] "RemoveContainer" containerID="1579a6a75ca88d0063c3c53b18344ac6cc459e8840beaa584f0eda718007b646" Oct 09 14:03:21 crc kubenswrapper[4902]: I1009 14:03:21.520165 4902 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5dfc96e2-757c-4c2c-aa02-ba62993dc45d" path="/var/lib/kubelet/pods/5dfc96e2-757c-4c2c-aa02-ba62993dc45d/volumes" Oct 09 14:03:22 crc kubenswrapper[4902]: I1009 14:03:22.373015 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" event={"ID":"6cfbac91-e798-4e5e-9f3c-f454ea6f457e","Type":"ContainerStarted","Data":"5284f8d311bb4c3e2f0e528d6bcb33bd4828ef1536e55afe39f7116e6e98c726"} Oct 09 14:03:23 crc kubenswrapper[4902]: I1009 14:03:23.381341 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-647744f6c-bzqqk" event={"ID":"994ae404-6c3b-499c-b51d-e5a0eea83756","Type":"ContainerStarted","Data":"80df13817b84a5a3579deb26c14fff752f4ce1e93dfdb8797f09a0847bc41dbc"} Oct 09 14:03:31 crc kubenswrapper[4902]: I1009 14:03:31.437366 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-647744f6c-bzqqk" event={"ID":"994ae404-6c3b-499c-b51d-e5a0eea83756","Type":"ContainerStarted","Data":"ac5f0f9f58e531f19cb94b0be60b39f3f0cab5034cf85ded9df0eceed1af6ec1"} 
Oct 09 14:03:31 crc kubenswrapper[4902]: I1009 14:03:31.439065 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-operator-647744f6c-bzqqk" Oct 09 14:03:31 crc kubenswrapper[4902]: I1009 14:03:31.441175 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-operator-647744f6c-bzqqk" Oct 09 14:03:31 crc kubenswrapper[4902]: I1009 14:03:31.514018 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-operator-647744f6c-bzqqk" podStartSLOduration=3.626258173 podStartE2EDuration="17.514000882s" podCreationTimestamp="2025-10-09 14:03:14 +0000 UTC" firstStartedPulling="2025-10-09 14:03:15.489070215 +0000 UTC m=+742.686929279" lastFinishedPulling="2025-10-09 14:03:29.376812924 +0000 UTC m=+756.574671988" observedRunningTime="2025-10-09 14:03:31.473801903 +0000 UTC m=+758.671660987" watchObservedRunningTime="2025-10-09 14:03:31.514000882 +0000 UTC m=+758.711859946" Oct 09 14:03:52 crc kubenswrapper[4902]: I1009 14:03:52.981048 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-64f84fcdbb-r4nn8"] Oct 09 14:03:52 crc kubenswrapper[4902]: E1009 14:03:52.981894 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5dfc96e2-757c-4c2c-aa02-ba62993dc45d" containerName="extract-utilities" Oct 09 14:03:52 crc kubenswrapper[4902]: I1009 14:03:52.981913 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="5dfc96e2-757c-4c2c-aa02-ba62993dc45d" containerName="extract-utilities" Oct 09 14:03:52 crc kubenswrapper[4902]: E1009 14:03:52.981927 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5dfc96e2-757c-4c2c-aa02-ba62993dc45d" containerName="extract-content" Oct 09 14:03:52 crc kubenswrapper[4902]: I1009 14:03:52.981935 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="5dfc96e2-757c-4c2c-aa02-ba62993dc45d" containerName="extract-content" Oct 09 14:03:52 crc kubenswrapper[4902]: E1009 14:03:52.981948 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5dfc96e2-757c-4c2c-aa02-ba62993dc45d" containerName="registry-server" Oct 09 14:03:52 crc kubenswrapper[4902]: I1009 14:03:52.981959 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="5dfc96e2-757c-4c2c-aa02-ba62993dc45d" containerName="registry-server" Oct 09 14:03:52 crc kubenswrapper[4902]: I1009 14:03:52.982093 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="5dfc96e2-757c-4c2c-aa02-ba62993dc45d" containerName="registry-server" Oct 09 14:03:52 crc kubenswrapper[4902]: I1009 14:03:52.982858 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-64f84fcdbb-r4nn8" Oct 09 14:03:52 crc kubenswrapper[4902]: W1009 14:03:52.985308 4902 reflector.go:561] object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-5vc5b": failed to list *v1.Secret: secrets "barbican-operator-controller-manager-dockercfg-5vc5b" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openstack-operators": no relationship found between node 'crc' and this object Oct 09 14:03:52 crc kubenswrapper[4902]: E1009 14:03:52.985357 4902 reflector.go:158] "Unhandled Error" err="object-\"openstack-operators\"/\"barbican-operator-controller-manager-dockercfg-5vc5b\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"barbican-operator-controller-manager-dockercfg-5vc5b\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openstack-operators\": no relationship found between node 'crc' and this object" logger="UnhandledError" Oct 09 14:03:52 crc kubenswrapper[4902]: I1009 14:03:52.997909 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-59cdc64769-hrgs4"] Oct 09 14:03:52 crc kubenswrapper[4902]: I1009 14:03:52.998934 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-59cdc64769-hrgs4" Oct 09 14:03:53 crc kubenswrapper[4902]: I1009 14:03:53.004039 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-mrhb9" Oct 09 14:03:53 crc kubenswrapper[4902]: I1009 14:03:53.005395 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-64f84fcdbb-r4nn8"] Oct 09 14:03:53 crc kubenswrapper[4902]: I1009 14:03:53.009569 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-687df44cdb-pxsfz"] Oct 09 14:03:53 crc kubenswrapper[4902]: I1009 14:03:53.010525 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-687df44cdb-pxsfz" Oct 09 14:03:53 crc kubenswrapper[4902]: I1009 14:03:53.014463 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-fq62c" Oct 09 14:03:53 crc kubenswrapper[4902]: I1009 14:03:53.025435 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-687df44cdb-pxsfz"] Oct 09 14:03:53 crc kubenswrapper[4902]: I1009 14:03:53.029929 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-59cdc64769-hrgs4"] Oct 09 14:03:53 crc kubenswrapper[4902]: I1009 14:03:53.049156 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-7bb46cd7d-2nkqk"] Oct 09 14:03:53 crc kubenswrapper[4902]: I1009 14:03:53.050217 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-7bb46cd7d-2nkqk" Oct 09 14:03:53 crc kubenswrapper[4902]: I1009 14:03:53.053983 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-6d9967f8dd-7bjrw"] Oct 09 14:03:53 crc kubenswrapper[4902]: I1009 14:03:53.055035 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-6d9967f8dd-7bjrw" Oct 09 14:03:53 crc kubenswrapper[4902]: I1009 14:03:53.058860 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-4rlb4" Oct 09 14:03:53 crc kubenswrapper[4902]: I1009 14:03:53.059230 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-vm5tl" Oct 09 14:03:53 crc kubenswrapper[4902]: I1009 14:03:53.071882 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g95gv\" (UniqueName: \"kubernetes.io/projected/ac925db8-cb97-468e-b43f-b219deb78cf6-kube-api-access-g95gv\") pod \"barbican-operator-controller-manager-64f84fcdbb-r4nn8\" (UID: \"ac925db8-cb97-468e-b43f-b219deb78cf6\") " pod="openstack-operators/barbican-operator-controller-manager-64f84fcdbb-r4nn8" Oct 09 14:03:53 crc kubenswrapper[4902]: I1009 14:03:53.071937 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ht7zr\" (UniqueName: \"kubernetes.io/projected/960aab4c-ce86-4753-b848-3367f15d962c-kube-api-access-ht7zr\") pod \"heat-operator-controller-manager-6d9967f8dd-7bjrw\" (UID: \"960aab4c-ce86-4753-b848-3367f15d962c\") " pod="openstack-operators/heat-operator-controller-manager-6d9967f8dd-7bjrw" Oct 09 14:03:53 crc kubenswrapper[4902]: I1009 14:03:53.071980 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mzggf\" (UniqueName: \"kubernetes.io/projected/c27e1a63-1155-43eb-9c97-61680f083de0-kube-api-access-mzggf\") pod \"designate-operator-controller-manager-687df44cdb-pxsfz\" (UID: \"c27e1a63-1155-43eb-9c97-61680f083de0\") " pod="openstack-operators/designate-operator-controller-manager-687df44cdb-pxsfz" Oct 09 14:03:53 crc kubenswrapper[4902]: I1009 14:03:53.072031 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-szb6l\" (UniqueName: \"kubernetes.io/projected/6169ab22-9b0b-4bb3-b840-b3eb92d22c0c-kube-api-access-szb6l\") pod \"glance-operator-controller-manager-7bb46cd7d-2nkqk\" (UID: \"6169ab22-9b0b-4bb3-b840-b3eb92d22c0c\") " pod="openstack-operators/glance-operator-controller-manager-7bb46cd7d-2nkqk" Oct 09 14:03:53 crc kubenswrapper[4902]: I1009 14:03:53.072064 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lp4gg\" (UniqueName: \"kubernetes.io/projected/a1fcc021-b92b-417d-b92c-4e66386e8502-kube-api-access-lp4gg\") pod \"cinder-operator-controller-manager-59cdc64769-hrgs4\" (UID: \"a1fcc021-b92b-417d-b92c-4e66386e8502\") " pod="openstack-operators/cinder-operator-controller-manager-59cdc64769-hrgs4" Oct 09 14:03:53 crc kubenswrapper[4902]: I1009 14:03:53.091157 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-6d9967f8dd-7bjrw"] Oct 09 14:03:53 crc kubenswrapper[4902]: I1009 
14:03:53.105543 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-6d74794d9b-phbbt"] Oct 09 14:03:53 crc kubenswrapper[4902]: I1009 14:03:53.107355 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-6d74794d9b-phbbt" Oct 09 14:03:53 crc kubenswrapper[4902]: I1009 14:03:53.112775 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-7bb46cd7d-2nkqk"] Oct 09 14:03:53 crc kubenswrapper[4902]: I1009 14:03:53.118059 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-m6r69" Oct 09 14:03:53 crc kubenswrapper[4902]: I1009 14:03:53.144329 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-585fc5b659-6xw4k"] Oct 09 14:03:53 crc kubenswrapper[4902]: I1009 14:03:53.145857 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-585fc5b659-6xw4k" Oct 09 14:03:53 crc kubenswrapper[4902]: I1009 14:03:53.148447 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Oct 09 14:03:53 crc kubenswrapper[4902]: I1009 14:03:53.148670 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-89cr8" Oct 09 14:03:53 crc kubenswrapper[4902]: I1009 14:03:53.173154 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-szb6l\" (UniqueName: \"kubernetes.io/projected/6169ab22-9b0b-4bb3-b840-b3eb92d22c0c-kube-api-access-szb6l\") pod \"glance-operator-controller-manager-7bb46cd7d-2nkqk\" (UID: \"6169ab22-9b0b-4bb3-b840-b3eb92d22c0c\") " pod="openstack-operators/glance-operator-controller-manager-7bb46cd7d-2nkqk" Oct 09 14:03:53 crc kubenswrapper[4902]: I1009 14:03:53.173213 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lp4gg\" (UniqueName: \"kubernetes.io/projected/a1fcc021-b92b-417d-b92c-4e66386e8502-kube-api-access-lp4gg\") pod \"cinder-operator-controller-manager-59cdc64769-hrgs4\" (UID: \"a1fcc021-b92b-417d-b92c-4e66386e8502\") " pod="openstack-operators/cinder-operator-controller-manager-59cdc64769-hrgs4" Oct 09 14:03:53 crc kubenswrapper[4902]: I1009 14:03:53.173271 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-77dc8\" (UniqueName: \"kubernetes.io/projected/39e518d9-bffd-4421-bc8b-2b333654ff9e-kube-api-access-77dc8\") pod \"horizon-operator-controller-manager-6d74794d9b-phbbt\" (UID: \"39e518d9-bffd-4421-bc8b-2b333654ff9e\") " pod="openstack-operators/horizon-operator-controller-manager-6d74794d9b-phbbt" Oct 09 14:03:53 crc kubenswrapper[4902]: I1009 14:03:53.173301 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/92649c6e-71ba-4945-9210-19394d180222-cert\") pod \"infra-operator-controller-manager-585fc5b659-6xw4k\" (UID: \"92649c6e-71ba-4945-9210-19394d180222\") " pod="openstack-operators/infra-operator-controller-manager-585fc5b659-6xw4k" Oct 09 14:03:53 crc kubenswrapper[4902]: I1009 14:03:53.173341 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-g95gv\" (UniqueName: \"kubernetes.io/projected/ac925db8-cb97-468e-b43f-b219deb78cf6-kube-api-access-g95gv\") pod \"barbican-operator-controller-manager-64f84fcdbb-r4nn8\" (UID: \"ac925db8-cb97-468e-b43f-b219deb78cf6\") " pod="openstack-operators/barbican-operator-controller-manager-64f84fcdbb-r4nn8" Oct 09 14:03:53 crc kubenswrapper[4902]: I1009 14:03:53.173375 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ht7zr\" (UniqueName: \"kubernetes.io/projected/960aab4c-ce86-4753-b848-3367f15d962c-kube-api-access-ht7zr\") pod \"heat-operator-controller-manager-6d9967f8dd-7bjrw\" (UID: \"960aab4c-ce86-4753-b848-3367f15d962c\") " pod="openstack-operators/heat-operator-controller-manager-6d9967f8dd-7bjrw" Oct 09 14:03:53 crc kubenswrapper[4902]: I1009 14:03:53.173440 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hfl8p\" (UniqueName: \"kubernetes.io/projected/92649c6e-71ba-4945-9210-19394d180222-kube-api-access-hfl8p\") pod \"infra-operator-controller-manager-585fc5b659-6xw4k\" (UID: \"92649c6e-71ba-4945-9210-19394d180222\") " pod="openstack-operators/infra-operator-controller-manager-585fc5b659-6xw4k" Oct 09 14:03:53 crc kubenswrapper[4902]: I1009 14:03:53.173471 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mzggf\" (UniqueName: \"kubernetes.io/projected/c27e1a63-1155-43eb-9c97-61680f083de0-kube-api-access-mzggf\") pod \"designate-operator-controller-manager-687df44cdb-pxsfz\" (UID: \"c27e1a63-1155-43eb-9c97-61680f083de0\") " pod="openstack-operators/designate-operator-controller-manager-687df44cdb-pxsfz" Oct 09 14:03:53 crc kubenswrapper[4902]: I1009 14:03:53.202647 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-6d74794d9b-phbbt"] Oct 09 14:03:53 crc kubenswrapper[4902]: I1009 14:03:53.211401 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ht7zr\" (UniqueName: \"kubernetes.io/projected/960aab4c-ce86-4753-b848-3367f15d962c-kube-api-access-ht7zr\") pod \"heat-operator-controller-manager-6d9967f8dd-7bjrw\" (UID: \"960aab4c-ce86-4753-b848-3367f15d962c\") " pod="openstack-operators/heat-operator-controller-manager-6d9967f8dd-7bjrw" Oct 09 14:03:53 crc kubenswrapper[4902]: I1009 14:03:53.211514 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-74cb5cbc49-ql5w7"] Oct 09 14:03:53 crc kubenswrapper[4902]: I1009 14:03:53.212797 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-74cb5cbc49-ql5w7" Oct 09 14:03:53 crc kubenswrapper[4902]: I1009 14:03:53.218468 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-585fc5b659-6xw4k"] Oct 09 14:03:53 crc kubenswrapper[4902]: I1009 14:03:53.220134 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-lczxc" Oct 09 14:03:53 crc kubenswrapper[4902]: I1009 14:03:53.223347 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-szb6l\" (UniqueName: \"kubernetes.io/projected/6169ab22-9b0b-4bb3-b840-b3eb92d22c0c-kube-api-access-szb6l\") pod \"glance-operator-controller-manager-7bb46cd7d-2nkqk\" (UID: \"6169ab22-9b0b-4bb3-b840-b3eb92d22c0c\") " pod="openstack-operators/glance-operator-controller-manager-7bb46cd7d-2nkqk" Oct 09 14:03:53 crc kubenswrapper[4902]: I1009 14:03:53.230121 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lp4gg\" (UniqueName: \"kubernetes.io/projected/a1fcc021-b92b-417d-b92c-4e66386e8502-kube-api-access-lp4gg\") pod \"cinder-operator-controller-manager-59cdc64769-hrgs4\" (UID: \"a1fcc021-b92b-417d-b92c-4e66386e8502\") " pod="openstack-operators/cinder-operator-controller-manager-59cdc64769-hrgs4" Oct 09 14:03:53 crc kubenswrapper[4902]: I1009 14:03:53.232002 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mzggf\" (UniqueName: \"kubernetes.io/projected/c27e1a63-1155-43eb-9c97-61680f083de0-kube-api-access-mzggf\") pod \"designate-operator-controller-manager-687df44cdb-pxsfz\" (UID: \"c27e1a63-1155-43eb-9c97-61680f083de0\") " pod="openstack-operators/designate-operator-controller-manager-687df44cdb-pxsfz" Oct 09 14:03:53 crc kubenswrapper[4902]: I1009 14:03:53.237771 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g95gv\" (UniqueName: \"kubernetes.io/projected/ac925db8-cb97-468e-b43f-b219deb78cf6-kube-api-access-g95gv\") pod \"barbican-operator-controller-manager-64f84fcdbb-r4nn8\" (UID: \"ac925db8-cb97-468e-b43f-b219deb78cf6\") " pod="openstack-operators/barbican-operator-controller-manager-64f84fcdbb-r4nn8" Oct 09 14:03:53 crc kubenswrapper[4902]: I1009 14:03:53.237786 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-ddb98f99b-7bzzg"] Oct 09 14:03:53 crc kubenswrapper[4902]: I1009 14:03:53.239978 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-ddb98f99b-7bzzg" Oct 09 14:03:53 crc kubenswrapper[4902]: I1009 14:03:53.258375 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-vvqwp" Oct 09 14:03:53 crc kubenswrapper[4902]: I1009 14:03:53.269991 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-74cb5cbc49-ql5w7"] Oct 09 14:03:53 crc kubenswrapper[4902]: I1009 14:03:53.278238 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-77dc8\" (UniqueName: \"kubernetes.io/projected/39e518d9-bffd-4421-bc8b-2b333654ff9e-kube-api-access-77dc8\") pod \"horizon-operator-controller-manager-6d74794d9b-phbbt\" (UID: \"39e518d9-bffd-4421-bc8b-2b333654ff9e\") " pod="openstack-operators/horizon-operator-controller-manager-6d74794d9b-phbbt" Oct 09 14:03:53 crc kubenswrapper[4902]: I1009 14:03:53.278293 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/92649c6e-71ba-4945-9210-19394d180222-cert\") pod \"infra-operator-controller-manager-585fc5b659-6xw4k\" (UID: \"92649c6e-71ba-4945-9210-19394d180222\") " pod="openstack-operators/infra-operator-controller-manager-585fc5b659-6xw4k" Oct 09 14:03:53 crc kubenswrapper[4902]: I1009 14:03:53.278362 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hfl8p\" (UniqueName: \"kubernetes.io/projected/92649c6e-71ba-4945-9210-19394d180222-kube-api-access-hfl8p\") pod \"infra-operator-controller-manager-585fc5b659-6xw4k\" (UID: \"92649c6e-71ba-4945-9210-19394d180222\") " pod="openstack-operators/infra-operator-controller-manager-585fc5b659-6xw4k" Oct 09 14:03:53 crc kubenswrapper[4902]: E1009 14:03:53.278863 4902 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Oct 09 14:03:53 crc kubenswrapper[4902]: E1009 14:03:53.278972 4902 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/92649c6e-71ba-4945-9210-19394d180222-cert podName:92649c6e-71ba-4945-9210-19394d180222 nodeName:}" failed. No retries permitted until 2025-10-09 14:03:53.77894211 +0000 UTC m=+780.976801174 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/92649c6e-71ba-4945-9210-19394d180222-cert") pod "infra-operator-controller-manager-585fc5b659-6xw4k" (UID: "92649c6e-71ba-4945-9210-19394d180222") : secret "infra-operator-webhook-server-cert" not found Oct 09 14:03:53 crc kubenswrapper[4902]: I1009 14:03:53.282289 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-ddb98f99b-7bzzg"] Oct 09 14:03:53 crc kubenswrapper[4902]: I1009 14:03:53.299762 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hfl8p\" (UniqueName: \"kubernetes.io/projected/92649c6e-71ba-4945-9210-19394d180222-kube-api-access-hfl8p\") pod \"infra-operator-controller-manager-585fc5b659-6xw4k\" (UID: \"92649c6e-71ba-4945-9210-19394d180222\") " pod="openstack-operators/infra-operator-controller-manager-585fc5b659-6xw4k" Oct 09 14:03:53 crc kubenswrapper[4902]: I1009 14:03:53.299891 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-59578bc799-kvzf6"] Oct 09 14:03:53 crc kubenswrapper[4902]: I1009 14:03:53.302325 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-59578bc799-kvzf6" Oct 09 14:03:53 crc kubenswrapper[4902]: I1009 14:03:53.307632 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-77dc8\" (UniqueName: \"kubernetes.io/projected/39e518d9-bffd-4421-bc8b-2b333654ff9e-kube-api-access-77dc8\") pod \"horizon-operator-controller-manager-6d74794d9b-phbbt\" (UID: \"39e518d9-bffd-4421-bc8b-2b333654ff9e\") " pod="openstack-operators/horizon-operator-controller-manager-6d74794d9b-phbbt" Oct 09 14:03:53 crc kubenswrapper[4902]: I1009 14:03:53.318629 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-6pflq" Oct 09 14:03:53 crc kubenswrapper[4902]: I1009 14:03:53.333982 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-59cdc64769-hrgs4" Oct 09 14:03:53 crc kubenswrapper[4902]: I1009 14:03:53.357060 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-687df44cdb-pxsfz" Oct 09 14:03:53 crc kubenswrapper[4902]: I1009 14:03:53.375208 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-7bb46cd7d-2nkqk" Oct 09 14:03:53 crc kubenswrapper[4902]: I1009 14:03:53.380516 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7kdtn\" (UniqueName: \"kubernetes.io/projected/dad70f9e-3fb4-41ff-95f4-dc6be5277aa0-kube-api-access-7kdtn\") pod \"ironic-operator-controller-manager-74cb5cbc49-ql5w7\" (UID: \"dad70f9e-3fb4-41ff-95f4-dc6be5277aa0\") " pod="openstack-operators/ironic-operator-controller-manager-74cb5cbc49-ql5w7" Oct 09 14:03:53 crc kubenswrapper[4902]: I1009 14:03:53.381073 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g6x9q\" (UniqueName: \"kubernetes.io/projected/d11828c7-488d-414a-a024-68a46fca78e1-kube-api-access-g6x9q\") pod \"keystone-operator-controller-manager-ddb98f99b-7bzzg\" (UID: \"d11828c7-488d-414a-a024-68a46fca78e1\") " pod="openstack-operators/keystone-operator-controller-manager-ddb98f99b-7bzzg" Oct 09 14:03:53 crc kubenswrapper[4902]: I1009 14:03:53.386558 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-797d478b46-ld6xx"] Oct 09 14:03:53 crc kubenswrapper[4902]: I1009 14:03:53.387790 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-797d478b46-ld6xx" Oct 09 14:03:53 crc kubenswrapper[4902]: I1009 14:03:53.393343 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-5777b4f897-hxrth"] Oct 09 14:03:53 crc kubenswrapper[4902]: I1009 14:03:53.394826 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-6d9967f8dd-7bjrw" Oct 09 14:03:53 crc kubenswrapper[4902]: I1009 14:03:53.396436 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-5777b4f897-hxrth" Oct 09 14:03:53 crc kubenswrapper[4902]: I1009 14:03:53.398921 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-vh4k9" Oct 09 14:03:53 crc kubenswrapper[4902]: I1009 14:03:53.402449 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-fxw2l" Oct 09 14:03:53 crc kubenswrapper[4902]: I1009 14:03:53.409499 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-59578bc799-kvzf6"] Oct 09 14:03:53 crc kubenswrapper[4902]: I1009 14:03:53.422963 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-57bb74c7bf-s7bkj"] Oct 09 14:03:53 crc kubenswrapper[4902]: I1009 14:03:53.424132 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-57bb74c7bf-s7bkj" Oct 09 14:03:53 crc kubenswrapper[4902]: I1009 14:03:53.425940 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-7nt4t" Oct 09 14:03:53 crc kubenswrapper[4902]: I1009 14:03:53.429402 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-5777b4f897-hxrth"] Oct 09 14:03:53 crc kubenswrapper[4902]: I1009 14:03:53.484051 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-797d478b46-ld6xx"] Oct 09 14:03:53 crc kubenswrapper[4902]: I1009 14:03:53.484484 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-m6r69" Oct 09 14:03:53 crc kubenswrapper[4902]: I1009 14:03:53.489857 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-6d74794d9b-phbbt" Oct 09 14:03:53 crc kubenswrapper[4902]: I1009 14:03:53.495226 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nt9j8\" (UniqueName: \"kubernetes.io/projected/4ad4e07d-4f69-4f1f-9886-bde91ec3b735-kube-api-access-nt9j8\") pod \"manila-operator-controller-manager-59578bc799-kvzf6\" (UID: \"4ad4e07d-4f69-4f1f-9886-bde91ec3b735\") " pod="openstack-operators/manila-operator-controller-manager-59578bc799-kvzf6" Oct 09 14:03:53 crc kubenswrapper[4902]: I1009 14:03:53.495300 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m6l7h\" (UniqueName: \"kubernetes.io/projected/07f5d370-e69d-41f8-b65a-d25dc8b38de8-kube-api-access-m6l7h\") pod \"mariadb-operator-controller-manager-5777b4f897-hxrth\" (UID: \"07f5d370-e69d-41f8-b65a-d25dc8b38de8\") " pod="openstack-operators/mariadb-operator-controller-manager-5777b4f897-hxrth" Oct 09 14:03:53 crc kubenswrapper[4902]: I1009 14:03:53.495333 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5lt7\" (UniqueName: \"kubernetes.io/projected/5be11d13-4feb-4a12-9f9b-69a99d2fa5a4-kube-api-access-s5lt7\") pod \"nova-operator-controller-manager-57bb74c7bf-s7bkj\" (UID: \"5be11d13-4feb-4a12-9f9b-69a99d2fa5a4\") " pod="openstack-operators/nova-operator-controller-manager-57bb74c7bf-s7bkj" Oct 09 14:03:53 crc kubenswrapper[4902]: I1009 14:03:53.495381 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g6x9q\" (UniqueName: \"kubernetes.io/projected/d11828c7-488d-414a-a024-68a46fca78e1-kube-api-access-g6x9q\") pod \"keystone-operator-controller-manager-ddb98f99b-7bzzg\" (UID: \"d11828c7-488d-414a-a024-68a46fca78e1\") " pod="openstack-operators/keystone-operator-controller-manager-ddb98f99b-7bzzg" Oct 09 14:03:53 crc kubenswrapper[4902]: I1009 14:03:53.495566 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l4wfd\" (UniqueName: \"kubernetes.io/projected/0dafd5d3-f605-4c75-86ca-8d40831e9cb7-kube-api-access-l4wfd\") pod \"neutron-operator-controller-manager-797d478b46-ld6xx\" (UID: \"0dafd5d3-f605-4c75-86ca-8d40831e9cb7\") " pod="openstack-operators/neutron-operator-controller-manager-797d478b46-ld6xx" Oct 09 14:03:53 crc kubenswrapper[4902]: I1009 
14:03:53.495650 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7kdtn\" (UniqueName: \"kubernetes.io/projected/dad70f9e-3fb4-41ff-95f4-dc6be5277aa0-kube-api-access-7kdtn\") pod \"ironic-operator-controller-manager-74cb5cbc49-ql5w7\" (UID: \"dad70f9e-3fb4-41ff-95f4-dc6be5277aa0\") " pod="openstack-operators/ironic-operator-controller-manager-74cb5cbc49-ql5w7" Oct 09 14:03:53 crc kubenswrapper[4902]: I1009 14:03:53.503358 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-6d7c7ddf95-mqn8s"] Oct 09 14:03:53 crc kubenswrapper[4902]: I1009 14:03:53.511589 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-6d7c7ddf95-mqn8s" Oct 09 14:03:53 crc kubenswrapper[4902]: I1009 14:03:53.530853 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-b579g" Oct 09 14:03:53 crc kubenswrapper[4902]: I1009 14:03:53.534785 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7kdtn\" (UniqueName: \"kubernetes.io/projected/dad70f9e-3fb4-41ff-95f4-dc6be5277aa0-kube-api-access-7kdtn\") pod \"ironic-operator-controller-manager-74cb5cbc49-ql5w7\" (UID: \"dad70f9e-3fb4-41ff-95f4-dc6be5277aa0\") " pod="openstack-operators/ironic-operator-controller-manager-74cb5cbc49-ql5w7" Oct 09 14:03:53 crc kubenswrapper[4902]: I1009 14:03:53.540402 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g6x9q\" (UniqueName: \"kubernetes.io/projected/d11828c7-488d-414a-a024-68a46fca78e1-kube-api-access-g6x9q\") pod \"keystone-operator-controller-manager-ddb98f99b-7bzzg\" (UID: \"d11828c7-488d-414a-a024-68a46fca78e1\") " pod="openstack-operators/keystone-operator-controller-manager-ddb98f99b-7bzzg" Oct 09 14:03:53 crc kubenswrapper[4902]: I1009 14:03:53.571717 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-57bb74c7bf-s7bkj"] Oct 09 14:03:53 crc kubenswrapper[4902]: I1009 14:03:53.571774 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-6d7c7ddf95-mqn8s"] Oct 09 14:03:53 crc kubenswrapper[4902]: I1009 14:03:53.604236 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m6l7h\" (UniqueName: \"kubernetes.io/projected/07f5d370-e69d-41f8-b65a-d25dc8b38de8-kube-api-access-m6l7h\") pod \"mariadb-operator-controller-manager-5777b4f897-hxrth\" (UID: \"07f5d370-e69d-41f8-b65a-d25dc8b38de8\") " pod="openstack-operators/mariadb-operator-controller-manager-5777b4f897-hxrth" Oct 09 14:03:53 crc kubenswrapper[4902]: I1009 14:03:53.604627 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s5lt7\" (UniqueName: \"kubernetes.io/projected/5be11d13-4feb-4a12-9f9b-69a99d2fa5a4-kube-api-access-s5lt7\") pod \"nova-operator-controller-manager-57bb74c7bf-s7bkj\" (UID: \"5be11d13-4feb-4a12-9f9b-69a99d2fa5a4\") " pod="openstack-operators/nova-operator-controller-manager-57bb74c7bf-s7bkj" Oct 09 14:03:53 crc kubenswrapper[4902]: I1009 14:03:53.604697 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jgh8p\" (UniqueName: \"kubernetes.io/projected/cad34d91-d544-4311-a9b3-adb11e4217c0-kube-api-access-jgh8p\") pod 
\"octavia-operator-controller-manager-6d7c7ddf95-mqn8s\" (UID: \"cad34d91-d544-4311-a9b3-adb11e4217c0\") " pod="openstack-operators/octavia-operator-controller-manager-6d7c7ddf95-mqn8s" Oct 09 14:03:53 crc kubenswrapper[4902]: I1009 14:03:53.604739 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l4wfd\" (UniqueName: \"kubernetes.io/projected/0dafd5d3-f605-4c75-86ca-8d40831e9cb7-kube-api-access-l4wfd\") pod \"neutron-operator-controller-manager-797d478b46-ld6xx\" (UID: \"0dafd5d3-f605-4c75-86ca-8d40831e9cb7\") " pod="openstack-operators/neutron-operator-controller-manager-797d478b46-ld6xx" Oct 09 14:03:53 crc kubenswrapper[4902]: I1009 14:03:53.604811 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nt9j8\" (UniqueName: \"kubernetes.io/projected/4ad4e07d-4f69-4f1f-9886-bde91ec3b735-kube-api-access-nt9j8\") pod \"manila-operator-controller-manager-59578bc799-kvzf6\" (UID: \"4ad4e07d-4f69-4f1f-9886-bde91ec3b735\") " pod="openstack-operators/manila-operator-controller-manager-59578bc799-kvzf6" Oct 09 14:03:53 crc kubenswrapper[4902]: I1009 14:03:53.605779 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-lczxc" Oct 09 14:03:53 crc kubenswrapper[4902]: I1009 14:03:53.613644 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-74cb5cbc49-ql5w7" Oct 09 14:03:53 crc kubenswrapper[4902]: I1009 14:03:53.649536 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l4wfd\" (UniqueName: \"kubernetes.io/projected/0dafd5d3-f605-4c75-86ca-8d40831e9cb7-kube-api-access-l4wfd\") pod \"neutron-operator-controller-manager-797d478b46-ld6xx\" (UID: \"0dafd5d3-f605-4c75-86ca-8d40831e9cb7\") " pod="openstack-operators/neutron-operator-controller-manager-797d478b46-ld6xx" Oct 09 14:03:53 crc kubenswrapper[4902]: I1009 14:03:53.649545 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757db22ds"] Oct 09 14:03:53 crc kubenswrapper[4902]: I1009 14:03:53.652603 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nt9j8\" (UniqueName: \"kubernetes.io/projected/4ad4e07d-4f69-4f1f-9886-bde91ec3b735-kube-api-access-nt9j8\") pod \"manila-operator-controller-manager-59578bc799-kvzf6\" (UID: \"4ad4e07d-4f69-4f1f-9886-bde91ec3b735\") " pod="openstack-operators/manila-operator-controller-manager-59578bc799-kvzf6" Oct 09 14:03:53 crc kubenswrapper[4902]: I1009 14:03:53.654968 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m6l7h\" (UniqueName: \"kubernetes.io/projected/07f5d370-e69d-41f8-b65a-d25dc8b38de8-kube-api-access-m6l7h\") pod \"mariadb-operator-controller-manager-5777b4f897-hxrth\" (UID: \"07f5d370-e69d-41f8-b65a-d25dc8b38de8\") " pod="openstack-operators/mariadb-operator-controller-manager-5777b4f897-hxrth" Oct 09 14:03:53 crc kubenswrapper[4902]: I1009 14:03:53.656996 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5lt7\" (UniqueName: \"kubernetes.io/projected/5be11d13-4feb-4a12-9f9b-69a99d2fa5a4-kube-api-access-s5lt7\") pod \"nova-operator-controller-manager-57bb74c7bf-s7bkj\" (UID: \"5be11d13-4feb-4a12-9f9b-69a99d2fa5a4\") " pod="openstack-operators/nova-operator-controller-manager-57bb74c7bf-s7bkj" Oct 09 14:03:53 
crc kubenswrapper[4902]: I1009 14:03:53.681087 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-869cc7797f-dwgtb"] Oct 09 14:03:53 crc kubenswrapper[4902]: I1009 14:03:53.682249 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-664664cb68-s6zmh"] Oct 09 14:03:53 crc kubenswrapper[4902]: I1009 14:03:53.683172 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-664664cb68-s6zmh" Oct 09 14:03:53 crc kubenswrapper[4902]: I1009 14:03:53.683595 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757db22ds" Oct 09 14:03:53 crc kubenswrapper[4902]: I1009 14:03:53.683782 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-869cc7797f-dwgtb" Oct 09 14:03:53 crc kubenswrapper[4902]: I1009 14:03:53.686942 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-fcfnc" Oct 09 14:03:53 crc kubenswrapper[4902]: I1009 14:03:53.692193 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Oct 09 14:03:53 crc kubenswrapper[4902]: I1009 14:03:53.700714 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-5f4d5dfdc6-xlwpm"] Oct 09 14:03:53 crc kubenswrapper[4902]: I1009 14:03:53.702963 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-cwt55" Oct 09 14:03:53 crc kubenswrapper[4902]: I1009 14:03:53.703056 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-869cc7797f-dwgtb"] Oct 09 14:03:53 crc kubenswrapper[4902]: I1009 14:03:53.703206 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-5f4d5dfdc6-xlwpm" Oct 09 14:03:53 crc kubenswrapper[4902]: I1009 14:03:53.703333 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-6k9qj" Oct 09 14:03:53 crc kubenswrapper[4902]: I1009 14:03:53.706106 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h2jrj\" (UniqueName: \"kubernetes.io/projected/d5375300-657d-4e1d-92af-2107cbc7972f-kube-api-access-h2jrj\") pod \"openstack-baremetal-operator-controller-manager-6cc7fb757db22ds\" (UID: \"d5375300-657d-4e1d-92af-2107cbc7972f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757db22ds" Oct 09 14:03:53 crc kubenswrapper[4902]: I1009 14:03:53.706162 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gpljx\" (UniqueName: \"kubernetes.io/projected/d972447c-10cf-4d4b-870d-11e79f6bd98a-kube-api-access-gpljx\") pod \"swift-operator-controller-manager-5f4d5dfdc6-xlwpm\" (UID: \"d972447c-10cf-4d4b-870d-11e79f6bd98a\") " pod="openstack-operators/swift-operator-controller-manager-5f4d5dfdc6-xlwpm" Oct 09 14:03:53 crc kubenswrapper[4902]: I1009 14:03:53.706203 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fwkpl\" (UniqueName: \"kubernetes.io/projected/52ec0675-fab2-43fd-a447-8896de9e78fd-kube-api-access-fwkpl\") pod \"placement-operator-controller-manager-664664cb68-s6zmh\" (UID: \"52ec0675-fab2-43fd-a447-8896de9e78fd\") " pod="openstack-operators/placement-operator-controller-manager-664664cb68-s6zmh" Oct 09 14:03:53 crc kubenswrapper[4902]: I1009 14:03:53.706227 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-crsw9\" (UniqueName: \"kubernetes.io/projected/3adf1a7b-f2b7-4927-a026-55afe09bc5ab-kube-api-access-crsw9\") pod \"ovn-operator-controller-manager-869cc7797f-dwgtb\" (UID: \"3adf1a7b-f2b7-4927-a026-55afe09bc5ab\") " pod="openstack-operators/ovn-operator-controller-manager-869cc7797f-dwgtb" Oct 09 14:03:53 crc kubenswrapper[4902]: I1009 14:03:53.706261 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d5375300-657d-4e1d-92af-2107cbc7972f-cert\") pod \"openstack-baremetal-operator-controller-manager-6cc7fb757db22ds\" (UID: \"d5375300-657d-4e1d-92af-2107cbc7972f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757db22ds" Oct 09 14:03:53 crc kubenswrapper[4902]: I1009 14:03:53.706284 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jgh8p\" (UniqueName: \"kubernetes.io/projected/cad34d91-d544-4311-a9b3-adb11e4217c0-kube-api-access-jgh8p\") pod \"octavia-operator-controller-manager-6d7c7ddf95-mqn8s\" (UID: \"cad34d91-d544-4311-a9b3-adb11e4217c0\") " pod="openstack-operators/octavia-operator-controller-manager-6d7c7ddf95-mqn8s" Oct 09 14:03:53 crc kubenswrapper[4902]: I1009 14:03:53.708772 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-st9gw" Oct 09 14:03:53 crc kubenswrapper[4902]: I1009 14:03:53.720443 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757db22ds"] Oct 09 14:03:53 crc kubenswrapper[4902]: I1009 14:03:53.742219 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-664664cb68-s6zmh"] Oct 09 14:03:53 crc kubenswrapper[4902]: I1009 14:03:53.767137 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-5f4d5dfdc6-xlwpm"] Oct 09 14:03:53 crc kubenswrapper[4902]: I1009 14:03:53.788443 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jgh8p\" (UniqueName: \"kubernetes.io/projected/cad34d91-d544-4311-a9b3-adb11e4217c0-kube-api-access-jgh8p\") pod \"octavia-operator-controller-manager-6d7c7ddf95-mqn8s\" (UID: \"cad34d91-d544-4311-a9b3-adb11e4217c0\") " pod="openstack-operators/octavia-operator-controller-manager-6d7c7ddf95-mqn8s" Oct 09 14:03:53 crc kubenswrapper[4902]: I1009 14:03:53.788474 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-578874c84d-hn9gv"] Oct 09 14:03:53 crc kubenswrapper[4902]: I1009 14:03:53.789776 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-578874c84d-hn9gv" Oct 09 14:03:53 crc kubenswrapper[4902]: I1009 14:03:53.793298 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-6pflq" Oct 09 14:03:53 crc kubenswrapper[4902]: I1009 14:03:53.793570 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-vvqwp" Oct 09 14:03:53 crc kubenswrapper[4902]: I1009 14:03:53.794041 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-j5nfm" Oct 09 14:03:53 crc kubenswrapper[4902]: I1009 14:03:53.795511 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-ddb98f99b-7bzzg" Oct 09 14:03:53 crc kubenswrapper[4902]: I1009 14:03:53.798824 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-59578bc799-kvzf6" Oct 09 14:03:53 crc kubenswrapper[4902]: I1009 14:03:53.803383 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-vh4k9" Oct 09 14:03:53 crc kubenswrapper[4902]: I1009 14:03:53.809580 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gpljx\" (UniqueName: \"kubernetes.io/projected/d972447c-10cf-4d4b-870d-11e79f6bd98a-kube-api-access-gpljx\") pod \"swift-operator-controller-manager-5f4d5dfdc6-xlwpm\" (UID: \"d972447c-10cf-4d4b-870d-11e79f6bd98a\") " pod="openstack-operators/swift-operator-controller-manager-5f4d5dfdc6-xlwpm" Oct 09 14:03:53 crc kubenswrapper[4902]: I1009 14:03:53.809635 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/92649c6e-71ba-4945-9210-19394d180222-cert\") pod \"infra-operator-controller-manager-585fc5b659-6xw4k\" (UID: \"92649c6e-71ba-4945-9210-19394d180222\") " pod="openstack-operators/infra-operator-controller-manager-585fc5b659-6xw4k" Oct 09 14:03:53 crc kubenswrapper[4902]: I1009 14:03:53.809684 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fwkpl\" (UniqueName: \"kubernetes.io/projected/52ec0675-fab2-43fd-a447-8896de9e78fd-kube-api-access-fwkpl\") pod \"placement-operator-controller-manager-664664cb68-s6zmh\" (UID: \"52ec0675-fab2-43fd-a447-8896de9e78fd\") " pod="openstack-operators/placement-operator-controller-manager-664664cb68-s6zmh" Oct 09 14:03:53 crc kubenswrapper[4902]: I1009 14:03:53.809721 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-crsw9\" (UniqueName: \"kubernetes.io/projected/3adf1a7b-f2b7-4927-a026-55afe09bc5ab-kube-api-access-crsw9\") pod \"ovn-operator-controller-manager-869cc7797f-dwgtb\" (UID: \"3adf1a7b-f2b7-4927-a026-55afe09bc5ab\") " pod="openstack-operators/ovn-operator-controller-manager-869cc7797f-dwgtb" Oct 09 14:03:53 crc kubenswrapper[4902]: I1009 14:03:53.809788 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d5375300-657d-4e1d-92af-2107cbc7972f-cert\") pod \"openstack-baremetal-operator-controller-manager-6cc7fb757db22ds\" (UID: \"d5375300-657d-4e1d-92af-2107cbc7972f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757db22ds" Oct 09 14:03:53 crc kubenswrapper[4902]: I1009 14:03:53.809851 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h2jrj\" (UniqueName: \"kubernetes.io/projected/d5375300-657d-4e1d-92af-2107cbc7972f-kube-api-access-h2jrj\") pod \"openstack-baremetal-operator-controller-manager-6cc7fb757db22ds\" (UID: \"d5375300-657d-4e1d-92af-2107cbc7972f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757db22ds" Oct 09 14:03:53 crc kubenswrapper[4902]: I1009 14:03:53.815786 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-797d478b46-ld6xx" Oct 09 14:03:53 crc kubenswrapper[4902]: E1009 14:03:53.816148 4902 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Oct 09 14:03:53 crc kubenswrapper[4902]: E1009 14:03:53.816207 4902 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d5375300-657d-4e1d-92af-2107cbc7972f-cert podName:d5375300-657d-4e1d-92af-2107cbc7972f nodeName:}" failed. No retries permitted until 2025-10-09 14:03:54.316190444 +0000 UTC m=+781.514049508 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d5375300-657d-4e1d-92af-2107cbc7972f-cert") pod "openstack-baremetal-operator-controller-manager-6cc7fb757db22ds" (UID: "d5375300-657d-4e1d-92af-2107cbc7972f") : secret "openstack-baremetal-operator-webhook-server-cert" not found Oct 09 14:03:53 crc kubenswrapper[4902]: I1009 14:03:53.829091 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-fxw2l" Oct 09 14:03:53 crc kubenswrapper[4902]: I1009 14:03:53.832539 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-578874c84d-hn9gv"] Oct 09 14:03:53 crc kubenswrapper[4902]: I1009 14:03:53.836158 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-5777b4f897-hxrth" Oct 09 14:03:53 crc kubenswrapper[4902]: I1009 14:03:53.837401 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/92649c6e-71ba-4945-9210-19394d180222-cert\") pod \"infra-operator-controller-manager-585fc5b659-6xw4k\" (UID: \"92649c6e-71ba-4945-9210-19394d180222\") " pod="openstack-operators/infra-operator-controller-manager-585fc5b659-6xw4k" Oct 09 14:03:53 crc kubenswrapper[4902]: I1009 14:03:53.850863 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gpljx\" (UniqueName: \"kubernetes.io/projected/d972447c-10cf-4d4b-870d-11e79f6bd98a-kube-api-access-gpljx\") pod \"swift-operator-controller-manager-5f4d5dfdc6-xlwpm\" (UID: \"d972447c-10cf-4d4b-870d-11e79f6bd98a\") " pod="openstack-operators/swift-operator-controller-manager-5f4d5dfdc6-xlwpm" Oct 09 14:03:53 crc kubenswrapper[4902]: I1009 14:03:53.859276 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fwkpl\" (UniqueName: \"kubernetes.io/projected/52ec0675-fab2-43fd-a447-8896de9e78fd-kube-api-access-fwkpl\") pod \"placement-operator-controller-manager-664664cb68-s6zmh\" (UID: \"52ec0675-fab2-43fd-a447-8896de9e78fd\") " pod="openstack-operators/placement-operator-controller-manager-664664cb68-s6zmh" Oct 09 14:03:53 crc kubenswrapper[4902]: I1009 14:03:53.861018 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-crsw9\" (UniqueName: \"kubernetes.io/projected/3adf1a7b-f2b7-4927-a026-55afe09bc5ab-kube-api-access-crsw9\") pod \"ovn-operator-controller-manager-869cc7797f-dwgtb\" (UID: \"3adf1a7b-f2b7-4927-a026-55afe09bc5ab\") " pod="openstack-operators/ovn-operator-controller-manager-869cc7797f-dwgtb" Oct 09 14:03:53 crc kubenswrapper[4902]: I1009 14:03:53.861727 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-h2jrj\" (UniqueName: \"kubernetes.io/projected/d5375300-657d-4e1d-92af-2107cbc7972f-kube-api-access-h2jrj\") pod \"openstack-baremetal-operator-controller-manager-6cc7fb757db22ds\" (UID: \"d5375300-657d-4e1d-92af-2107cbc7972f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757db22ds" Oct 09 14:03:53 crc kubenswrapper[4902]: I1009 14:03:53.867022 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-664664cb68-s6zmh" Oct 09 14:03:53 crc kubenswrapper[4902]: I1009 14:03:53.891555 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-ffcdd6c94-7lct5"] Oct 09 14:03:53 crc kubenswrapper[4902]: I1009 14:03:53.893141 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-ffcdd6c94-7lct5"] Oct 09 14:03:53 crc kubenswrapper[4902]: I1009 14:03:53.893236 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-ffcdd6c94-7lct5" Oct 09 14:03:53 crc kubenswrapper[4902]: I1009 14:03:53.894541 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-57bb74c7bf-s7bkj" Oct 09 14:03:53 crc kubenswrapper[4902]: I1009 14:03:53.904905 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-x8jzd" Oct 09 14:03:53 crc kubenswrapper[4902]: I1009 14:03:53.911331 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-94nmk\" (UniqueName: \"kubernetes.io/projected/55efad12-5eb1-4c57-bb2f-700ead538209-kube-api-access-94nmk\") pod \"telemetry-operator-controller-manager-578874c84d-hn9gv\" (UID: \"55efad12-5eb1-4c57-bb2f-700ead538209\") " pod="openstack-operators/telemetry-operator-controller-manager-578874c84d-hn9gv" Oct 09 14:03:53 crc kubenswrapper[4902]: I1009 14:03:53.912968 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-6d7c7ddf95-mqn8s" Oct 09 14:03:53 crc kubenswrapper[4902]: I1009 14:03:53.921955 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-646675d848-9lx7z"] Oct 09 14:03:53 crc kubenswrapper[4902]: I1009 14:03:53.933210 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-646675d848-9lx7z" Oct 09 14:03:53 crc kubenswrapper[4902]: I1009 14:03:53.936852 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-gcnvz" Oct 09 14:03:53 crc kubenswrapper[4902]: I1009 14:03:53.990352 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-646675d848-9lx7z"] Oct 09 14:03:53 crc kubenswrapper[4902]: I1009 14:03:53.998073 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-d5f574b49-xxs9l"] Oct 09 14:03:54 crc kubenswrapper[4902]: I1009 14:03:54.003125 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-869cc7797f-dwgtb" Oct 09 14:03:54 crc kubenswrapper[4902]: I1009 14:03:54.013205 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6grlj\" (UniqueName: \"kubernetes.io/projected/183bbfe9-141b-4b7a-adc1-3ea01011ebd7-kube-api-access-6grlj\") pod \"test-operator-controller-manager-ffcdd6c94-7lct5\" (UID: \"183bbfe9-141b-4b7a-adc1-3ea01011ebd7\") " pod="openstack-operators/test-operator-controller-manager-ffcdd6c94-7lct5" Oct 09 14:03:54 crc kubenswrapper[4902]: I1009 14:03:54.013279 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-94nmk\" (UniqueName: \"kubernetes.io/projected/55efad12-5eb1-4c57-bb2f-700ead538209-kube-api-access-94nmk\") pod \"telemetry-operator-controller-manager-578874c84d-hn9gv\" (UID: \"55efad12-5eb1-4c57-bb2f-700ead538209\") " pod="openstack-operators/telemetry-operator-controller-manager-578874c84d-hn9gv" Oct 09 14:03:54 crc kubenswrapper[4902]: I1009 14:03:54.019707 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-d5f574b49-xxs9l" Oct 09 14:03:54 crc kubenswrapper[4902]: I1009 14:03:54.022678 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Oct 09 14:03:54 crc kubenswrapper[4902]: I1009 14:03:54.024837 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-5hjq2" Oct 09 14:03:54 crc kubenswrapper[4902]: I1009 14:03:54.053780 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-d5f574b49-xxs9l"] Oct 09 14:03:54 crc kubenswrapper[4902]: I1009 14:03:54.056083 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-94nmk\" (UniqueName: \"kubernetes.io/projected/55efad12-5eb1-4c57-bb2f-700ead538209-kube-api-access-94nmk\") pod \"telemetry-operator-controller-manager-578874c84d-hn9gv\" (UID: \"55efad12-5eb1-4c57-bb2f-700ead538209\") " pod="openstack-operators/telemetry-operator-controller-manager-578874c84d-hn9gv" Oct 09 14:03:54 crc kubenswrapper[4902]: I1009 14:03:54.076505 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-8jw5l"] Oct 09 14:03:54 crc kubenswrapper[4902]: I1009 14:03:54.078267 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-8jw5l" Oct 09 14:03:54 crc kubenswrapper[4902]: I1009 14:03:54.081809 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-lpmq5" Oct 09 14:03:54 crc kubenswrapper[4902]: I1009 14:03:54.089711 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-89cr8" Oct 09 14:03:54 crc kubenswrapper[4902]: I1009 14:03:54.090532 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-8jw5l"] Oct 09 14:03:54 crc kubenswrapper[4902]: I1009 14:03:54.097501 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-585fc5b659-6xw4k" Oct 09 14:03:54 crc kubenswrapper[4902]: I1009 14:03:54.114604 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r8qj8\" (UniqueName: \"kubernetes.io/projected/be1ef3d2-4c04-4040-9d73-80655f4b9dbb-kube-api-access-r8qj8\") pod \"watcher-operator-controller-manager-646675d848-9lx7z\" (UID: \"be1ef3d2-4c04-4040-9d73-80655f4b9dbb\") " pod="openstack-operators/watcher-operator-controller-manager-646675d848-9lx7z" Oct 09 14:03:54 crc kubenswrapper[4902]: I1009 14:03:54.114672 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6grlj\" (UniqueName: \"kubernetes.io/projected/183bbfe9-141b-4b7a-adc1-3ea01011ebd7-kube-api-access-6grlj\") pod \"test-operator-controller-manager-ffcdd6c94-7lct5\" (UID: \"183bbfe9-141b-4b7a-adc1-3ea01011ebd7\") " pod="openstack-operators/test-operator-controller-manager-ffcdd6c94-7lct5" Oct 09 14:03:54 crc kubenswrapper[4902]: I1009 14:03:54.127627 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-5f4d5dfdc6-xlwpm" Oct 09 14:03:54 crc kubenswrapper[4902]: I1009 14:03:54.129655 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-5vc5b" Oct 09 14:03:54 crc kubenswrapper[4902]: I1009 14:03:54.135543 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6grlj\" (UniqueName: \"kubernetes.io/projected/183bbfe9-141b-4b7a-adc1-3ea01011ebd7-kube-api-access-6grlj\") pod \"test-operator-controller-manager-ffcdd6c94-7lct5\" (UID: \"183bbfe9-141b-4b7a-adc1-3ea01011ebd7\") " pod="openstack-operators/test-operator-controller-manager-ffcdd6c94-7lct5" Oct 09 14:03:54 crc kubenswrapper[4902]: I1009 14:03:54.141737 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-64f84fcdbb-r4nn8" Oct 09 14:03:54 crc kubenswrapper[4902]: I1009 14:03:54.147135 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-578874c84d-hn9gv" Oct 09 14:03:54 crc kubenswrapper[4902]: I1009 14:03:54.184991 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-ffcdd6c94-7lct5" Oct 09 14:03:54 crc kubenswrapper[4902]: I1009 14:03:54.222377 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hd5p2\" (UniqueName: \"kubernetes.io/projected/93ae4a6d-1e42-4e45-8128-61088861873e-kube-api-access-hd5p2\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-8jw5l\" (UID: \"93ae4a6d-1e42-4e45-8128-61088861873e\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-8jw5l" Oct 09 14:03:54 crc kubenswrapper[4902]: I1009 14:03:54.222687 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9h6vc\" (UniqueName: \"kubernetes.io/projected/cf6f1e72-9e96-4905-a7f4-d88ec796724e-kube-api-access-9h6vc\") pod \"openstack-operator-controller-manager-d5f574b49-xxs9l\" (UID: \"cf6f1e72-9e96-4905-a7f4-d88ec796724e\") " pod="openstack-operators/openstack-operator-controller-manager-d5f574b49-xxs9l" Oct 09 14:03:54 crc kubenswrapper[4902]: I1009 14:03:54.222803 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cf6f1e72-9e96-4905-a7f4-d88ec796724e-cert\") pod \"openstack-operator-controller-manager-d5f574b49-xxs9l\" (UID: \"cf6f1e72-9e96-4905-a7f4-d88ec796724e\") " pod="openstack-operators/openstack-operator-controller-manager-d5f574b49-xxs9l" Oct 09 14:03:54 crc kubenswrapper[4902]: I1009 14:03:54.222925 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r8qj8\" (UniqueName: \"kubernetes.io/projected/be1ef3d2-4c04-4040-9d73-80655f4b9dbb-kube-api-access-r8qj8\") pod \"watcher-operator-controller-manager-646675d848-9lx7z\" (UID: \"be1ef3d2-4c04-4040-9d73-80655f4b9dbb\") " pod="openstack-operators/watcher-operator-controller-manager-646675d848-9lx7z" Oct 09 14:03:54 crc kubenswrapper[4902]: I1009 14:03:54.255387 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r8qj8\" (UniqueName: \"kubernetes.io/projected/be1ef3d2-4c04-4040-9d73-80655f4b9dbb-kube-api-access-r8qj8\") pod \"watcher-operator-controller-manager-646675d848-9lx7z\" (UID: \"be1ef3d2-4c04-4040-9d73-80655f4b9dbb\") " pod="openstack-operators/watcher-operator-controller-manager-646675d848-9lx7z" Oct 09 14:03:54 crc kubenswrapper[4902]: I1009 14:03:54.293690 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-59cdc64769-hrgs4"] Oct 09 14:03:54 crc kubenswrapper[4902]: I1009 14:03:54.302431 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-687df44cdb-pxsfz"] Oct 09 14:03:54 crc kubenswrapper[4902]: I1009 14:03:54.323878 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hd5p2\" (UniqueName: \"kubernetes.io/projected/93ae4a6d-1e42-4e45-8128-61088861873e-kube-api-access-hd5p2\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-8jw5l\" (UID: \"93ae4a6d-1e42-4e45-8128-61088861873e\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-8jw5l" Oct 09 14:03:54 crc kubenswrapper[4902]: I1009 14:03:54.323941 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9h6vc\" (UniqueName: \"kubernetes.io/projected/cf6f1e72-9e96-4905-a7f4-d88ec796724e-kube-api-access-9h6vc\") pod 
\"openstack-operator-controller-manager-d5f574b49-xxs9l\" (UID: \"cf6f1e72-9e96-4905-a7f4-d88ec796724e\") " pod="openstack-operators/openstack-operator-controller-manager-d5f574b49-xxs9l" Oct 09 14:03:54 crc kubenswrapper[4902]: I1009 14:03:54.323989 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cf6f1e72-9e96-4905-a7f4-d88ec796724e-cert\") pod \"openstack-operator-controller-manager-d5f574b49-xxs9l\" (UID: \"cf6f1e72-9e96-4905-a7f4-d88ec796724e\") " pod="openstack-operators/openstack-operator-controller-manager-d5f574b49-xxs9l" Oct 09 14:03:54 crc kubenswrapper[4902]: I1009 14:03:54.324069 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d5375300-657d-4e1d-92af-2107cbc7972f-cert\") pod \"openstack-baremetal-operator-controller-manager-6cc7fb757db22ds\" (UID: \"d5375300-657d-4e1d-92af-2107cbc7972f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757db22ds" Oct 09 14:03:54 crc kubenswrapper[4902]: E1009 14:03:54.324877 4902 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Oct 09 14:03:54 crc kubenswrapper[4902]: E1009 14:03:54.324933 4902 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cf6f1e72-9e96-4905-a7f4-d88ec796724e-cert podName:cf6f1e72-9e96-4905-a7f4-d88ec796724e nodeName:}" failed. No retries permitted until 2025-10-09 14:03:54.824915996 +0000 UTC m=+782.022775060 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/cf6f1e72-9e96-4905-a7f4-d88ec796724e-cert") pod "openstack-operator-controller-manager-d5f574b49-xxs9l" (UID: "cf6f1e72-9e96-4905-a7f4-d88ec796724e") : secret "webhook-server-cert" not found Oct 09 14:03:54 crc kubenswrapper[4902]: I1009 14:03:54.354514 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9h6vc\" (UniqueName: \"kubernetes.io/projected/cf6f1e72-9e96-4905-a7f4-d88ec796724e-kube-api-access-9h6vc\") pod \"openstack-operator-controller-manager-d5f574b49-xxs9l\" (UID: \"cf6f1e72-9e96-4905-a7f4-d88ec796724e\") " pod="openstack-operators/openstack-operator-controller-manager-d5f574b49-xxs9l" Oct 09 14:03:54 crc kubenswrapper[4902]: I1009 14:03:54.359278 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d5375300-657d-4e1d-92af-2107cbc7972f-cert\") pod \"openstack-baremetal-operator-controller-manager-6cc7fb757db22ds\" (UID: \"d5375300-657d-4e1d-92af-2107cbc7972f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757db22ds" Oct 09 14:03:54 crc kubenswrapper[4902]: I1009 14:03:54.366876 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hd5p2\" (UniqueName: \"kubernetes.io/projected/93ae4a6d-1e42-4e45-8128-61088861873e-kube-api-access-hd5p2\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-8jw5l\" (UID: \"93ae4a6d-1e42-4e45-8128-61088861873e\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-8jw5l" Oct 09 14:03:54 crc kubenswrapper[4902]: I1009 14:03:54.455397 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-646675d848-9lx7z" Oct 09 14:03:54 crc kubenswrapper[4902]: I1009 14:03:54.520100 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-8jw5l" Oct 09 14:03:54 crc kubenswrapper[4902]: I1009 14:03:54.576136 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757db22ds" Oct 09 14:03:54 crc kubenswrapper[4902]: I1009 14:03:54.640209 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-59cdc64769-hrgs4" event={"ID":"a1fcc021-b92b-417d-b92c-4e66386e8502","Type":"ContainerStarted","Data":"4b95f84b8b901adba1f294007f1d88c12527a1d6c221c5d783c255efabbd6d14"} Oct 09 14:03:54 crc kubenswrapper[4902]: I1009 14:03:54.651561 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-687df44cdb-pxsfz" event={"ID":"c27e1a63-1155-43eb-9c97-61680f083de0","Type":"ContainerStarted","Data":"ce178f4597c7174681f008482225fac73059d673ea5fc0f8e79a58f3cc04b36f"} Oct 09 14:03:54 crc kubenswrapper[4902]: I1009 14:03:54.711523 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-7bb46cd7d-2nkqk"] Oct 09 14:03:54 crc kubenswrapper[4902]: I1009 14:03:54.727961 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-6d74794d9b-phbbt"] Oct 09 14:03:54 crc kubenswrapper[4902]: I1009 14:03:54.736042 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-6d9967f8dd-7bjrw"] Oct 09 14:03:54 crc kubenswrapper[4902]: W1009 14:03:54.780645 4902 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod960aab4c_ce86_4753_b848_3367f15d962c.slice/crio-b769500f31372dfc3ee4c3fe49d421c02e3eac8ab1236bc4c75e9ab11d14c2ca WatchSource:0}: Error finding container b769500f31372dfc3ee4c3fe49d421c02e3eac8ab1236bc4c75e9ab11d14c2ca: Status 404 returned error can't find the container with id b769500f31372dfc3ee4c3fe49d421c02e3eac8ab1236bc4c75e9ab11d14c2ca Oct 09 14:03:54 crc kubenswrapper[4902]: W1009 14:03:54.788181 4902 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6169ab22_9b0b_4bb3_b840_b3eb92d22c0c.slice/crio-a68d5808962a6fd45004394cc26a04f0529ae1158d4a508e0245942babd01bdc WatchSource:0}: Error finding container a68d5808962a6fd45004394cc26a04f0529ae1158d4a508e0245942babd01bdc: Status 404 returned error can't find the container with id a68d5808962a6fd45004394cc26a04f0529ae1158d4a508e0245942babd01bdc Oct 09 14:03:54 crc kubenswrapper[4902]: I1009 14:03:54.841840 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cf6f1e72-9e96-4905-a7f4-d88ec796724e-cert\") pod \"openstack-operator-controller-manager-d5f574b49-xxs9l\" (UID: \"cf6f1e72-9e96-4905-a7f4-d88ec796724e\") " pod="openstack-operators/openstack-operator-controller-manager-d5f574b49-xxs9l" Oct 09 14:03:54 crc kubenswrapper[4902]: I1009 14:03:54.849530 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/cf6f1e72-9e96-4905-a7f4-d88ec796724e-cert\") pod \"openstack-operator-controller-manager-d5f574b49-xxs9l\" (UID: \"cf6f1e72-9e96-4905-a7f4-d88ec796724e\") " pod="openstack-operators/openstack-operator-controller-manager-d5f574b49-xxs9l" Oct 09 14:03:55 crc kubenswrapper[4902]: I1009 14:03:55.059302 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-74cb5cbc49-ql5w7"] Oct 09 14:03:55 crc kubenswrapper[4902]: W1009 14:03:55.069639 4902 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddad70f9e_3fb4_41ff_95f4_dc6be5277aa0.slice/crio-9aaa5ed304b56d0f4e3a8a18fb2e4b49f9f0ac408dde25d8c0b5d9cd56e4984a WatchSource:0}: Error finding container 9aaa5ed304b56d0f4e3a8a18fb2e4b49f9f0ac408dde25d8c0b5d9cd56e4984a: Status 404 returned error can't find the container with id 9aaa5ed304b56d0f4e3a8a18fb2e4b49f9f0ac408dde25d8c0b5d9cd56e4984a Oct 09 14:03:55 crc kubenswrapper[4902]: I1009 14:03:55.115116 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-d5f574b49-xxs9l" Oct 09 14:03:55 crc kubenswrapper[4902]: I1009 14:03:55.388484 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-ddb98f99b-7bzzg"] Oct 09 14:03:55 crc kubenswrapper[4902]: I1009 14:03:55.427404 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-ffcdd6c94-7lct5"] Oct 09 14:03:55 crc kubenswrapper[4902]: W1009 14:03:55.450998 4902 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcad34d91_d544_4311_a9b3_adb11e4217c0.slice/crio-6a1f01065b75311f06be9457af65059afe16c7a74ed24de63067fa5d4c891bd0 WatchSource:0}: Error finding container 6a1f01065b75311f06be9457af65059afe16c7a74ed24de63067fa5d4c891bd0: Status 404 returned error can't find the container with id 6a1f01065b75311f06be9457af65059afe16c7a74ed24de63067fa5d4c891bd0 Oct 09 14:03:55 crc kubenswrapper[4902]: I1009 14:03:55.490058 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-6d7c7ddf95-mqn8s"] Oct 09 14:03:55 crc kubenswrapper[4902]: I1009 14:03:55.539091 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-5777b4f897-hxrth"] Oct 09 14:03:55 crc kubenswrapper[4902]: I1009 14:03:55.748128 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-5777b4f897-hxrth" event={"ID":"07f5d370-e69d-41f8-b65a-d25dc8b38de8","Type":"ContainerStarted","Data":"a9c4f3fd4edd784d7c8ee5c5e08e027569d128eb00218e3cd61e0bf940fb3134"} Oct 09 14:03:55 crc kubenswrapper[4902]: I1009 14:03:55.749103 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-664664cb68-s6zmh"] Oct 09 14:03:55 crc kubenswrapper[4902]: I1009 14:03:55.749140 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-664664cb68-s6zmh" event={"ID":"52ec0675-fab2-43fd-a447-8896de9e78fd","Type":"ContainerStarted","Data":"a6ab66a7d8ca0c1d139af8f8f2f218a6f90e8f6304519904d8748b3c9b0b61e6"} Oct 09 14:03:55 crc kubenswrapper[4902]: I1009 14:03:55.749155 4902 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack-operators/barbican-operator-controller-manager-64f84fcdbb-r4nn8"] Oct 09 14:03:55 crc kubenswrapper[4902]: I1009 14:03:55.749166 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-5f4d5dfdc6-xlwpm"] Oct 09 14:03:55 crc kubenswrapper[4902]: I1009 14:03:55.749179 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-57bb74c7bf-s7bkj"] Oct 09 14:03:55 crc kubenswrapper[4902]: I1009 14:03:55.749188 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-59578bc799-kvzf6"] Oct 09 14:03:55 crc kubenswrapper[4902]: I1009 14:03:55.749357 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-869cc7797f-dwgtb"] Oct 09 14:03:55 crc kubenswrapper[4902]: I1009 14:03:55.749372 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-797d478b46-ld6xx"] Oct 09 14:03:55 crc kubenswrapper[4902]: I1009 14:03:55.749382 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5f4d5dfdc6-xlwpm" event={"ID":"d972447c-10cf-4d4b-870d-11e79f6bd98a","Type":"ContainerStarted","Data":"9acec0bf6534770b5e77b5845b4978600f4212e213e22401124d9082c606d530"} Oct 09 14:03:55 crc kubenswrapper[4902]: I1009 14:03:55.749461 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-797d478b46-ld6xx" event={"ID":"0dafd5d3-f605-4c75-86ca-8d40831e9cb7","Type":"ContainerStarted","Data":"2d76aa39b2b4f0f13fd44747addea3dd6ba1383e50321ed0be843f0c96b280fe"} Oct 09 14:03:55 crc kubenswrapper[4902]: I1009 14:03:55.749516 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-ffcdd6c94-7lct5" event={"ID":"183bbfe9-141b-4b7a-adc1-3ea01011ebd7","Type":"ContainerStarted","Data":"133772fae594c7d098a47048a40cf003a798d6b8c3bcb725d121d97ac6ddd3af"} Oct 09 14:03:55 crc kubenswrapper[4902]: W1009 14:03:55.748322 4902 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod93ae4a6d_1e42_4e45_8128_61088861873e.slice/crio-3c3c3298f579799b184399ed83b7a8d3491d49a6641e2c9503dac9c2ada003db WatchSource:0}: Error finding container 3c3c3298f579799b184399ed83b7a8d3491d49a6641e2c9503dac9c2ada003db: Status 404 returned error can't find the container with id 3c3c3298f579799b184399ed83b7a8d3491d49a6641e2c9503dac9c2ada003db Oct 09 14:03:55 crc kubenswrapper[4902]: E1009 14:03:55.749034 4902 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:315e558023b41ac1aa215082096995a03810c5b42910a33b00427ffcac9c6a14,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-crsw9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-869cc7797f-dwgtb_openstack-operators(3adf1a7b-f2b7-4927-a026-55afe09bc5ab): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 09 14:03:55 crc kubenswrapper[4902]: I1009 14:03:55.751719 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-ddb98f99b-7bzzg" event={"ID":"d11828c7-488d-414a-a024-68a46fca78e1","Type":"ContainerStarted","Data":"d477de2698ad0abdf9ef17a75ae35c5eedd2f570a7387c68f572869299f7c4d4"} Oct 09 14:03:55 crc kubenswrapper[4902]: I1009 14:03:55.751744 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-59578bc799-kvzf6" event={"ID":"4ad4e07d-4f69-4f1f-9886-bde91ec3b735","Type":"ContainerStarted","Data":"99933c4c4e1dbdd1dc81d614fdf36c23f411aa9b626bd7ff1d10f4e56af97b9a"} Oct 09 14:03:55 crc kubenswrapper[4902]: I1009 14:03:55.751757 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-6d74794d9b-phbbt" event={"ID":"39e518d9-bffd-4421-bc8b-2b333654ff9e","Type":"ContainerStarted","Data":"6b829c14e98a4791239706554c1fff0b6097252fcbfc633a20d7440ca1d31707"} Oct 09 14:03:55 crc kubenswrapper[4902]: I1009 14:03:55.751956 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-74cb5cbc49-ql5w7" event={"ID":"dad70f9e-3fb4-41ff-95f4-dc6be5277aa0","Type":"ContainerStarted","Data":"9aaa5ed304b56d0f4e3a8a18fb2e4b49f9f0ac408dde25d8c0b5d9cd56e4984a"} Oct 09 14:03:55 crc kubenswrapper[4902]: I1009 14:03:55.751974 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-6d9967f8dd-7bjrw" event={"ID":"960aab4c-ce86-4753-b848-3367f15d962c","Type":"ContainerStarted","Data":"b769500f31372dfc3ee4c3fe49d421c02e3eac8ab1236bc4c75e9ab11d14c2ca"} Oct 09 14:03:55 crc kubenswrapper[4902]: I1009 14:03:55.751985 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/octavia-operator-controller-manager-6d7c7ddf95-mqn8s" event={"ID":"cad34d91-d544-4311-a9b3-adb11e4217c0","Type":"ContainerStarted","Data":"6a1f01065b75311f06be9457af65059afe16c7a74ed24de63067fa5d4c891bd0"} Oct 09 14:03:55 crc kubenswrapper[4902]: I1009 14:03:55.753762 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-578874c84d-hn9gv"] Oct 09 14:03:55 crc kubenswrapper[4902]: I1009 14:03:55.754769 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-7bb46cd7d-2nkqk" event={"ID":"6169ab22-9b0b-4bb3-b840-b3eb92d22c0c","Type":"ContainerStarted","Data":"a68d5808962a6fd45004394cc26a04f0529ae1158d4a508e0245942babd01bdc"} Oct 09 14:03:55 crc kubenswrapper[4902]: E1009 14:03:55.756265 4902 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-hd5p2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-5f97d8c699-8jw5l_openstack-operators(93ae4a6d-1e42-4e45-8128-61088861873e): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 09 14:03:55 crc kubenswrapper[4902]: E1009 14:03:55.757479 4902 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-8jw5l" podUID="93ae4a6d-1e42-4e45-8128-61088861873e" Oct 09 14:03:55 crc kubenswrapper[4902]: W1009 14:03:55.761192 4902 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5be11d13_4feb_4a12_9f9b_69a99d2fa5a4.slice/crio-f4df3333cb5fe41c0b1c32ace8da00d86ae90e94bb78a0afa2607219e9585884 
WatchSource:0}: Error finding container f4df3333cb5fe41c0b1c32ace8da00d86ae90e94bb78a0afa2607219e9585884: Status 404 returned error can't find the container with id f4df3333cb5fe41c0b1c32ace8da00d86ae90e94bb78a0afa2607219e9585884 Oct 09 14:03:55 crc kubenswrapper[4902]: I1009 14:03:55.765508 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-585fc5b659-6xw4k"] Oct 09 14:03:55 crc kubenswrapper[4902]: W1009 14:03:55.770967 4902 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd5375300_657d_4e1d_92af_2107cbc7972f.slice/crio-fd84aa42bac783be87a016e6f3fc270ad5cc816738782d65407f75190891d82a WatchSource:0}: Error finding container fd84aa42bac783be87a016e6f3fc270ad5cc816738782d65407f75190891d82a: Status 404 returned error can't find the container with id fd84aa42bac783be87a016e6f3fc270ad5cc816738782d65407f75190891d82a Oct 09 14:03:55 crc kubenswrapper[4902]: I1009 14:03:55.779473 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-8jw5l"] Oct 09 14:03:55 crc kubenswrapper[4902]: I1009 14:03:55.799096 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-646675d848-9lx7z"] Oct 09 14:03:55 crc kubenswrapper[4902]: E1009 14:03:55.800550 4902 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:b2e9acf568a48c28cf2aed6012e432eeeb7d5f0eb11878fc91b62bc34cba10cd,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-s5lt7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-57bb74c7bf-s7bkj_openstack-operators(5be11d13-4feb-4a12-9f9b-69a99d2fa5a4): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 09 14:03:55 crc kubenswrapper[4902]: E1009 14:03:55.801230 4902 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:98a5233f0596591acdf2c6a5838b08be108787cdb6ad1995b2b7886bac0fe6ca,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-r8qj8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-646675d848-9lx7z_openstack-operators(be1ef3d2-4c04-4040-9d73-80655f4b9dbb): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 09 14:03:55 crc kubenswrapper[4902]: E1009 14:03:55.802042 4902 
kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:a17fc270857869fd1efe5020b2a1cb8c2abbd838f08de88f3a6a59e8754ec351,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:true,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-baremetal-operator-agent:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_ANSIBLEEE_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-ansibleee-runner:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_EVALUATOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-evaluator:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_LISTENER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-listener:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_NOTIFIER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-notifier:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_APACHE_IMAGE_URL_DEFAULT,Value:registry.redhat.io/ubi9/httpd-24:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_KEYSTONE_LISTENER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-keystone-listener:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-worker:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_CENTRAL_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_COMPUTE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_IPMI_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-ipmi:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_MYSQLD_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/prometheus/mysqld-exporter:v0.15.1,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_NOTIFICATION_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-notification:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_SGCORE_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/sg-core:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_BACKUP_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-backup:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-scheduler:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_VOLUME_IMAGE_URL_DEFAULT,
Value:quay.io/podified-antelope-centos9/openstack-cinder-volume:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_BACKENDBIND9_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-backend-bind9:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_CENTRAL_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-central:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_MDNS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-mdns:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_PRODUCER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-producer:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_UNBOUND_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-unbound:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-worker:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_FRR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-frr:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_ISCSID_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-iscsid:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_KEPLER_IMAGE_URL_DEFAULT,Value:quay.io/sustainable_computing_io/kepler:release-0.7.12,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_LOGROTATE_CROND_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cron:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_MULTIPATHD_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-multipathd:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_DHCP_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_METADATA_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_OVN_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-ovn-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_SRIOV_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NODE_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/prometheus/node-exporter:v1.5.0,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_OVN_BGP_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-bgp-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_PODMAN_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/navidys/prometheus-podman-exporter:v1.10.1,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_GLANCE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-glance-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_CFNAPI_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-api-cfn:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_ENGINE_IMAGE
_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-engine:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HORIZON_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_INFRA_MEMCACHED_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-memcached:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_INFRA_REDIS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-redis:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_CONDUCTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-conductor:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_INSPECTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-inspector:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_NEUTRON_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-neutron-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_PXE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-pxe:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_PYTHON_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/ironic-python-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_KEYSTONE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-keystone:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_KSM_IMAGE_URL_DEFAULT,Value:registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-scheduler:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_SHARE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-share:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MARIADB_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-mariadb:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NET_UTILS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-netutils:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NEUTRON_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_COMPUTE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_CONDUCTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-conductor:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_NOVNC_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-novncproxy:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-scheduler:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-o
ctavia-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_HEALTHMANAGER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-health-manager:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_HOUSEKEEPING_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-housekeeping:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_RSYSLOG_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-rsyslog:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-worker:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_CLIENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-openstackclient:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_MUST_GATHER_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-must-gather:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_NETWORK_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OS_CONTAINER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/edpm-hardened-uefi:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_CONTROLLER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_CONTROLLER_OVS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-base:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_NB_DBCLUSTER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-nb-db-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_NORTHD_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-northd:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_SB_DBCLUSTER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-sb-db-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_PLACEMENT_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-placement-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_RABBITMQ_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_ACCOUNT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-account:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_CONTAINER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-container:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_OBJECT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-object:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_PROXY_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-proxy-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_TEST_TEMPEST_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_APPLIER_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-applier:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_DECISION_ENGINE_IMAGE_URL_DEFAULT,Value:quay.
io/podified-master-centos9/openstack-watcher-decision-engine:current-podified,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cert,ReadOnly:true,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-h2jrj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstack-baremetal-operator-controller-manager-6cc7fb757db22ds_openstack-operators(d5375300-657d-4e1d-92af-2107cbc7972f): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 09 14:03:55 crc kubenswrapper[4902]: I1009 14:03:55.809037 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757db22ds"] Oct 09 14:03:55 crc kubenswrapper[4902]: I1009 14:03:55.814610 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-d5f574b49-xxs9l"] Oct 09 14:03:56 crc kubenswrapper[4902]: E1009 14:03:56.023034 4902 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/ovn-operator-controller-manager-869cc7797f-dwgtb" podUID="3adf1a7b-f2b7-4927-a026-55afe09bc5ab" Oct 09 14:03:56 crc kubenswrapper[4902]: E1009 14:03:56.035952 4902 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/watcher-operator-controller-manager-646675d848-9lx7z" podUID="be1ef3d2-4c04-4040-9d73-80655f4b9dbb" Oct 09 14:03:56 crc kubenswrapper[4902]: E1009 14:03:56.050782 4902 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757db22ds" podUID="d5375300-657d-4e1d-92af-2107cbc7972f" Oct 09 14:03:56 crc 
kubenswrapper[4902]: E1009 14:03:56.092981 4902 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/nova-operator-controller-manager-57bb74c7bf-s7bkj" podUID="5be11d13-4feb-4a12-9f9b-69a99d2fa5a4" Oct 09 14:03:56 crc kubenswrapper[4902]: I1009 14:03:56.773526 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-585fc5b659-6xw4k" event={"ID":"92649c6e-71ba-4945-9210-19394d180222","Type":"ContainerStarted","Data":"1303b76d2582e823892b43d839bada769babd6c997498314288b47709f8dac28"} Oct 09 14:03:56 crc kubenswrapper[4902]: I1009 14:03:56.779700 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-8jw5l" event={"ID":"93ae4a6d-1e42-4e45-8128-61088861873e","Type":"ContainerStarted","Data":"3c3c3298f579799b184399ed83b7a8d3491d49a6641e2c9503dac9c2ada003db"} Oct 09 14:03:56 crc kubenswrapper[4902]: E1009 14:03:56.785286 4902 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-8jw5l" podUID="93ae4a6d-1e42-4e45-8128-61088861873e" Oct 09 14:03:56 crc kubenswrapper[4902]: I1009 14:03:56.789125 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-646675d848-9lx7z" event={"ID":"be1ef3d2-4c04-4040-9d73-80655f4b9dbb","Type":"ContainerStarted","Data":"ddc9def21b04e4d925d4f43c150368152abea9aea2d81c5afed92761bb45a602"} Oct 09 14:03:56 crc kubenswrapper[4902]: I1009 14:03:56.789176 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-646675d848-9lx7z" event={"ID":"be1ef3d2-4c04-4040-9d73-80655f4b9dbb","Type":"ContainerStarted","Data":"8a0d54ff5d4ac272ec55db1b600b561899ee31cedef2374974d5394d7f3e35ce"} Oct 09 14:03:56 crc kubenswrapper[4902]: E1009 14:03:56.793549 4902 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:98a5233f0596591acdf2c6a5838b08be108787cdb6ad1995b2b7886bac0fe6ca\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-646675d848-9lx7z" podUID="be1ef3d2-4c04-4040-9d73-80655f4b9dbb" Oct 09 14:03:56 crc kubenswrapper[4902]: I1009 14:03:56.805258 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-578874c84d-hn9gv" event={"ID":"55efad12-5eb1-4c57-bb2f-700ead538209","Type":"ContainerStarted","Data":"94052fbcf1b6d166cd5139ab8a31e3209307cac7521fcf502b59d192c327fba3"} Oct 09 14:03:56 crc kubenswrapper[4902]: I1009 14:03:56.817610 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-d5f574b49-xxs9l" event={"ID":"cf6f1e72-9e96-4905-a7f4-d88ec796724e","Type":"ContainerStarted","Data":"4fbb84cb683ffa75c482574d366694ce055acc48d57e720eb590892bbb42931c"} Oct 09 14:03:56 crc kubenswrapper[4902]: I1009 14:03:56.817718 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/openstack-operator-controller-manager-d5f574b49-xxs9l" event={"ID":"cf6f1e72-9e96-4905-a7f4-d88ec796724e","Type":"ContainerStarted","Data":"0d607cfdd8e74dbbc022d27846709dc873c5bafff9aa08ce5c3028b569babc1f"} Oct 09 14:03:56 crc kubenswrapper[4902]: I1009 14:03:56.817771 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-d5f574b49-xxs9l" event={"ID":"cf6f1e72-9e96-4905-a7f4-d88ec796724e","Type":"ContainerStarted","Data":"6a301c853d0108bf19c954cbe9907cdb377d56c637e57391b4e12c68f5a55cdf"} Oct 09 14:03:56 crc kubenswrapper[4902]: I1009 14:03:56.818026 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-d5f574b49-xxs9l" Oct 09 14:03:56 crc kubenswrapper[4902]: I1009 14:03:56.819496 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-64f84fcdbb-r4nn8" event={"ID":"ac925db8-cb97-468e-b43f-b219deb78cf6","Type":"ContainerStarted","Data":"c2ebf43011f3fe36da274f5a4cc6c1ba653646c7554b17108bcef23444d0641f"} Oct 09 14:03:56 crc kubenswrapper[4902]: I1009 14:03:56.825139 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-869cc7797f-dwgtb" event={"ID":"3adf1a7b-f2b7-4927-a026-55afe09bc5ab","Type":"ContainerStarted","Data":"c2e266f3c55a29d545b06944be40faaa9252438691bcec2437819003d6233b37"} Oct 09 14:03:56 crc kubenswrapper[4902]: I1009 14:03:56.825241 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-869cc7797f-dwgtb" event={"ID":"3adf1a7b-f2b7-4927-a026-55afe09bc5ab","Type":"ContainerStarted","Data":"0b45a4192927c3eeea092dddc870047ccc422484699cf359a5ae22728c1ad5d0"} Oct 09 14:03:56 crc kubenswrapper[4902]: E1009 14:03:56.829555 4902 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:315e558023b41ac1aa215082096995a03810c5b42910a33b00427ffcac9c6a14\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-869cc7797f-dwgtb" podUID="3adf1a7b-f2b7-4927-a026-55afe09bc5ab" Oct 09 14:03:56 crc kubenswrapper[4902]: I1009 14:03:56.846992 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757db22ds" event={"ID":"d5375300-657d-4e1d-92af-2107cbc7972f","Type":"ContainerStarted","Data":"2eb3c8b525a80994d4503e1666fd6c3ec7bfb2497f91a5726544046186bed101"} Oct 09 14:03:56 crc kubenswrapper[4902]: I1009 14:03:56.847058 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757db22ds" event={"ID":"d5375300-657d-4e1d-92af-2107cbc7972f","Type":"ContainerStarted","Data":"fd84aa42bac783be87a016e6f3fc270ad5cc816738782d65407f75190891d82a"} Oct 09 14:03:56 crc kubenswrapper[4902]: I1009 14:03:56.852523 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-d5f574b49-xxs9l" podStartSLOduration=3.852497022 podStartE2EDuration="3.852497022s" podCreationTimestamp="2025-10-09 14:03:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 14:03:56.849921609 +0000 UTC m=+784.047780683" 
watchObservedRunningTime="2025-10-09 14:03:56.852497022 +0000 UTC m=+784.050356086" Oct 09 14:03:56 crc kubenswrapper[4902]: I1009 14:03:56.861050 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-57bb74c7bf-s7bkj" event={"ID":"5be11d13-4feb-4a12-9f9b-69a99d2fa5a4","Type":"ContainerStarted","Data":"d25a221685b5f4f895dcc3c564cdf2bbb02e7b4efe4a97ce2358c77f133f158b"} Oct 09 14:03:56 crc kubenswrapper[4902]: I1009 14:03:56.861115 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-57bb74c7bf-s7bkj" event={"ID":"5be11d13-4feb-4a12-9f9b-69a99d2fa5a4","Type":"ContainerStarted","Data":"f4df3333cb5fe41c0b1c32ace8da00d86ae90e94bb78a0afa2607219e9585884"} Oct 09 14:03:56 crc kubenswrapper[4902]: E1009 14:03:56.861450 4902 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:a17fc270857869fd1efe5020b2a1cb8c2abbd838f08de88f3a6a59e8754ec351\\\"\"" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757db22ds" podUID="d5375300-657d-4e1d-92af-2107cbc7972f" Oct 09 14:03:56 crc kubenswrapper[4902]: E1009 14:03:56.862669 4902 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:b2e9acf568a48c28cf2aed6012e432eeeb7d5f0eb11878fc91b62bc34cba10cd\\\"\"" pod="openstack-operators/nova-operator-controller-manager-57bb74c7bf-s7bkj" podUID="5be11d13-4feb-4a12-9f9b-69a99d2fa5a4" Oct 09 14:03:57 crc kubenswrapper[4902]: E1009 14:03:57.892720 4902 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:a17fc270857869fd1efe5020b2a1cb8c2abbd838f08de88f3a6a59e8754ec351\\\"\"" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757db22ds" podUID="d5375300-657d-4e1d-92af-2107cbc7972f" Oct 09 14:03:57 crc kubenswrapper[4902]: E1009 14:03:57.893394 4902 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:98a5233f0596591acdf2c6a5838b08be108787cdb6ad1995b2b7886bac0fe6ca\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-646675d848-9lx7z" podUID="be1ef3d2-4c04-4040-9d73-80655f4b9dbb" Oct 09 14:03:57 crc kubenswrapper[4902]: E1009 14:03:57.893463 4902 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:315e558023b41ac1aa215082096995a03810c5b42910a33b00427ffcac9c6a14\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-869cc7797f-dwgtb" podUID="3adf1a7b-f2b7-4927-a026-55afe09bc5ab" Oct 09 14:03:57 crc kubenswrapper[4902]: E1009 14:03:57.893526 4902 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" 
pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-8jw5l" podUID="93ae4a6d-1e42-4e45-8128-61088861873e" Oct 09 14:03:57 crc kubenswrapper[4902]: E1009 14:03:57.893531 4902 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:b2e9acf568a48c28cf2aed6012e432eeeb7d5f0eb11878fc91b62bc34cba10cd\\\"\"" pod="openstack-operators/nova-operator-controller-manager-57bb74c7bf-s7bkj" podUID="5be11d13-4feb-4a12-9f9b-69a99d2fa5a4" Oct 09 14:04:05 crc kubenswrapper[4902]: I1009 14:04:05.121662 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-d5f574b49-xxs9l" Oct 09 14:04:07 crc kubenswrapper[4902]: I1009 14:04:07.967599 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-578874c84d-hn9gv" event={"ID":"55efad12-5eb1-4c57-bb2f-700ead538209","Type":"ContainerStarted","Data":"1f806c3f5f78a97aa132870e953156c186ebb281b6bca3ad9178b7fe43174f1f"} Oct 09 14:04:07 crc kubenswrapper[4902]: I1009 14:04:07.971138 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-59578bc799-kvzf6" event={"ID":"4ad4e07d-4f69-4f1f-9886-bde91ec3b735","Type":"ContainerStarted","Data":"5d373fbb9720752ccecf9a92922f0954c3b5281612fe4e3c65ff43c00a7484dc"} Oct 09 14:04:07 crc kubenswrapper[4902]: I1009 14:04:07.972132 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-59cdc64769-hrgs4" event={"ID":"a1fcc021-b92b-417d-b92c-4e66386e8502","Type":"ContainerStarted","Data":"59784f654f5040140f04b69db11bc5a1c9fd526859c768c8964c9592c2c3b783"} Oct 09 14:04:07 crc kubenswrapper[4902]: I1009 14:04:07.976100 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5f4d5dfdc6-xlwpm" event={"ID":"d972447c-10cf-4d4b-870d-11e79f6bd98a","Type":"ContainerStarted","Data":"9e7b1e20baa3269550e931a95d39cc92880788699243eaaceb9b0742bdedfb40"} Oct 09 14:04:07 crc kubenswrapper[4902]: I1009 14:04:07.976159 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5f4d5dfdc6-xlwpm" event={"ID":"d972447c-10cf-4d4b-870d-11e79f6bd98a","Type":"ContainerStarted","Data":"9298573d8347be35a41ceca4bc0f213b2d62179b152534b9d81eab55fe680e54"} Oct 09 14:04:07 crc kubenswrapper[4902]: I1009 14:04:07.977328 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-5f4d5dfdc6-xlwpm" Oct 09 14:04:07 crc kubenswrapper[4902]: I1009 14:04:07.993718 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-6d74794d9b-phbbt" event={"ID":"39e518d9-bffd-4421-bc8b-2b333654ff9e","Type":"ContainerStarted","Data":"c01c0bb3cc92a92e6f9ac2f4bcbf7185005f88e495d96ff5f4351577c06e5dc9"} Oct 09 14:04:08 crc kubenswrapper[4902]: I1009 14:04:08.010840 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-797d478b46-ld6xx" event={"ID":"0dafd5d3-f605-4c75-86ca-8d40831e9cb7","Type":"ContainerStarted","Data":"761072f1e30c365c63cb82e41efef7facb040073771382ce4338255bfb25ec30"} Oct 09 14:04:08 crc kubenswrapper[4902]: I1009 14:04:08.037737 4902 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-7bb46cd7d-2nkqk" event={"ID":"6169ab22-9b0b-4bb3-b840-b3eb92d22c0c","Type":"ContainerStarted","Data":"daf92ef2d17537eaaae823bacec7672a2e8ef5659c7fce9b7c03c427a5b02fe4"} Oct 09 14:04:08 crc kubenswrapper[4902]: I1009 14:04:08.051658 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-64f84fcdbb-r4nn8" event={"ID":"ac925db8-cb97-468e-b43f-b219deb78cf6","Type":"ContainerStarted","Data":"e2ae2fe9daefb1819c15d5c44a961bdecffcf6f545c7fda5948e18b3054ee484"} Oct 09 14:04:08 crc kubenswrapper[4902]: I1009 14:04:08.068596 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-74cb5cbc49-ql5w7" event={"ID":"dad70f9e-3fb4-41ff-95f4-dc6be5277aa0","Type":"ContainerStarted","Data":"773a3c062853adafe63200c07c0ce98b5f7b7da4228b964443cd87c2ea82b03e"} Oct 09 14:04:08 crc kubenswrapper[4902]: I1009 14:04:08.110164 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-ddb98f99b-7bzzg" event={"ID":"d11828c7-488d-414a-a024-68a46fca78e1","Type":"ContainerStarted","Data":"cb4be5ec0131538f59780df3397a9f52532e1b8f7c796c613b9ad44fff0e0b72"} Oct 09 14:04:08 crc kubenswrapper[4902]: I1009 14:04:08.125461 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-664664cb68-s6zmh" event={"ID":"52ec0675-fab2-43fd-a447-8896de9e78fd","Type":"ContainerStarted","Data":"e20fcdf4159379d9d492f4f87f52b95ce1e4625a8efa8b36bbe140c4766172eb"} Oct 09 14:04:08 crc kubenswrapper[4902]: I1009 14:04:08.150831 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-6d7c7ddf95-mqn8s" event={"ID":"cad34d91-d544-4311-a9b3-adb11e4217c0","Type":"ContainerStarted","Data":"a6a5f054d222886beab12d16527d080225102ed6499e704205c7b23b4edbfa9a"} Oct 09 14:04:08 crc kubenswrapper[4902]: I1009 14:04:08.196585 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-687df44cdb-pxsfz" event={"ID":"c27e1a63-1155-43eb-9c97-61680f083de0","Type":"ContainerStarted","Data":"70d55db1ededae3da62f64b0c09f718485c147eaec90b341531c20787d0d077f"} Oct 09 14:04:08 crc kubenswrapper[4902]: I1009 14:04:08.213758 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-6d9967f8dd-7bjrw" event={"ID":"960aab4c-ce86-4753-b848-3367f15d962c","Type":"ContainerStarted","Data":"fdebe92360a7e22ce725215ecc787885bf8b9f6db579a2cfcdfa39ea8850436a"} Oct 09 14:04:08 crc kubenswrapper[4902]: I1009 14:04:08.261196 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-585fc5b659-6xw4k" event={"ID":"92649c6e-71ba-4945-9210-19394d180222","Type":"ContainerStarted","Data":"b9da1177db1ed32212087ffecbf51dedf694c70214fd38654d3db98b07ed8f1f"} Oct 09 14:04:08 crc kubenswrapper[4902]: I1009 14:04:08.262537 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-585fc5b659-6xw4k" Oct 09 14:04:08 crc kubenswrapper[4902]: I1009 14:04:08.285047 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-ffcdd6c94-7lct5" 
event={"ID":"183bbfe9-141b-4b7a-adc1-3ea01011ebd7","Type":"ContainerStarted","Data":"5a8427926b40c75234a8b023b8e77203e0200cd55c8f34630332a565c8c4658a"} Oct 09 14:04:08 crc kubenswrapper[4902]: I1009 14:04:08.286435 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-ffcdd6c94-7lct5" Oct 09 14:04:08 crc kubenswrapper[4902]: I1009 14:04:08.328358 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-5777b4f897-hxrth" event={"ID":"07f5d370-e69d-41f8-b65a-d25dc8b38de8","Type":"ContainerStarted","Data":"ef060c9bd0c9ede4baebd25ef91683b511a663e4ab3bb08800101e48282c2935"} Oct 09 14:04:08 crc kubenswrapper[4902]: I1009 14:04:08.390511 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-5f4d5dfdc6-xlwpm" podStartSLOduration=4.255971121 podStartE2EDuration="15.390491992s" podCreationTimestamp="2025-10-09 14:03:53 +0000 UTC" firstStartedPulling="2025-10-09 14:03:55.570486391 +0000 UTC m=+782.768345455" lastFinishedPulling="2025-10-09 14:04:06.705007262 +0000 UTC m=+793.902866326" observedRunningTime="2025-10-09 14:04:08.109977092 +0000 UTC m=+795.307836156" watchObservedRunningTime="2025-10-09 14:04:08.390491992 +0000 UTC m=+795.588351056" Oct 09 14:04:08 crc kubenswrapper[4902]: I1009 14:04:08.449760 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-ffcdd6c94-7lct5" podStartSLOduration=4.302657009 podStartE2EDuration="15.449715957s" podCreationTimestamp="2025-10-09 14:03:53 +0000 UTC" firstStartedPulling="2025-10-09 14:03:55.57115241 +0000 UTC m=+782.769011474" lastFinishedPulling="2025-10-09 14:04:06.718211358 +0000 UTC m=+793.916070422" observedRunningTime="2025-10-09 14:04:08.439188398 +0000 UTC m=+795.637047462" watchObservedRunningTime="2025-10-09 14:04:08.449715957 +0000 UTC m=+795.647575021" Oct 09 14:04:08 crc kubenswrapper[4902]: I1009 14:04:08.451913 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-585fc5b659-6xw4k" podStartSLOduration=4.505489799 podStartE2EDuration="15.451894719s" podCreationTimestamp="2025-10-09 14:03:53 +0000 UTC" firstStartedPulling="2025-10-09 14:03:55.799062483 +0000 UTC m=+782.996921547" lastFinishedPulling="2025-10-09 14:04:06.745467403 +0000 UTC m=+793.943326467" observedRunningTime="2025-10-09 14:04:08.389777752 +0000 UTC m=+795.587636816" watchObservedRunningTime="2025-10-09 14:04:08.451894719 +0000 UTC m=+795.649753793" Oct 09 14:04:09 crc kubenswrapper[4902]: I1009 14:04:09.343112 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-585fc5b659-6xw4k" event={"ID":"92649c6e-71ba-4945-9210-19394d180222","Type":"ContainerStarted","Data":"86ac2ca4b5a28c27bf761928286efe01188b90feccff889948691d396b52addb"} Oct 09 14:04:09 crc kubenswrapper[4902]: I1009 14:04:09.350655 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-6d74794d9b-phbbt" event={"ID":"39e518d9-bffd-4421-bc8b-2b333654ff9e","Type":"ContainerStarted","Data":"b903a7787084c2e432a6a127661e3d80b29f69532ab4fa1157915dab12ada817"} Oct 09 14:04:09 crc kubenswrapper[4902]: I1009 14:04:09.351484 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/horizon-operator-controller-manager-6d74794d9b-phbbt" Oct 09 14:04:09 crc kubenswrapper[4902]: I1009 14:04:09.354218 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-6d7c7ddf95-mqn8s" event={"ID":"cad34d91-d544-4311-a9b3-adb11e4217c0","Type":"ContainerStarted","Data":"b21f29d8229cb08fa7083f3192d99a56926c14e4085dccd62d841a5311486efe"} Oct 09 14:04:09 crc kubenswrapper[4902]: I1009 14:04:09.354710 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-6d7c7ddf95-mqn8s" Oct 09 14:04:09 crc kubenswrapper[4902]: I1009 14:04:09.357303 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-7bb46cd7d-2nkqk" event={"ID":"6169ab22-9b0b-4bb3-b840-b3eb92d22c0c","Type":"ContainerStarted","Data":"4e38096d8b5ac35611da63ac62683360d9c6383336926fae8edf67430c356feb"} Oct 09 14:04:09 crc kubenswrapper[4902]: I1009 14:04:09.357709 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-7bb46cd7d-2nkqk" Oct 09 14:04:09 crc kubenswrapper[4902]: I1009 14:04:09.359421 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-687df44cdb-pxsfz" event={"ID":"c27e1a63-1155-43eb-9c97-61680f083de0","Type":"ContainerStarted","Data":"9ecdeda3584f437db7c07aec67188f6224080193dc0a05f9a19f5730f50b6a42"} Oct 09 14:04:09 crc kubenswrapper[4902]: I1009 14:04:09.359775 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-687df44cdb-pxsfz" Oct 09 14:04:09 crc kubenswrapper[4902]: I1009 14:04:09.361335 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-59cdc64769-hrgs4" event={"ID":"a1fcc021-b92b-417d-b92c-4e66386e8502","Type":"ContainerStarted","Data":"caf7de1b5868e4304c9a3a9c070ca3e896547693d36c0b95eb77cf80aa135d9a"} Oct 09 14:04:09 crc kubenswrapper[4902]: I1009 14:04:09.363930 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-59cdc64769-hrgs4" Oct 09 14:04:09 crc kubenswrapper[4902]: I1009 14:04:09.371295 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-ffcdd6c94-7lct5" event={"ID":"183bbfe9-141b-4b7a-adc1-3ea01011ebd7","Type":"ContainerStarted","Data":"5cd45cde263f3549950d040fcd23236962f4d5e6c3ef624db283bce8bfd79177"} Oct 09 14:04:09 crc kubenswrapper[4902]: I1009 14:04:09.373853 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-6d74794d9b-phbbt" podStartSLOduration=4.475184476 podStartE2EDuration="16.373830226s" podCreationTimestamp="2025-10-09 14:03:53 +0000 UTC" firstStartedPulling="2025-10-09 14:03:54.792181989 +0000 UTC m=+781.990041053" lastFinishedPulling="2025-10-09 14:04:06.690827749 +0000 UTC m=+793.888686803" observedRunningTime="2025-10-09 14:04:09.367868976 +0000 UTC m=+796.565728070" watchObservedRunningTime="2025-10-09 14:04:09.373830226 +0000 UTC m=+796.571689290" Oct 09 14:04:09 crc kubenswrapper[4902]: I1009 14:04:09.375662 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-578874c84d-hn9gv" 
event={"ID":"55efad12-5eb1-4c57-bb2f-700ead538209","Type":"ContainerStarted","Data":"62e89238f6772ea9c527fc67ae79ed4587b964768c5335bb5c513acbffc9362a"} Oct 09 14:04:09 crc kubenswrapper[4902]: I1009 14:04:09.376623 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-578874c84d-hn9gv" Oct 09 14:04:09 crc kubenswrapper[4902]: I1009 14:04:09.382130 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-64f84fcdbb-r4nn8" event={"ID":"ac925db8-cb97-468e-b43f-b219deb78cf6","Type":"ContainerStarted","Data":"881de222ca67ff1dd1a642e26b3b366fc086362a8d33bf3acbec49d35bcbd61a"} Oct 09 14:04:09 crc kubenswrapper[4902]: I1009 14:04:09.382230 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-64f84fcdbb-r4nn8" Oct 09 14:04:09 crc kubenswrapper[4902]: I1009 14:04:09.384917 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-59578bc799-kvzf6" event={"ID":"4ad4e07d-4f69-4f1f-9886-bde91ec3b735","Type":"ContainerStarted","Data":"24f68407ee212ca6a76e1fb2094204ac33c989019f1d7da4292e34da9ae462e0"} Oct 09 14:04:09 crc kubenswrapper[4902]: I1009 14:04:09.385160 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-59578bc799-kvzf6" Oct 09 14:04:09 crc kubenswrapper[4902]: I1009 14:04:09.395438 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-59cdc64769-hrgs4" podStartSLOduration=5.098875002 podStartE2EDuration="17.39539939s" podCreationTimestamp="2025-10-09 14:03:52 +0000 UTC" firstStartedPulling="2025-10-09 14:03:54.36263342 +0000 UTC m=+781.560492484" lastFinishedPulling="2025-10-09 14:04:06.659157808 +0000 UTC m=+793.857016872" observedRunningTime="2025-10-09 14:04:09.389613495 +0000 UTC m=+796.587472559" watchObservedRunningTime="2025-10-09 14:04:09.39539939 +0000 UTC m=+796.593258454" Oct 09 14:04:09 crc kubenswrapper[4902]: I1009 14:04:09.397203 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-74cb5cbc49-ql5w7" event={"ID":"dad70f9e-3fb4-41ff-95f4-dc6be5277aa0","Type":"ContainerStarted","Data":"6841fccdb3f13b120f6d3baec32106d18189b44d5c38a721ab7d54cfa89232ea"} Oct 09 14:04:09 crc kubenswrapper[4902]: I1009 14:04:09.397714 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-74cb5cbc49-ql5w7" Oct 09 14:04:09 crc kubenswrapper[4902]: I1009 14:04:09.401208 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-6d9967f8dd-7bjrw" event={"ID":"960aab4c-ce86-4753-b848-3367f15d962c","Type":"ContainerStarted","Data":"9bfc2f1d4c9e692a1e586370a15b71a7a2fb7a0dae2ca56f6269e3dc56072a82"} Oct 09 14:04:09 crc kubenswrapper[4902]: I1009 14:04:09.401241 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-6d9967f8dd-7bjrw" Oct 09 14:04:09 crc kubenswrapper[4902]: I1009 14:04:09.404111 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-ddb98f99b-7bzzg" 
event={"ID":"d11828c7-488d-414a-a024-68a46fca78e1","Type":"ContainerStarted","Data":"33b4b9555bcc1fa60c36b6a5bccf929cef952fdbb2bf042bf36fff1809e5e5d8"} Oct 09 14:04:09 crc kubenswrapper[4902]: I1009 14:04:09.404579 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-ddb98f99b-7bzzg" Oct 09 14:04:09 crc kubenswrapper[4902]: I1009 14:04:09.406595 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-664664cb68-s6zmh" event={"ID":"52ec0675-fab2-43fd-a447-8896de9e78fd","Type":"ContainerStarted","Data":"3b240172ca428df90695eec803101efd3737211548b1d6b33d33a55f28d6ea50"} Oct 09 14:04:09 crc kubenswrapper[4902]: I1009 14:04:09.407060 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-664664cb68-s6zmh" Oct 09 14:04:09 crc kubenswrapper[4902]: I1009 14:04:09.422011 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-5777b4f897-hxrth" event={"ID":"07f5d370-e69d-41f8-b65a-d25dc8b38de8","Type":"ContainerStarted","Data":"5b1c9597a5a4c7a7d6854de2f1567f0acf127668cc55e4a77bbfea243763676a"} Oct 09 14:04:09 crc kubenswrapper[4902]: I1009 14:04:09.422800 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-5777b4f897-hxrth" Oct 09 14:04:09 crc kubenswrapper[4902]: I1009 14:04:09.429320 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-7bb46cd7d-2nkqk" podStartSLOduration=4.476118563 podStartE2EDuration="16.429292044s" podCreationTimestamp="2025-10-09 14:03:53 +0000 UTC" firstStartedPulling="2025-10-09 14:03:54.792871638 +0000 UTC m=+781.990730702" lastFinishedPulling="2025-10-09 14:04:06.746045119 +0000 UTC m=+793.943904183" observedRunningTime="2025-10-09 14:04:09.415888533 +0000 UTC m=+796.613747597" watchObservedRunningTime="2025-10-09 14:04:09.429292044 +0000 UTC m=+796.627151378" Oct 09 14:04:09 crc kubenswrapper[4902]: I1009 14:04:09.438739 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-797d478b46-ld6xx" event={"ID":"0dafd5d3-f605-4c75-86ca-8d40831e9cb7","Type":"ContainerStarted","Data":"48333b635c4a0419eeb107a88b847b81dd4f159552be381c292ebf0f75990713"} Oct 09 14:04:09 crc kubenswrapper[4902]: I1009 14:04:09.439725 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-797d478b46-ld6xx" Oct 09 14:04:09 crc kubenswrapper[4902]: I1009 14:04:09.449240 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-687df44cdb-pxsfz" podStartSLOduration=5.166952478 podStartE2EDuration="17.449217171s" podCreationTimestamp="2025-10-09 14:03:52 +0000 UTC" firstStartedPulling="2025-10-09 14:03:54.431275012 +0000 UTC m=+781.629134076" lastFinishedPulling="2025-10-09 14:04:06.713539705 +0000 UTC m=+793.911398769" observedRunningTime="2025-10-09 14:04:09.442992914 +0000 UTC m=+796.640851998" watchObservedRunningTime="2025-10-09 14:04:09.449217171 +0000 UTC m=+796.647076245" Oct 09 14:04:09 crc kubenswrapper[4902]: I1009 14:04:09.471228 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/octavia-operator-controller-manager-6d7c7ddf95-mqn8s" podStartSLOduration=5.240898499 podStartE2EDuration="16.471206556s" podCreationTimestamp="2025-10-09 14:03:53 +0000 UTC" firstStartedPulling="2025-10-09 14:03:55.546016204 +0000 UTC m=+782.743875268" lastFinishedPulling="2025-10-09 14:04:06.776324261 +0000 UTC m=+793.974183325" observedRunningTime="2025-10-09 14:04:09.467973514 +0000 UTC m=+796.665832598" watchObservedRunningTime="2025-10-09 14:04:09.471206556 +0000 UTC m=+796.669065620" Oct 09 14:04:09 crc kubenswrapper[4902]: I1009 14:04:09.494878 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-6d9967f8dd-7bjrw" podStartSLOduration=4.569930961 podStartE2EDuration="16.494859609s" podCreationTimestamp="2025-10-09 14:03:53 +0000 UTC" firstStartedPulling="2025-10-09 14:03:54.790928923 +0000 UTC m=+781.988787987" lastFinishedPulling="2025-10-09 14:04:06.715857571 +0000 UTC m=+793.913716635" observedRunningTime="2025-10-09 14:04:09.489729623 +0000 UTC m=+796.687588687" watchObservedRunningTime="2025-10-09 14:04:09.494859609 +0000 UTC m=+796.692718693" Oct 09 14:04:09 crc kubenswrapper[4902]: I1009 14:04:09.524813 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-797d478b46-ld6xx" podStartSLOduration=5.378444704 podStartE2EDuration="16.524782431s" podCreationTimestamp="2025-10-09 14:03:53 +0000 UTC" firstStartedPulling="2025-10-09 14:03:55.571019696 +0000 UTC m=+782.768878760" lastFinishedPulling="2025-10-09 14:04:06.717357423 +0000 UTC m=+793.915216487" observedRunningTime="2025-10-09 14:04:09.507329804 +0000 UTC m=+796.705188888" watchObservedRunningTime="2025-10-09 14:04:09.524782431 +0000 UTC m=+796.722641495" Oct 09 14:04:09 crc kubenswrapper[4902]: I1009 14:04:09.591176 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-578874c84d-hn9gv" podStartSLOduration=5.577194677 podStartE2EDuration="16.591154659s" podCreationTimestamp="2025-10-09 14:03:53 +0000 UTC" firstStartedPulling="2025-10-09 14:03:55.748371561 +0000 UTC m=+782.946230625" lastFinishedPulling="2025-10-09 14:04:06.762331543 +0000 UTC m=+793.960190607" observedRunningTime="2025-10-09 14:04:09.550922254 +0000 UTC m=+796.748781328" watchObservedRunningTime="2025-10-09 14:04:09.591154659 +0000 UTC m=+796.789013723" Oct 09 14:04:09 crc kubenswrapper[4902]: I1009 14:04:09.593579 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-5777b4f897-hxrth" podStartSLOduration=5.460421095 podStartE2EDuration="16.593570677s" podCreationTimestamp="2025-10-09 14:03:53 +0000 UTC" firstStartedPulling="2025-10-09 14:03:55.543599686 +0000 UTC m=+782.741458750" lastFinishedPulling="2025-10-09 14:04:06.676749258 +0000 UTC m=+793.874608332" observedRunningTime="2025-10-09 14:04:09.576462391 +0000 UTC m=+796.774321455" watchObservedRunningTime="2025-10-09 14:04:09.593570677 +0000 UTC m=+796.791429741" Oct 09 14:04:09 crc kubenswrapper[4902]: I1009 14:04:09.603698 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-74cb5cbc49-ql5w7" podStartSLOduration=4.95760554 podStartE2EDuration="16.603680725s" podCreationTimestamp="2025-10-09 14:03:53 +0000 UTC" firstStartedPulling="2025-10-09 14:03:55.071932837 +0000 UTC 
m=+782.269791901" lastFinishedPulling="2025-10-09 14:04:06.718008022 +0000 UTC m=+793.915867086" observedRunningTime="2025-10-09 14:04:09.598519148 +0000 UTC m=+796.796378222" watchObservedRunningTime="2025-10-09 14:04:09.603680725 +0000 UTC m=+796.801539789" Oct 09 14:04:09 crc kubenswrapper[4902]: I1009 14:04:09.624378 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-664664cb68-s6zmh" podStartSLOduration=5.364194737 podStartE2EDuration="16.624353873s" podCreationTimestamp="2025-10-09 14:03:53 +0000 UTC" firstStartedPulling="2025-10-09 14:03:55.454976844 +0000 UTC m=+782.652835908" lastFinishedPulling="2025-10-09 14:04:06.71513598 +0000 UTC m=+793.912995044" observedRunningTime="2025-10-09 14:04:09.622427888 +0000 UTC m=+796.820286972" watchObservedRunningTime="2025-10-09 14:04:09.624353873 +0000 UTC m=+796.822212937" Oct 09 14:04:09 crc kubenswrapper[4902]: I1009 14:04:09.650322 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-64f84fcdbb-r4nn8" podStartSLOduration=6.621144307 podStartE2EDuration="17.650301471s" podCreationTimestamp="2025-10-09 14:03:52 +0000 UTC" firstStartedPulling="2025-10-09 14:03:55.736403331 +0000 UTC m=+782.934262385" lastFinishedPulling="2025-10-09 14:04:06.765560475 +0000 UTC m=+793.963419549" observedRunningTime="2025-10-09 14:04:09.648792979 +0000 UTC m=+796.846652063" watchObservedRunningTime="2025-10-09 14:04:09.650301471 +0000 UTC m=+796.848160535" Oct 09 14:04:09 crc kubenswrapper[4902]: I1009 14:04:09.671129 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-ddb98f99b-7bzzg" podStartSLOduration=5.397040602 podStartE2EDuration="16.671107093s" podCreationTimestamp="2025-10-09 14:03:53 +0000 UTC" firstStartedPulling="2025-10-09 14:03:55.441075409 +0000 UTC m=+782.638934473" lastFinishedPulling="2025-10-09 14:04:06.71514191 +0000 UTC m=+793.913000964" observedRunningTime="2025-10-09 14:04:09.667000106 +0000 UTC m=+796.864859170" watchObservedRunningTime="2025-10-09 14:04:09.671107093 +0000 UTC m=+796.868966157" Oct 09 14:04:09 crc kubenswrapper[4902]: I1009 14:04:09.688101 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-59578bc799-kvzf6" podStartSLOduration=5.507844104 podStartE2EDuration="16.688073676s" podCreationTimestamp="2025-10-09 14:03:53 +0000 UTC" firstStartedPulling="2025-10-09 14:03:55.537152942 +0000 UTC m=+782.735012006" lastFinishedPulling="2025-10-09 14:04:06.717382524 +0000 UTC m=+793.915241578" observedRunningTime="2025-10-09 14:04:09.686715027 +0000 UTC m=+796.884574081" watchObservedRunningTime="2025-10-09 14:04:09.688073676 +0000 UTC m=+796.885932740" Oct 09 14:04:12 crc kubenswrapper[4902]: I1009 14:04:12.467289 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-646675d848-9lx7z" event={"ID":"be1ef3d2-4c04-4040-9d73-80655f4b9dbb","Type":"ContainerStarted","Data":"cc02da8bb9a8cfd3404862dd9eae9766817ea62caf72c0c8ba8ca7da67cc1b8c"} Oct 09 14:04:12 crc kubenswrapper[4902]: I1009 14:04:12.467805 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-646675d848-9lx7z" Oct 09 14:04:12 crc kubenswrapper[4902]: I1009 14:04:12.470651 4902 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack-operators/ovn-operator-controller-manager-869cc7797f-dwgtb" event={"ID":"3adf1a7b-f2b7-4927-a026-55afe09bc5ab","Type":"ContainerStarted","Data":"0bb8056001684df889e81cc8b157860d8fffdd3ab351ed7bad014b0b7874857a"} Oct 09 14:04:12 crc kubenswrapper[4902]: I1009 14:04:12.470867 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-869cc7797f-dwgtb" Oct 09 14:04:12 crc kubenswrapper[4902]: I1009 14:04:12.485913 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-646675d848-9lx7z" podStartSLOduration=3.851562656 podStartE2EDuration="19.485889991s" podCreationTimestamp="2025-10-09 14:03:53 +0000 UTC" firstStartedPulling="2025-10-09 14:03:55.800992248 +0000 UTC m=+782.998851302" lastFinishedPulling="2025-10-09 14:04:11.435319573 +0000 UTC m=+798.633178637" observedRunningTime="2025-10-09 14:04:12.484077839 +0000 UTC m=+799.681936913" watchObservedRunningTime="2025-10-09 14:04:12.485889991 +0000 UTC m=+799.683749055" Oct 09 14:04:12 crc kubenswrapper[4902]: I1009 14:04:12.506833 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-869cc7797f-dwgtb" podStartSLOduration=3.812164215 podStartE2EDuration="19.506803766s" podCreationTimestamp="2025-10-09 14:03:53 +0000 UTC" firstStartedPulling="2025-10-09 14:03:55.748641289 +0000 UTC m=+782.946500353" lastFinishedPulling="2025-10-09 14:04:11.44328083 +0000 UTC m=+798.641139904" observedRunningTime="2025-10-09 14:04:12.501896686 +0000 UTC m=+799.699755760" watchObservedRunningTime="2025-10-09 14:04:12.506803766 +0000 UTC m=+799.704662830" Oct 09 14:04:13 crc kubenswrapper[4902]: I1009 14:04:13.338288 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-59cdc64769-hrgs4" Oct 09 14:04:13 crc kubenswrapper[4902]: I1009 14:04:13.370145 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-687df44cdb-pxsfz" Oct 09 14:04:13 crc kubenswrapper[4902]: I1009 14:04:13.386492 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-7bb46cd7d-2nkqk" Oct 09 14:04:13 crc kubenswrapper[4902]: I1009 14:04:13.411247 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-6d9967f8dd-7bjrw" Oct 09 14:04:13 crc kubenswrapper[4902]: I1009 14:04:13.493194 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-57bb74c7bf-s7bkj" event={"ID":"5be11d13-4feb-4a12-9f9b-69a99d2fa5a4","Type":"ContainerStarted","Data":"d24c7d87bbf26cb3561830edabdebba07d1a2200d559188a47564e5387fbb906"} Oct 09 14:04:13 crc kubenswrapper[4902]: I1009 14:04:13.493294 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-6d74794d9b-phbbt" Oct 09 14:04:13 crc kubenswrapper[4902]: I1009 14:04:13.494567 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-57bb74c7bf-s7bkj" Oct 09 14:04:13 crc kubenswrapper[4902]: I1009 14:04:13.519847 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/nova-operator-controller-manager-57bb74c7bf-s7bkj" podStartSLOduration=3.711436139 podStartE2EDuration="20.519824744s" podCreationTimestamp="2025-10-09 14:03:53 +0000 UTC" firstStartedPulling="2025-10-09 14:03:55.800319849 +0000 UTC m=+782.998178923" lastFinishedPulling="2025-10-09 14:04:12.608708464 +0000 UTC m=+799.806567528" observedRunningTime="2025-10-09 14:04:13.510347944 +0000 UTC m=+800.708207018" watchObservedRunningTime="2025-10-09 14:04:13.519824744 +0000 UTC m=+800.717683818" Oct 09 14:04:13 crc kubenswrapper[4902]: I1009 14:04:13.618822 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-74cb5cbc49-ql5w7" Oct 09 14:04:13 crc kubenswrapper[4902]: I1009 14:04:13.802481 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-59578bc799-kvzf6" Oct 09 14:04:13 crc kubenswrapper[4902]: I1009 14:04:13.802829 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-ddb98f99b-7bzzg" Oct 09 14:04:13 crc kubenswrapper[4902]: I1009 14:04:13.821186 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-797d478b46-ld6xx" Oct 09 14:04:13 crc kubenswrapper[4902]: I1009 14:04:13.844627 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-5777b4f897-hxrth" Oct 09 14:04:13 crc kubenswrapper[4902]: I1009 14:04:13.888074 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-664664cb68-s6zmh" Oct 09 14:04:13 crc kubenswrapper[4902]: I1009 14:04:13.930083 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-6d7c7ddf95-mqn8s" Oct 09 14:04:14 crc kubenswrapper[4902]: I1009 14:04:14.105450 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-585fc5b659-6xw4k" Oct 09 14:04:14 crc kubenswrapper[4902]: I1009 14:04:14.148195 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-5f4d5dfdc6-xlwpm" Oct 09 14:04:14 crc kubenswrapper[4902]: I1009 14:04:14.148266 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-64f84fcdbb-r4nn8" Oct 09 14:04:14 crc kubenswrapper[4902]: I1009 14:04:14.155258 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-578874c84d-hn9gv" Oct 09 14:04:14 crc kubenswrapper[4902]: I1009 14:04:14.192368 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-ffcdd6c94-7lct5" Oct 09 14:04:15 crc kubenswrapper[4902]: I1009 14:04:15.527880 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-8jw5l" event={"ID":"93ae4a6d-1e42-4e45-8128-61088861873e","Type":"ContainerStarted","Data":"de95cd2ea49189e7491a358351721673da1075c1f029fe6528c15b1fa05a3434"} Oct 09 14:04:15 crc kubenswrapper[4902]: I1009 14:04:15.551004 4902 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-8jw5l" podStartSLOduration=3.575755859 podStartE2EDuration="22.550978338s" podCreationTimestamp="2025-10-09 14:03:53 +0000 UTC" firstStartedPulling="2025-10-09 14:03:55.756136942 +0000 UTC m=+782.953996006" lastFinishedPulling="2025-10-09 14:04:14.731359421 +0000 UTC m=+801.929218485" observedRunningTime="2025-10-09 14:04:15.543800154 +0000 UTC m=+802.741659228" watchObservedRunningTime="2025-10-09 14:04:15.550978338 +0000 UTC m=+802.748837402" Oct 09 14:04:16 crc kubenswrapper[4902]: I1009 14:04:16.536327 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757db22ds" event={"ID":"d5375300-657d-4e1d-92af-2107cbc7972f","Type":"ContainerStarted","Data":"fbff513acb694f6555343a361f4767d47acfb31c2ca44ea394ec40a57be4c5d8"} Oct 09 14:04:16 crc kubenswrapper[4902]: I1009 14:04:16.536981 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757db22ds" Oct 09 14:04:16 crc kubenswrapper[4902]: I1009 14:04:16.562245 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757db22ds" podStartSLOduration=3.801715246 podStartE2EDuration="23.562227916s" podCreationTimestamp="2025-10-09 14:03:53 +0000 UTC" firstStartedPulling="2025-10-09 14:03:55.800638668 +0000 UTC m=+782.998497732" lastFinishedPulling="2025-10-09 14:04:15.561151338 +0000 UTC m=+802.759010402" observedRunningTime="2025-10-09 14:04:16.559589241 +0000 UTC m=+803.757448325" watchObservedRunningTime="2025-10-09 14:04:16.562227916 +0000 UTC m=+803.760086980" Oct 09 14:04:23 crc kubenswrapper[4902]: I1009 14:04:23.899508 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-57bb74c7bf-s7bkj" Oct 09 14:04:24 crc kubenswrapper[4902]: I1009 14:04:24.007916 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-869cc7797f-dwgtb" Oct 09 14:04:24 crc kubenswrapper[4902]: I1009 14:04:24.459362 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-646675d848-9lx7z" Oct 09 14:04:24 crc kubenswrapper[4902]: I1009 14:04:24.590863 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757db22ds" Oct 09 14:04:40 crc kubenswrapper[4902]: I1009 14:04:40.173420 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-89v2s"] Oct 09 14:04:40 crc kubenswrapper[4902]: I1009 14:04:40.175251 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-89v2s" Oct 09 14:04:40 crc kubenswrapper[4902]: I1009 14:04:40.176842 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-7wxfz" Oct 09 14:04:40 crc kubenswrapper[4902]: I1009 14:04:40.176858 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Oct 09 14:04:40 crc kubenswrapper[4902]: I1009 14:04:40.177885 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Oct 09 14:04:40 crc kubenswrapper[4902]: I1009 14:04:40.183000 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-89v2s"] Oct 09 14:04:40 crc kubenswrapper[4902]: I1009 14:04:40.183487 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Oct 09 14:04:40 crc kubenswrapper[4902]: I1009 14:04:40.227909 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-vhp5f"] Oct 09 14:04:40 crc kubenswrapper[4902]: I1009 14:04:40.229550 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-vhp5f" Oct 09 14:04:40 crc kubenswrapper[4902]: I1009 14:04:40.231494 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Oct 09 14:04:40 crc kubenswrapper[4902]: I1009 14:04:40.252034 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-vhp5f"] Oct 09 14:04:40 crc kubenswrapper[4902]: I1009 14:04:40.326162 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ptd9l\" (UniqueName: \"kubernetes.io/projected/c44929fa-d053-4d3a-8c52-b4ec6e983a96-kube-api-access-ptd9l\") pod \"dnsmasq-dns-675f4bcbfc-89v2s\" (UID: \"c44929fa-d053-4d3a-8c52-b4ec6e983a96\") " pod="openstack/dnsmasq-dns-675f4bcbfc-89v2s" Oct 09 14:04:40 crc kubenswrapper[4902]: I1009 14:04:40.326255 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c44929fa-d053-4d3a-8c52-b4ec6e983a96-config\") pod \"dnsmasq-dns-675f4bcbfc-89v2s\" (UID: \"c44929fa-d053-4d3a-8c52-b4ec6e983a96\") " pod="openstack/dnsmasq-dns-675f4bcbfc-89v2s" Oct 09 14:04:40 crc kubenswrapper[4902]: I1009 14:04:40.428179 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ptd9l\" (UniqueName: \"kubernetes.io/projected/c44929fa-d053-4d3a-8c52-b4ec6e983a96-kube-api-access-ptd9l\") pod \"dnsmasq-dns-675f4bcbfc-89v2s\" (UID: \"c44929fa-d053-4d3a-8c52-b4ec6e983a96\") " pod="openstack/dnsmasq-dns-675f4bcbfc-89v2s" Oct 09 14:04:40 crc kubenswrapper[4902]: I1009 14:04:40.428299 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ttskw\" (UniqueName: \"kubernetes.io/projected/8c959e86-9eec-4c81-aae8-b53430ce2695-kube-api-access-ttskw\") pod \"dnsmasq-dns-78dd6ddcc-vhp5f\" (UID: \"8c959e86-9eec-4c81-aae8-b53430ce2695\") " pod="openstack/dnsmasq-dns-78dd6ddcc-vhp5f" Oct 09 14:04:40 crc kubenswrapper[4902]: I1009 14:04:40.428349 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c44929fa-d053-4d3a-8c52-b4ec6e983a96-config\") pod \"dnsmasq-dns-675f4bcbfc-89v2s\" (UID: \"c44929fa-d053-4d3a-8c52-b4ec6e983a96\") " 
pod="openstack/dnsmasq-dns-675f4bcbfc-89v2s" Oct 09 14:04:40 crc kubenswrapper[4902]: I1009 14:04:40.428397 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8c959e86-9eec-4c81-aae8-b53430ce2695-config\") pod \"dnsmasq-dns-78dd6ddcc-vhp5f\" (UID: \"8c959e86-9eec-4c81-aae8-b53430ce2695\") " pod="openstack/dnsmasq-dns-78dd6ddcc-vhp5f" Oct 09 14:04:40 crc kubenswrapper[4902]: I1009 14:04:40.428444 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8c959e86-9eec-4c81-aae8-b53430ce2695-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-vhp5f\" (UID: \"8c959e86-9eec-4c81-aae8-b53430ce2695\") " pod="openstack/dnsmasq-dns-78dd6ddcc-vhp5f" Oct 09 14:04:40 crc kubenswrapper[4902]: I1009 14:04:40.429589 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c44929fa-d053-4d3a-8c52-b4ec6e983a96-config\") pod \"dnsmasq-dns-675f4bcbfc-89v2s\" (UID: \"c44929fa-d053-4d3a-8c52-b4ec6e983a96\") " pod="openstack/dnsmasq-dns-675f4bcbfc-89v2s" Oct 09 14:04:40 crc kubenswrapper[4902]: I1009 14:04:40.461141 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ptd9l\" (UniqueName: \"kubernetes.io/projected/c44929fa-d053-4d3a-8c52-b4ec6e983a96-kube-api-access-ptd9l\") pod \"dnsmasq-dns-675f4bcbfc-89v2s\" (UID: \"c44929fa-d053-4d3a-8c52-b4ec6e983a96\") " pod="openstack/dnsmasq-dns-675f4bcbfc-89v2s" Oct 09 14:04:40 crc kubenswrapper[4902]: I1009 14:04:40.499540 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-89v2s" Oct 09 14:04:40 crc kubenswrapper[4902]: I1009 14:04:40.529971 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ttskw\" (UniqueName: \"kubernetes.io/projected/8c959e86-9eec-4c81-aae8-b53430ce2695-kube-api-access-ttskw\") pod \"dnsmasq-dns-78dd6ddcc-vhp5f\" (UID: \"8c959e86-9eec-4c81-aae8-b53430ce2695\") " pod="openstack/dnsmasq-dns-78dd6ddcc-vhp5f" Oct 09 14:04:40 crc kubenswrapper[4902]: I1009 14:04:40.530079 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8c959e86-9eec-4c81-aae8-b53430ce2695-config\") pod \"dnsmasq-dns-78dd6ddcc-vhp5f\" (UID: \"8c959e86-9eec-4c81-aae8-b53430ce2695\") " pod="openstack/dnsmasq-dns-78dd6ddcc-vhp5f" Oct 09 14:04:40 crc kubenswrapper[4902]: I1009 14:04:40.530105 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8c959e86-9eec-4c81-aae8-b53430ce2695-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-vhp5f\" (UID: \"8c959e86-9eec-4c81-aae8-b53430ce2695\") " pod="openstack/dnsmasq-dns-78dd6ddcc-vhp5f" Oct 09 14:04:40 crc kubenswrapper[4902]: I1009 14:04:40.531102 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8c959e86-9eec-4c81-aae8-b53430ce2695-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-vhp5f\" (UID: \"8c959e86-9eec-4c81-aae8-b53430ce2695\") " pod="openstack/dnsmasq-dns-78dd6ddcc-vhp5f" Oct 09 14:04:40 crc kubenswrapper[4902]: I1009 14:04:40.531156 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8c959e86-9eec-4c81-aae8-b53430ce2695-config\") pod \"dnsmasq-dns-78dd6ddcc-vhp5f\" 
(UID: \"8c959e86-9eec-4c81-aae8-b53430ce2695\") " pod="openstack/dnsmasq-dns-78dd6ddcc-vhp5f" Oct 09 14:04:40 crc kubenswrapper[4902]: I1009 14:04:40.548768 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ttskw\" (UniqueName: \"kubernetes.io/projected/8c959e86-9eec-4c81-aae8-b53430ce2695-kube-api-access-ttskw\") pod \"dnsmasq-dns-78dd6ddcc-vhp5f\" (UID: \"8c959e86-9eec-4c81-aae8-b53430ce2695\") " pod="openstack/dnsmasq-dns-78dd6ddcc-vhp5f" Oct 09 14:04:40 crc kubenswrapper[4902]: I1009 14:04:40.847634 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-vhp5f" Oct 09 14:04:40 crc kubenswrapper[4902]: I1009 14:04:40.940789 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-89v2s"] Oct 09 14:04:40 crc kubenswrapper[4902]: W1009 14:04:40.953777 4902 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc44929fa_d053_4d3a_8c52_b4ec6e983a96.slice/crio-7346a189d5d52443bd4f9ce83ae61351456541f1c2710eb68aefcd7abab3e230 WatchSource:0}: Error finding container 7346a189d5d52443bd4f9ce83ae61351456541f1c2710eb68aefcd7abab3e230: Status 404 returned error can't find the container with id 7346a189d5d52443bd4f9ce83ae61351456541f1c2710eb68aefcd7abab3e230 Oct 09 14:04:41 crc kubenswrapper[4902]: I1009 14:04:41.269330 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-vhp5f"] Oct 09 14:04:41 crc kubenswrapper[4902]: W1009 14:04:41.270354 4902 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8c959e86_9eec_4c81_aae8_b53430ce2695.slice/crio-2671be6613dfaa85520384400a57774d1162648f0009830a4f4c3ede44562c44 WatchSource:0}: Error finding container 2671be6613dfaa85520384400a57774d1162648f0009830a4f4c3ede44562c44: Status 404 returned error can't find the container with id 2671be6613dfaa85520384400a57774d1162648f0009830a4f4c3ede44562c44 Oct 09 14:04:41 crc kubenswrapper[4902]: I1009 14:04:41.723789 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-89v2s" event={"ID":"c44929fa-d053-4d3a-8c52-b4ec6e983a96","Type":"ContainerStarted","Data":"7346a189d5d52443bd4f9ce83ae61351456541f1c2710eb68aefcd7abab3e230"} Oct 09 14:04:41 crc kubenswrapper[4902]: I1009 14:04:41.725426 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-vhp5f" event={"ID":"8c959e86-9eec-4c81-aae8-b53430ce2695","Type":"ContainerStarted","Data":"2671be6613dfaa85520384400a57774d1162648f0009830a4f4c3ede44562c44"} Oct 09 14:04:43 crc kubenswrapper[4902]: I1009 14:04:43.077455 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-89v2s"] Oct 09 14:04:43 crc kubenswrapper[4902]: I1009 14:04:43.134499 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-rgtzl"] Oct 09 14:04:43 crc kubenswrapper[4902]: I1009 14:04:43.136759 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-rgtzl" Oct 09 14:04:43 crc kubenswrapper[4902]: I1009 14:04:43.145599 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-rgtzl"] Oct 09 14:04:43 crc kubenswrapper[4902]: I1009 14:04:43.277102 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/75f07b5d-6f2c-473f-ae50-725bcdf084af-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-rgtzl\" (UID: \"75f07b5d-6f2c-473f-ae50-725bcdf084af\") " pod="openstack/dnsmasq-dns-5ccc8479f9-rgtzl" Oct 09 14:04:43 crc kubenswrapper[4902]: I1009 14:04:43.277212 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bjh76\" (UniqueName: \"kubernetes.io/projected/75f07b5d-6f2c-473f-ae50-725bcdf084af-kube-api-access-bjh76\") pod \"dnsmasq-dns-5ccc8479f9-rgtzl\" (UID: \"75f07b5d-6f2c-473f-ae50-725bcdf084af\") " pod="openstack/dnsmasq-dns-5ccc8479f9-rgtzl" Oct 09 14:04:43 crc kubenswrapper[4902]: I1009 14:04:43.277298 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/75f07b5d-6f2c-473f-ae50-725bcdf084af-config\") pod \"dnsmasq-dns-5ccc8479f9-rgtzl\" (UID: \"75f07b5d-6f2c-473f-ae50-725bcdf084af\") " pod="openstack/dnsmasq-dns-5ccc8479f9-rgtzl" Oct 09 14:04:43 crc kubenswrapper[4902]: I1009 14:04:43.378340 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/75f07b5d-6f2c-473f-ae50-725bcdf084af-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-rgtzl\" (UID: \"75f07b5d-6f2c-473f-ae50-725bcdf084af\") " pod="openstack/dnsmasq-dns-5ccc8479f9-rgtzl" Oct 09 14:04:43 crc kubenswrapper[4902]: I1009 14:04:43.378503 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bjh76\" (UniqueName: \"kubernetes.io/projected/75f07b5d-6f2c-473f-ae50-725bcdf084af-kube-api-access-bjh76\") pod \"dnsmasq-dns-5ccc8479f9-rgtzl\" (UID: \"75f07b5d-6f2c-473f-ae50-725bcdf084af\") " pod="openstack/dnsmasq-dns-5ccc8479f9-rgtzl" Oct 09 14:04:43 crc kubenswrapper[4902]: I1009 14:04:43.378579 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/75f07b5d-6f2c-473f-ae50-725bcdf084af-config\") pod \"dnsmasq-dns-5ccc8479f9-rgtzl\" (UID: \"75f07b5d-6f2c-473f-ae50-725bcdf084af\") " pod="openstack/dnsmasq-dns-5ccc8479f9-rgtzl" Oct 09 14:04:43 crc kubenswrapper[4902]: I1009 14:04:43.379547 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/75f07b5d-6f2c-473f-ae50-725bcdf084af-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-rgtzl\" (UID: \"75f07b5d-6f2c-473f-ae50-725bcdf084af\") " pod="openstack/dnsmasq-dns-5ccc8479f9-rgtzl" Oct 09 14:04:43 crc kubenswrapper[4902]: I1009 14:04:43.379643 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/75f07b5d-6f2c-473f-ae50-725bcdf084af-config\") pod \"dnsmasq-dns-5ccc8479f9-rgtzl\" (UID: \"75f07b5d-6f2c-473f-ae50-725bcdf084af\") " pod="openstack/dnsmasq-dns-5ccc8479f9-rgtzl" Oct 09 14:04:43 crc kubenswrapper[4902]: I1009 14:04:43.418510 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bjh76\" (UniqueName: 
\"kubernetes.io/projected/75f07b5d-6f2c-473f-ae50-725bcdf084af-kube-api-access-bjh76\") pod \"dnsmasq-dns-5ccc8479f9-rgtzl\" (UID: \"75f07b5d-6f2c-473f-ae50-725bcdf084af\") " pod="openstack/dnsmasq-dns-5ccc8479f9-rgtzl" Oct 09 14:04:43 crc kubenswrapper[4902]: I1009 14:04:43.439572 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-vhp5f"] Oct 09 14:04:43 crc kubenswrapper[4902]: I1009 14:04:43.472567 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-rgtzl" Oct 09 14:04:43 crc kubenswrapper[4902]: I1009 14:04:43.474979 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-vtncl"] Oct 09 14:04:43 crc kubenswrapper[4902]: I1009 14:04:43.483600 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-vtncl" Oct 09 14:04:43 crc kubenswrapper[4902]: I1009 14:04:43.491476 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-vtncl"] Oct 09 14:04:43 crc kubenswrapper[4902]: I1009 14:04:43.581761 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ad8edf23-ca42-4f81-a01a-4bd897f23934-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-vtncl\" (UID: \"ad8edf23-ca42-4f81-a01a-4bd897f23934\") " pod="openstack/dnsmasq-dns-57d769cc4f-vtncl" Oct 09 14:04:43 crc kubenswrapper[4902]: I1009 14:04:43.581901 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z96fc\" (UniqueName: \"kubernetes.io/projected/ad8edf23-ca42-4f81-a01a-4bd897f23934-kube-api-access-z96fc\") pod \"dnsmasq-dns-57d769cc4f-vtncl\" (UID: \"ad8edf23-ca42-4f81-a01a-4bd897f23934\") " pod="openstack/dnsmasq-dns-57d769cc4f-vtncl" Oct 09 14:04:43 crc kubenswrapper[4902]: I1009 14:04:43.581954 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ad8edf23-ca42-4f81-a01a-4bd897f23934-config\") pod \"dnsmasq-dns-57d769cc4f-vtncl\" (UID: \"ad8edf23-ca42-4f81-a01a-4bd897f23934\") " pod="openstack/dnsmasq-dns-57d769cc4f-vtncl" Oct 09 14:04:43 crc kubenswrapper[4902]: I1009 14:04:43.683875 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z96fc\" (UniqueName: \"kubernetes.io/projected/ad8edf23-ca42-4f81-a01a-4bd897f23934-kube-api-access-z96fc\") pod \"dnsmasq-dns-57d769cc4f-vtncl\" (UID: \"ad8edf23-ca42-4f81-a01a-4bd897f23934\") " pod="openstack/dnsmasq-dns-57d769cc4f-vtncl" Oct 09 14:04:43 crc kubenswrapper[4902]: I1009 14:04:43.684263 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ad8edf23-ca42-4f81-a01a-4bd897f23934-config\") pod \"dnsmasq-dns-57d769cc4f-vtncl\" (UID: \"ad8edf23-ca42-4f81-a01a-4bd897f23934\") " pod="openstack/dnsmasq-dns-57d769cc4f-vtncl" Oct 09 14:04:43 crc kubenswrapper[4902]: I1009 14:04:43.684457 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ad8edf23-ca42-4f81-a01a-4bd897f23934-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-vtncl\" (UID: \"ad8edf23-ca42-4f81-a01a-4bd897f23934\") " pod="openstack/dnsmasq-dns-57d769cc4f-vtncl" Oct 09 14:04:43 crc kubenswrapper[4902]: I1009 14:04:43.685349 4902 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ad8edf23-ca42-4f81-a01a-4bd897f23934-config\") pod \"dnsmasq-dns-57d769cc4f-vtncl\" (UID: \"ad8edf23-ca42-4f81-a01a-4bd897f23934\") " pod="openstack/dnsmasq-dns-57d769cc4f-vtncl" Oct 09 14:04:43 crc kubenswrapper[4902]: I1009 14:04:43.685687 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ad8edf23-ca42-4f81-a01a-4bd897f23934-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-vtncl\" (UID: \"ad8edf23-ca42-4f81-a01a-4bd897f23934\") " pod="openstack/dnsmasq-dns-57d769cc4f-vtncl" Oct 09 14:04:43 crc kubenswrapper[4902]: I1009 14:04:43.712839 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z96fc\" (UniqueName: \"kubernetes.io/projected/ad8edf23-ca42-4f81-a01a-4bd897f23934-kube-api-access-z96fc\") pod \"dnsmasq-dns-57d769cc4f-vtncl\" (UID: \"ad8edf23-ca42-4f81-a01a-4bd897f23934\") " pod="openstack/dnsmasq-dns-57d769cc4f-vtncl" Oct 09 14:04:43 crc kubenswrapper[4902]: I1009 14:04:43.828838 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-vtncl" Oct 09 14:04:44 crc kubenswrapper[4902]: I1009 14:04:44.269369 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 09 14:04:44 crc kubenswrapper[4902]: I1009 14:04:44.279650 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 09 14:04:44 crc kubenswrapper[4902]: I1009 14:04:44.281893 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Oct 09 14:04:44 crc kubenswrapper[4902]: I1009 14:04:44.287803 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Oct 09 14:04:44 crc kubenswrapper[4902]: I1009 14:04:44.288230 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Oct 09 14:04:44 crc kubenswrapper[4902]: I1009 14:04:44.288710 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-mbwcp" Oct 09 14:04:44 crc kubenswrapper[4902]: I1009 14:04:44.288906 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Oct 09 14:04:44 crc kubenswrapper[4902]: I1009 14:04:44.289107 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Oct 09 14:04:44 crc kubenswrapper[4902]: I1009 14:04:44.289298 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Oct 09 14:04:44 crc kubenswrapper[4902]: I1009 14:04:44.296179 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 09 14:04:44 crc kubenswrapper[4902]: I1009 14:04:44.394705 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/5cdfacc8-b636-448a-bdc9-30b7a851aa8f-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"5cdfacc8-b636-448a-bdc9-30b7a851aa8f\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 14:04:44 crc kubenswrapper[4902]: I1009 14:04:44.394788 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/5cdfacc8-b636-448a-bdc9-30b7a851aa8f-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"5cdfacc8-b636-448a-bdc9-30b7a851aa8f\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 14:04:44 crc kubenswrapper[4902]: I1009 14:04:44.394833 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"5cdfacc8-b636-448a-bdc9-30b7a851aa8f\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 14:04:44 crc kubenswrapper[4902]: I1009 14:04:44.394903 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/5cdfacc8-b636-448a-bdc9-30b7a851aa8f-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"5cdfacc8-b636-448a-bdc9-30b7a851aa8f\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 14:04:44 crc kubenswrapper[4902]: I1009 14:04:44.394949 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lvw9p\" (UniqueName: \"kubernetes.io/projected/5cdfacc8-b636-448a-bdc9-30b7a851aa8f-kube-api-access-lvw9p\") pod \"rabbitmq-cell1-server-0\" (UID: \"5cdfacc8-b636-448a-bdc9-30b7a851aa8f\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 14:04:44 crc kubenswrapper[4902]: I1009 14:04:44.394974 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/5cdfacc8-b636-448a-bdc9-30b7a851aa8f-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"5cdfacc8-b636-448a-bdc9-30b7a851aa8f\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 14:04:44 crc kubenswrapper[4902]: I1009 14:04:44.396000 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/5cdfacc8-b636-448a-bdc9-30b7a851aa8f-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"5cdfacc8-b636-448a-bdc9-30b7a851aa8f\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 14:04:44 crc kubenswrapper[4902]: I1009 14:04:44.396066 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/5cdfacc8-b636-448a-bdc9-30b7a851aa8f-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"5cdfacc8-b636-448a-bdc9-30b7a851aa8f\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 14:04:44 crc kubenswrapper[4902]: I1009 14:04:44.396126 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5cdfacc8-b636-448a-bdc9-30b7a851aa8f-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"5cdfacc8-b636-448a-bdc9-30b7a851aa8f\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 14:04:44 crc kubenswrapper[4902]: I1009 14:04:44.396188 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/5cdfacc8-b636-448a-bdc9-30b7a851aa8f-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"5cdfacc8-b636-448a-bdc9-30b7a851aa8f\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 14:04:44 crc kubenswrapper[4902]: I1009 14:04:44.396296 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"server-conf\" (UniqueName: \"kubernetes.io/configmap/5cdfacc8-b636-448a-bdc9-30b7a851aa8f-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"5cdfacc8-b636-448a-bdc9-30b7a851aa8f\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 14:04:44 crc kubenswrapper[4902]: I1009 14:04:44.497806 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/5cdfacc8-b636-448a-bdc9-30b7a851aa8f-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"5cdfacc8-b636-448a-bdc9-30b7a851aa8f\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 14:04:44 crc kubenswrapper[4902]: I1009 14:04:44.497889 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/5cdfacc8-b636-448a-bdc9-30b7a851aa8f-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"5cdfacc8-b636-448a-bdc9-30b7a851aa8f\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 14:04:44 crc kubenswrapper[4902]: I1009 14:04:44.497916 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/5cdfacc8-b636-448a-bdc9-30b7a851aa8f-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"5cdfacc8-b636-448a-bdc9-30b7a851aa8f\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 14:04:44 crc kubenswrapper[4902]: I1009 14:04:44.497945 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"5cdfacc8-b636-448a-bdc9-30b7a851aa8f\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 14:04:44 crc kubenswrapper[4902]: I1009 14:04:44.497987 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/5cdfacc8-b636-448a-bdc9-30b7a851aa8f-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"5cdfacc8-b636-448a-bdc9-30b7a851aa8f\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 14:04:44 crc kubenswrapper[4902]: I1009 14:04:44.498027 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lvw9p\" (UniqueName: \"kubernetes.io/projected/5cdfacc8-b636-448a-bdc9-30b7a851aa8f-kube-api-access-lvw9p\") pod \"rabbitmq-cell1-server-0\" (UID: \"5cdfacc8-b636-448a-bdc9-30b7a851aa8f\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 14:04:44 crc kubenswrapper[4902]: I1009 14:04:44.498055 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/5cdfacc8-b636-448a-bdc9-30b7a851aa8f-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"5cdfacc8-b636-448a-bdc9-30b7a851aa8f\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 14:04:44 crc kubenswrapper[4902]: I1009 14:04:44.498081 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/5cdfacc8-b636-448a-bdc9-30b7a851aa8f-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"5cdfacc8-b636-448a-bdc9-30b7a851aa8f\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 14:04:44 crc kubenswrapper[4902]: I1009 14:04:44.498112 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/5cdfacc8-b636-448a-bdc9-30b7a851aa8f-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"5cdfacc8-b636-448a-bdc9-30b7a851aa8f\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 14:04:44 crc kubenswrapper[4902]: I1009 14:04:44.498134 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5cdfacc8-b636-448a-bdc9-30b7a851aa8f-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"5cdfacc8-b636-448a-bdc9-30b7a851aa8f\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 14:04:44 crc kubenswrapper[4902]: I1009 14:04:44.498152 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/5cdfacc8-b636-448a-bdc9-30b7a851aa8f-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"5cdfacc8-b636-448a-bdc9-30b7a851aa8f\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 14:04:44 crc kubenswrapper[4902]: I1009 14:04:44.498482 4902 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"5cdfacc8-b636-448a-bdc9-30b7a851aa8f\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/rabbitmq-cell1-server-0" Oct 09 14:04:44 crc kubenswrapper[4902]: I1009 14:04:44.499320 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/5cdfacc8-b636-448a-bdc9-30b7a851aa8f-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"5cdfacc8-b636-448a-bdc9-30b7a851aa8f\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 14:04:44 crc kubenswrapper[4902]: I1009 14:04:44.500738 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/5cdfacc8-b636-448a-bdc9-30b7a851aa8f-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"5cdfacc8-b636-448a-bdc9-30b7a851aa8f\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 14:04:44 crc kubenswrapper[4902]: I1009 14:04:44.500788 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/5cdfacc8-b636-448a-bdc9-30b7a851aa8f-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"5cdfacc8-b636-448a-bdc9-30b7a851aa8f\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 14:04:44 crc kubenswrapper[4902]: I1009 14:04:44.500752 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/5cdfacc8-b636-448a-bdc9-30b7a851aa8f-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"5cdfacc8-b636-448a-bdc9-30b7a851aa8f\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 14:04:44 crc kubenswrapper[4902]: I1009 14:04:44.501535 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5cdfacc8-b636-448a-bdc9-30b7a851aa8f-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"5cdfacc8-b636-448a-bdc9-30b7a851aa8f\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 14:04:44 crc kubenswrapper[4902]: I1009 14:04:44.503766 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/5cdfacc8-b636-448a-bdc9-30b7a851aa8f-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"5cdfacc8-b636-448a-bdc9-30b7a851aa8f\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 14:04:44 crc kubenswrapper[4902]: I1009 14:04:44.504161 4902 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/5cdfacc8-b636-448a-bdc9-30b7a851aa8f-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"5cdfacc8-b636-448a-bdc9-30b7a851aa8f\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 14:04:44 crc kubenswrapper[4902]: I1009 14:04:44.504264 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/5cdfacc8-b636-448a-bdc9-30b7a851aa8f-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"5cdfacc8-b636-448a-bdc9-30b7a851aa8f\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 14:04:44 crc kubenswrapper[4902]: I1009 14:04:44.505700 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/5cdfacc8-b636-448a-bdc9-30b7a851aa8f-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"5cdfacc8-b636-448a-bdc9-30b7a851aa8f\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 14:04:44 crc kubenswrapper[4902]: I1009 14:04:44.517367 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lvw9p\" (UniqueName: \"kubernetes.io/projected/5cdfacc8-b636-448a-bdc9-30b7a851aa8f-kube-api-access-lvw9p\") pod \"rabbitmq-cell1-server-0\" (UID: \"5cdfacc8-b636-448a-bdc9-30b7a851aa8f\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 14:04:44 crc kubenswrapper[4902]: I1009 14:04:44.534663 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"5cdfacc8-b636-448a-bdc9-30b7a851aa8f\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 14:04:44 crc kubenswrapper[4902]: I1009 14:04:44.607655 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Oct 09 14:04:44 crc kubenswrapper[4902]: I1009 14:04:44.608963 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 09 14:04:44 crc kubenswrapper[4902]: I1009 14:04:44.609180 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 09 14:04:44 crc kubenswrapper[4902]: I1009 14:04:44.614092 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Oct 09 14:04:44 crc kubenswrapper[4902]: I1009 14:04:44.614493 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-6qng5" Oct 09 14:04:44 crc kubenswrapper[4902]: I1009 14:04:44.614573 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Oct 09 14:04:44 crc kubenswrapper[4902]: I1009 14:04:44.614872 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Oct 09 14:04:44 crc kubenswrapper[4902]: I1009 14:04:44.617101 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Oct 09 14:04:44 crc kubenswrapper[4902]: I1009 14:04:44.618392 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Oct 09 14:04:44 crc kubenswrapper[4902]: I1009 14:04:44.628528 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Oct 09 14:04:44 crc kubenswrapper[4902]: I1009 14:04:44.640813 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 09 14:04:44 crc kubenswrapper[4902]: I1009 14:04:44.702482 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"c9c6af38-1605-4d47-bc0c-967053235667\") " pod="openstack/rabbitmq-server-0" Oct 09 14:04:44 crc kubenswrapper[4902]: I1009 14:04:44.702545 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c9c6af38-1605-4d47-bc0c-967053235667-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"c9c6af38-1605-4d47-bc0c-967053235667\") " pod="openstack/rabbitmq-server-0" Oct 09 14:04:44 crc kubenswrapper[4902]: I1009 14:04:44.702582 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c9c6af38-1605-4d47-bc0c-967053235667-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"c9c6af38-1605-4d47-bc0c-967053235667\") " pod="openstack/rabbitmq-server-0" Oct 09 14:04:44 crc kubenswrapper[4902]: I1009 14:04:44.702635 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c9c6af38-1605-4d47-bc0c-967053235667-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"c9c6af38-1605-4d47-bc0c-967053235667\") " pod="openstack/rabbitmq-server-0" Oct 09 14:04:44 crc kubenswrapper[4902]: I1009 14:04:44.702679 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c9c6af38-1605-4d47-bc0c-967053235667-config-data\") pod \"rabbitmq-server-0\" (UID: \"c9c6af38-1605-4d47-bc0c-967053235667\") " pod="openstack/rabbitmq-server-0" Oct 09 14:04:44 crc kubenswrapper[4902]: I1009 14:04:44.702753 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c9c6af38-1605-4d47-bc0c-967053235667-server-conf\") 
pod \"rabbitmq-server-0\" (UID: \"c9c6af38-1605-4d47-bc0c-967053235667\") " pod="openstack/rabbitmq-server-0" Oct 09 14:04:44 crc kubenswrapper[4902]: I1009 14:04:44.702778 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5tj74\" (UniqueName: \"kubernetes.io/projected/c9c6af38-1605-4d47-bc0c-967053235667-kube-api-access-5tj74\") pod \"rabbitmq-server-0\" (UID: \"c9c6af38-1605-4d47-bc0c-967053235667\") " pod="openstack/rabbitmq-server-0" Oct 09 14:04:44 crc kubenswrapper[4902]: I1009 14:04:44.702809 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c9c6af38-1605-4d47-bc0c-967053235667-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"c9c6af38-1605-4d47-bc0c-967053235667\") " pod="openstack/rabbitmq-server-0" Oct 09 14:04:44 crc kubenswrapper[4902]: I1009 14:04:44.702834 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c9c6af38-1605-4d47-bc0c-967053235667-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"c9c6af38-1605-4d47-bc0c-967053235667\") " pod="openstack/rabbitmq-server-0" Oct 09 14:04:44 crc kubenswrapper[4902]: I1009 14:04:44.702862 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c9c6af38-1605-4d47-bc0c-967053235667-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"c9c6af38-1605-4d47-bc0c-967053235667\") " pod="openstack/rabbitmq-server-0" Oct 09 14:04:44 crc kubenswrapper[4902]: I1009 14:04:44.702887 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c9c6af38-1605-4d47-bc0c-967053235667-pod-info\") pod \"rabbitmq-server-0\" (UID: \"c9c6af38-1605-4d47-bc0c-967053235667\") " pod="openstack/rabbitmq-server-0" Oct 09 14:04:44 crc kubenswrapper[4902]: I1009 14:04:44.804158 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c9c6af38-1605-4d47-bc0c-967053235667-server-conf\") pod \"rabbitmq-server-0\" (UID: \"c9c6af38-1605-4d47-bc0c-967053235667\") " pod="openstack/rabbitmq-server-0" Oct 09 14:04:44 crc kubenswrapper[4902]: I1009 14:04:44.804201 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5tj74\" (UniqueName: \"kubernetes.io/projected/c9c6af38-1605-4d47-bc0c-967053235667-kube-api-access-5tj74\") pod \"rabbitmq-server-0\" (UID: \"c9c6af38-1605-4d47-bc0c-967053235667\") " pod="openstack/rabbitmq-server-0" Oct 09 14:04:44 crc kubenswrapper[4902]: I1009 14:04:44.804229 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c9c6af38-1605-4d47-bc0c-967053235667-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"c9c6af38-1605-4d47-bc0c-967053235667\") " pod="openstack/rabbitmq-server-0" Oct 09 14:04:44 crc kubenswrapper[4902]: I1009 14:04:44.804252 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c9c6af38-1605-4d47-bc0c-967053235667-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"c9c6af38-1605-4d47-bc0c-967053235667\") " pod="openstack/rabbitmq-server-0" 
Oct 09 14:04:44 crc kubenswrapper[4902]: I1009 14:04:44.804279 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c9c6af38-1605-4d47-bc0c-967053235667-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"c9c6af38-1605-4d47-bc0c-967053235667\") " pod="openstack/rabbitmq-server-0" Oct 09 14:04:44 crc kubenswrapper[4902]: I1009 14:04:44.804302 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c9c6af38-1605-4d47-bc0c-967053235667-pod-info\") pod \"rabbitmq-server-0\" (UID: \"c9c6af38-1605-4d47-bc0c-967053235667\") " pod="openstack/rabbitmq-server-0" Oct 09 14:04:44 crc kubenswrapper[4902]: I1009 14:04:44.804345 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"c9c6af38-1605-4d47-bc0c-967053235667\") " pod="openstack/rabbitmq-server-0" Oct 09 14:04:44 crc kubenswrapper[4902]: I1009 14:04:44.804363 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c9c6af38-1605-4d47-bc0c-967053235667-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"c9c6af38-1605-4d47-bc0c-967053235667\") " pod="openstack/rabbitmq-server-0" Oct 09 14:04:44 crc kubenswrapper[4902]: I1009 14:04:44.804385 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c9c6af38-1605-4d47-bc0c-967053235667-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"c9c6af38-1605-4d47-bc0c-967053235667\") " pod="openstack/rabbitmq-server-0" Oct 09 14:04:44 crc kubenswrapper[4902]: I1009 14:04:44.804399 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c9c6af38-1605-4d47-bc0c-967053235667-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"c9c6af38-1605-4d47-bc0c-967053235667\") " pod="openstack/rabbitmq-server-0" Oct 09 14:04:44 crc kubenswrapper[4902]: I1009 14:04:44.804447 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c9c6af38-1605-4d47-bc0c-967053235667-config-data\") pod \"rabbitmq-server-0\" (UID: \"c9c6af38-1605-4d47-bc0c-967053235667\") " pod="openstack/rabbitmq-server-0" Oct 09 14:04:44 crc kubenswrapper[4902]: I1009 14:04:44.804564 4902 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"c9c6af38-1605-4d47-bc0c-967053235667\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/rabbitmq-server-0" Oct 09 14:04:44 crc kubenswrapper[4902]: I1009 14:04:44.805436 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c9c6af38-1605-4d47-bc0c-967053235667-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"c9c6af38-1605-4d47-bc0c-967053235667\") " pod="openstack/rabbitmq-server-0" Oct 09 14:04:44 crc kubenswrapper[4902]: I1009 14:04:44.806052 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c9c6af38-1605-4d47-bc0c-967053235667-rabbitmq-plugins\") 
pod \"rabbitmq-server-0\" (UID: \"c9c6af38-1605-4d47-bc0c-967053235667\") " pod="openstack/rabbitmq-server-0" Oct 09 14:04:44 crc kubenswrapper[4902]: I1009 14:04:44.806332 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c9c6af38-1605-4d47-bc0c-967053235667-config-data\") pod \"rabbitmq-server-0\" (UID: \"c9c6af38-1605-4d47-bc0c-967053235667\") " pod="openstack/rabbitmq-server-0" Oct 09 14:04:44 crc kubenswrapper[4902]: I1009 14:04:44.806418 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c9c6af38-1605-4d47-bc0c-967053235667-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"c9c6af38-1605-4d47-bc0c-967053235667\") " pod="openstack/rabbitmq-server-0" Oct 09 14:04:44 crc kubenswrapper[4902]: I1009 14:04:44.807077 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c9c6af38-1605-4d47-bc0c-967053235667-server-conf\") pod \"rabbitmq-server-0\" (UID: \"c9c6af38-1605-4d47-bc0c-967053235667\") " pod="openstack/rabbitmq-server-0" Oct 09 14:04:44 crc kubenswrapper[4902]: I1009 14:04:44.809986 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c9c6af38-1605-4d47-bc0c-967053235667-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"c9c6af38-1605-4d47-bc0c-967053235667\") " pod="openstack/rabbitmq-server-0" Oct 09 14:04:44 crc kubenswrapper[4902]: I1009 14:04:44.810332 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c9c6af38-1605-4d47-bc0c-967053235667-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"c9c6af38-1605-4d47-bc0c-967053235667\") " pod="openstack/rabbitmq-server-0" Oct 09 14:04:44 crc kubenswrapper[4902]: I1009 14:04:44.810377 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c9c6af38-1605-4d47-bc0c-967053235667-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"c9c6af38-1605-4d47-bc0c-967053235667\") " pod="openstack/rabbitmq-server-0" Oct 09 14:04:44 crc kubenswrapper[4902]: I1009 14:04:44.822953 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c9c6af38-1605-4d47-bc0c-967053235667-pod-info\") pod \"rabbitmq-server-0\" (UID: \"c9c6af38-1605-4d47-bc0c-967053235667\") " pod="openstack/rabbitmq-server-0" Oct 09 14:04:44 crc kubenswrapper[4902]: I1009 14:04:44.824231 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5tj74\" (UniqueName: \"kubernetes.io/projected/c9c6af38-1605-4d47-bc0c-967053235667-kube-api-access-5tj74\") pod \"rabbitmq-server-0\" (UID: \"c9c6af38-1605-4d47-bc0c-967053235667\") " pod="openstack/rabbitmq-server-0" Oct 09 14:04:44 crc kubenswrapper[4902]: I1009 14:04:44.847988 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"c9c6af38-1605-4d47-bc0c-967053235667\") " pod="openstack/rabbitmq-server-0" Oct 09 14:04:44 crc kubenswrapper[4902]: I1009 14:04:44.942372 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 09 14:04:47 crc kubenswrapper[4902]: I1009 14:04:47.116803 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Oct 09 14:04:47 crc kubenswrapper[4902]: I1009 14:04:47.123334 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Oct 09 14:04:47 crc kubenswrapper[4902]: I1009 14:04:47.127143 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-t5dww" Oct 09 14:04:47 crc kubenswrapper[4902]: I1009 14:04:47.127188 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Oct 09 14:04:47 crc kubenswrapper[4902]: I1009 14:04:47.127618 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Oct 09 14:04:47 crc kubenswrapper[4902]: I1009 14:04:47.127788 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Oct 09 14:04:47 crc kubenswrapper[4902]: I1009 14:04:47.128156 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Oct 09 14:04:47 crc kubenswrapper[4902]: I1009 14:04:47.128242 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Oct 09 14:04:47 crc kubenswrapper[4902]: I1009 14:04:47.135199 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Oct 09 14:04:47 crc kubenswrapper[4902]: I1009 14:04:47.238361 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 09 14:04:47 crc kubenswrapper[4902]: I1009 14:04:47.239660 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Oct 09 14:04:47 crc kubenswrapper[4902]: I1009 14:04:47.241951 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Oct 09 14:04:47 crc kubenswrapper[4902]: I1009 14:04:47.241951 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-9fmwl" Oct 09 14:04:47 crc kubenswrapper[4902]: I1009 14:04:47.242283 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Oct 09 14:04:47 crc kubenswrapper[4902]: I1009 14:04:47.245568 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Oct 09 14:04:47 crc kubenswrapper[4902]: I1009 14:04:47.247882 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/4bb5cfb3-cfb7-4408-bbe1-3bf59b44614e-secrets\") pod \"openstack-galera-0\" (UID: \"4bb5cfb3-cfb7-4408-bbe1-3bf59b44614e\") " pod="openstack/openstack-galera-0" Oct 09 14:04:47 crc kubenswrapper[4902]: I1009 14:04:47.247938 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/4bb5cfb3-cfb7-4408-bbe1-3bf59b44614e-config-data-default\") pod \"openstack-galera-0\" (UID: \"4bb5cfb3-cfb7-4408-bbe1-3bf59b44614e\") " pod="openstack/openstack-galera-0" Oct 09 14:04:47 crc kubenswrapper[4902]: I1009 14:04:47.247975 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4bb5cfb3-cfb7-4408-bbe1-3bf59b44614e-operator-scripts\") pod \"openstack-galera-0\" (UID: \"4bb5cfb3-cfb7-4408-bbe1-3bf59b44614e\") " pod="openstack/openstack-galera-0" Oct 09 14:04:47 crc kubenswrapper[4902]: I1009 14:04:47.248013 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4bb5cfb3-cfb7-4408-bbe1-3bf59b44614e-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"4bb5cfb3-cfb7-4408-bbe1-3bf59b44614e\") " pod="openstack/openstack-galera-0" Oct 09 14:04:47 crc kubenswrapper[4902]: I1009 14:04:47.248059 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/4bb5cfb3-cfb7-4408-bbe1-3bf59b44614e-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"4bb5cfb3-cfb7-4408-bbe1-3bf59b44614e\") " pod="openstack/openstack-galera-0" Oct 09 14:04:47 crc kubenswrapper[4902]: I1009 14:04:47.248085 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-galera-0\" (UID: \"4bb5cfb3-cfb7-4408-bbe1-3bf59b44614e\") " pod="openstack/openstack-galera-0" Oct 09 14:04:47 crc kubenswrapper[4902]: I1009 14:04:47.248119 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/4bb5cfb3-cfb7-4408-bbe1-3bf59b44614e-kolla-config\") pod \"openstack-galera-0\" (UID: \"4bb5cfb3-cfb7-4408-bbe1-3bf59b44614e\") " pod="openstack/openstack-galera-0" Oct 09 14:04:47 crc kubenswrapper[4902]: I1009 14:04:47.248159 4902 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dlm5g\" (UniqueName: \"kubernetes.io/projected/4bb5cfb3-cfb7-4408-bbe1-3bf59b44614e-kube-api-access-dlm5g\") pod \"openstack-galera-0\" (UID: \"4bb5cfb3-cfb7-4408-bbe1-3bf59b44614e\") " pod="openstack/openstack-galera-0" Oct 09 14:04:47 crc kubenswrapper[4902]: I1009 14:04:47.248226 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/4bb5cfb3-cfb7-4408-bbe1-3bf59b44614e-config-data-generated\") pod \"openstack-galera-0\" (UID: \"4bb5cfb3-cfb7-4408-bbe1-3bf59b44614e\") " pod="openstack/openstack-galera-0" Oct 09 14:04:47 crc kubenswrapper[4902]: I1009 14:04:47.251302 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 09 14:04:47 crc kubenswrapper[4902]: I1009 14:04:47.349459 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dlm5g\" (UniqueName: \"kubernetes.io/projected/4bb5cfb3-cfb7-4408-bbe1-3bf59b44614e-kube-api-access-dlm5g\") pod \"openstack-galera-0\" (UID: \"4bb5cfb3-cfb7-4408-bbe1-3bf59b44614e\") " pod="openstack/openstack-galera-0" Oct 09 14:04:47 crc kubenswrapper[4902]: I1009 14:04:47.349511 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/a4dec46d-073a-484c-ba80-0ff939025e48-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"a4dec46d-073a-484c-ba80-0ff939025e48\") " pod="openstack/openstack-cell1-galera-0" Oct 09 14:04:47 crc kubenswrapper[4902]: I1009 14:04:47.349551 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4dec46d-073a-484c-ba80-0ff939025e48-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"a4dec46d-073a-484c-ba80-0ff939025e48\") " pod="openstack/openstack-cell1-galera-0" Oct 09 14:04:47 crc kubenswrapper[4902]: I1009 14:04:47.349589 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4dec46d-073a-484c-ba80-0ff939025e48-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"a4dec46d-073a-484c-ba80-0ff939025e48\") " pod="openstack/openstack-cell1-galera-0" Oct 09 14:04:47 crc kubenswrapper[4902]: I1009 14:04:47.349620 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/a4dec46d-073a-484c-ba80-0ff939025e48-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"a4dec46d-073a-484c-ba80-0ff939025e48\") " pod="openstack/openstack-cell1-galera-0" Oct 09 14:04:47 crc kubenswrapper[4902]: I1009 14:04:47.349658 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/4bb5cfb3-cfb7-4408-bbe1-3bf59b44614e-config-data-generated\") pod \"openstack-galera-0\" (UID: \"4bb5cfb3-cfb7-4408-bbe1-3bf59b44614e\") " pod="openstack/openstack-galera-0" Oct 09 14:04:47 crc kubenswrapper[4902]: I1009 14:04:47.349688 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/4bb5cfb3-cfb7-4408-bbe1-3bf59b44614e-secrets\") pod \"openstack-galera-0\" (UID: 
\"4bb5cfb3-cfb7-4408-bbe1-3bf59b44614e\") " pod="openstack/openstack-galera-0" Oct 09 14:04:47 crc kubenswrapper[4902]: I1009 14:04:47.349711 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/4bb5cfb3-cfb7-4408-bbe1-3bf59b44614e-config-data-default\") pod \"openstack-galera-0\" (UID: \"4bb5cfb3-cfb7-4408-bbe1-3bf59b44614e\") " pod="openstack/openstack-galera-0" Oct 09 14:04:47 crc kubenswrapper[4902]: I1009 14:04:47.349738 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4bb5cfb3-cfb7-4408-bbe1-3bf59b44614e-operator-scripts\") pod \"openstack-galera-0\" (UID: \"4bb5cfb3-cfb7-4408-bbe1-3bf59b44614e\") " pod="openstack/openstack-galera-0" Oct 09 14:04:47 crc kubenswrapper[4902]: I1009 14:04:47.349767 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-cell1-galera-0\" (UID: \"a4dec46d-073a-484c-ba80-0ff939025e48\") " pod="openstack/openstack-cell1-galera-0" Oct 09 14:04:47 crc kubenswrapper[4902]: I1009 14:04:47.349794 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4bb5cfb3-cfb7-4408-bbe1-3bf59b44614e-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"4bb5cfb3-cfb7-4408-bbe1-3bf59b44614e\") " pod="openstack/openstack-galera-0" Oct 09 14:04:47 crc kubenswrapper[4902]: I1009 14:04:47.349830 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7hx2n\" (UniqueName: \"kubernetes.io/projected/a4dec46d-073a-484c-ba80-0ff939025e48-kube-api-access-7hx2n\") pod \"openstack-cell1-galera-0\" (UID: \"a4dec46d-073a-484c-ba80-0ff939025e48\") " pod="openstack/openstack-cell1-galera-0" Oct 09 14:04:47 crc kubenswrapper[4902]: I1009 14:04:47.349852 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/a4dec46d-073a-484c-ba80-0ff939025e48-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"a4dec46d-073a-484c-ba80-0ff939025e48\") " pod="openstack/openstack-cell1-galera-0" Oct 09 14:04:47 crc kubenswrapper[4902]: I1009 14:04:47.349879 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/4bb5cfb3-cfb7-4408-bbe1-3bf59b44614e-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"4bb5cfb3-cfb7-4408-bbe1-3bf59b44614e\") " pod="openstack/openstack-galera-0" Oct 09 14:04:47 crc kubenswrapper[4902]: I1009 14:04:47.349906 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/a4dec46d-073a-484c-ba80-0ff939025e48-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"a4dec46d-073a-484c-ba80-0ff939025e48\") " pod="openstack/openstack-cell1-galera-0" Oct 09 14:04:47 crc kubenswrapper[4902]: I1009 14:04:47.349930 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-galera-0\" (UID: \"4bb5cfb3-cfb7-4408-bbe1-3bf59b44614e\") " pod="openstack/openstack-galera-0" Oct 09 14:04:47 crc 
kubenswrapper[4902]: I1009 14:04:47.349950 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a4dec46d-073a-484c-ba80-0ff939025e48-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"a4dec46d-073a-484c-ba80-0ff939025e48\") " pod="openstack/openstack-cell1-galera-0" Oct 09 14:04:47 crc kubenswrapper[4902]: I1009 14:04:47.349985 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/4bb5cfb3-cfb7-4408-bbe1-3bf59b44614e-kolla-config\") pod \"openstack-galera-0\" (UID: \"4bb5cfb3-cfb7-4408-bbe1-3bf59b44614e\") " pod="openstack/openstack-galera-0" Oct 09 14:04:47 crc kubenswrapper[4902]: I1009 14:04:47.351237 4902 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-galera-0\" (UID: \"4bb5cfb3-cfb7-4408-bbe1-3bf59b44614e\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/openstack-galera-0" Oct 09 14:04:47 crc kubenswrapper[4902]: I1009 14:04:47.351716 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/4bb5cfb3-cfb7-4408-bbe1-3bf59b44614e-config-data-generated\") pod \"openstack-galera-0\" (UID: \"4bb5cfb3-cfb7-4408-bbe1-3bf59b44614e\") " pod="openstack/openstack-galera-0" Oct 09 14:04:47 crc kubenswrapper[4902]: I1009 14:04:47.352544 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/4bb5cfb3-cfb7-4408-bbe1-3bf59b44614e-kolla-config\") pod \"openstack-galera-0\" (UID: \"4bb5cfb3-cfb7-4408-bbe1-3bf59b44614e\") " pod="openstack/openstack-galera-0" Oct 09 14:04:47 crc kubenswrapper[4902]: I1009 14:04:47.352743 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/4bb5cfb3-cfb7-4408-bbe1-3bf59b44614e-config-data-default\") pod \"openstack-galera-0\" (UID: \"4bb5cfb3-cfb7-4408-bbe1-3bf59b44614e\") " pod="openstack/openstack-galera-0" Oct 09 14:04:47 crc kubenswrapper[4902]: I1009 14:04:47.354838 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4bb5cfb3-cfb7-4408-bbe1-3bf59b44614e-operator-scripts\") pod \"openstack-galera-0\" (UID: \"4bb5cfb3-cfb7-4408-bbe1-3bf59b44614e\") " pod="openstack/openstack-galera-0" Oct 09 14:04:47 crc kubenswrapper[4902]: I1009 14:04:47.357347 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/4bb5cfb3-cfb7-4408-bbe1-3bf59b44614e-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"4bb5cfb3-cfb7-4408-bbe1-3bf59b44614e\") " pod="openstack/openstack-galera-0" Oct 09 14:04:47 crc kubenswrapper[4902]: I1009 14:04:47.358242 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/4bb5cfb3-cfb7-4408-bbe1-3bf59b44614e-secrets\") pod \"openstack-galera-0\" (UID: \"4bb5cfb3-cfb7-4408-bbe1-3bf59b44614e\") " pod="openstack/openstack-galera-0" Oct 09 14:04:47 crc kubenswrapper[4902]: I1009 14:04:47.360519 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/4bb5cfb3-cfb7-4408-bbe1-3bf59b44614e-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"4bb5cfb3-cfb7-4408-bbe1-3bf59b44614e\") " pod="openstack/openstack-galera-0" Oct 09 14:04:47 crc kubenswrapper[4902]: I1009 14:04:47.371159 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dlm5g\" (UniqueName: \"kubernetes.io/projected/4bb5cfb3-cfb7-4408-bbe1-3bf59b44614e-kube-api-access-dlm5g\") pod \"openstack-galera-0\" (UID: \"4bb5cfb3-cfb7-4408-bbe1-3bf59b44614e\") " pod="openstack/openstack-galera-0" Oct 09 14:04:47 crc kubenswrapper[4902]: I1009 14:04:47.376212 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-galera-0\" (UID: \"4bb5cfb3-cfb7-4408-bbe1-3bf59b44614e\") " pod="openstack/openstack-galera-0" Oct 09 14:04:47 crc kubenswrapper[4902]: I1009 14:04:47.451665 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/a4dec46d-073a-484c-ba80-0ff939025e48-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"a4dec46d-073a-484c-ba80-0ff939025e48\") " pod="openstack/openstack-cell1-galera-0" Oct 09 14:04:47 crc kubenswrapper[4902]: I1009 14:04:47.451721 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4dec46d-073a-484c-ba80-0ff939025e48-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"a4dec46d-073a-484c-ba80-0ff939025e48\") " pod="openstack/openstack-cell1-galera-0" Oct 09 14:04:47 crc kubenswrapper[4902]: I1009 14:04:47.451749 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4dec46d-073a-484c-ba80-0ff939025e48-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"a4dec46d-073a-484c-ba80-0ff939025e48\") " pod="openstack/openstack-cell1-galera-0" Oct 09 14:04:47 crc kubenswrapper[4902]: I1009 14:04:47.451770 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/a4dec46d-073a-484c-ba80-0ff939025e48-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"a4dec46d-073a-484c-ba80-0ff939025e48\") " pod="openstack/openstack-cell1-galera-0" Oct 09 14:04:47 crc kubenswrapper[4902]: I1009 14:04:47.451818 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-cell1-galera-0\" (UID: \"a4dec46d-073a-484c-ba80-0ff939025e48\") " pod="openstack/openstack-cell1-galera-0" Oct 09 14:04:47 crc kubenswrapper[4902]: I1009 14:04:47.451852 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7hx2n\" (UniqueName: \"kubernetes.io/projected/a4dec46d-073a-484c-ba80-0ff939025e48-kube-api-access-7hx2n\") pod \"openstack-cell1-galera-0\" (UID: \"a4dec46d-073a-484c-ba80-0ff939025e48\") " pod="openstack/openstack-cell1-galera-0" Oct 09 14:04:47 crc kubenswrapper[4902]: I1009 14:04:47.451868 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/a4dec46d-073a-484c-ba80-0ff939025e48-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"a4dec46d-073a-484c-ba80-0ff939025e48\") " 
pod="openstack/openstack-cell1-galera-0" Oct 09 14:04:47 crc kubenswrapper[4902]: I1009 14:04:47.451889 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/a4dec46d-073a-484c-ba80-0ff939025e48-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"a4dec46d-073a-484c-ba80-0ff939025e48\") " pod="openstack/openstack-cell1-galera-0" Oct 09 14:04:47 crc kubenswrapper[4902]: I1009 14:04:47.451907 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a4dec46d-073a-484c-ba80-0ff939025e48-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"a4dec46d-073a-484c-ba80-0ff939025e48\") " pod="openstack/openstack-cell1-galera-0" Oct 09 14:04:47 crc kubenswrapper[4902]: I1009 14:04:47.452620 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/a4dec46d-073a-484c-ba80-0ff939025e48-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"a4dec46d-073a-484c-ba80-0ff939025e48\") " pod="openstack/openstack-cell1-galera-0" Oct 09 14:04:47 crc kubenswrapper[4902]: I1009 14:04:47.452758 4902 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-cell1-galera-0\" (UID: \"a4dec46d-073a-484c-ba80-0ff939025e48\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/openstack-cell1-galera-0" Oct 09 14:04:47 crc kubenswrapper[4902]: I1009 14:04:47.453043 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a4dec46d-073a-484c-ba80-0ff939025e48-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"a4dec46d-073a-484c-ba80-0ff939025e48\") " pod="openstack/openstack-cell1-galera-0" Oct 09 14:04:47 crc kubenswrapper[4902]: I1009 14:04:47.453319 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/a4dec46d-073a-484c-ba80-0ff939025e48-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"a4dec46d-073a-484c-ba80-0ff939025e48\") " pod="openstack/openstack-cell1-galera-0" Oct 09 14:04:47 crc kubenswrapper[4902]: I1009 14:04:47.453576 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/a4dec46d-073a-484c-ba80-0ff939025e48-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"a4dec46d-073a-484c-ba80-0ff939025e48\") " pod="openstack/openstack-cell1-galera-0" Oct 09 14:04:47 crc kubenswrapper[4902]: I1009 14:04:47.456685 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/a4dec46d-073a-484c-ba80-0ff939025e48-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"a4dec46d-073a-484c-ba80-0ff939025e48\") " pod="openstack/openstack-cell1-galera-0" Oct 09 14:04:47 crc kubenswrapper[4902]: I1009 14:04:47.458724 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4dec46d-073a-484c-ba80-0ff939025e48-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"a4dec46d-073a-484c-ba80-0ff939025e48\") " pod="openstack/openstack-cell1-galera-0" Oct 09 14:04:47 crc kubenswrapper[4902]: I1009 14:04:47.460056 4902 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4dec46d-073a-484c-ba80-0ff939025e48-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"a4dec46d-073a-484c-ba80-0ff939025e48\") " pod="openstack/openstack-cell1-galera-0" Oct 09 14:04:47 crc kubenswrapper[4902]: I1009 14:04:47.462938 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Oct 09 14:04:47 crc kubenswrapper[4902]: I1009 14:04:47.481978 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7hx2n\" (UniqueName: \"kubernetes.io/projected/a4dec46d-073a-484c-ba80-0ff939025e48-kube-api-access-7hx2n\") pod \"openstack-cell1-galera-0\" (UID: \"a4dec46d-073a-484c-ba80-0ff939025e48\") " pod="openstack/openstack-cell1-galera-0" Oct 09 14:04:47 crc kubenswrapper[4902]: I1009 14:04:47.483599 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-cell1-galera-0\" (UID: \"a4dec46d-073a-484c-ba80-0ff939025e48\") " pod="openstack/openstack-cell1-galera-0" Oct 09 14:04:47 crc kubenswrapper[4902]: I1009 14:04:47.563582 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Oct 09 14:04:47 crc kubenswrapper[4902]: I1009 14:04:47.764401 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Oct 09 14:04:47 crc kubenswrapper[4902]: I1009 14:04:47.765693 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Oct 09 14:04:47 crc kubenswrapper[4902]: I1009 14:04:47.767989 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Oct 09 14:04:47 crc kubenswrapper[4902]: I1009 14:04:47.768208 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-kj4bx" Oct 09 14:04:47 crc kubenswrapper[4902]: I1009 14:04:47.768380 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Oct 09 14:04:47 crc kubenswrapper[4902]: I1009 14:04:47.787963 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Oct 09 14:04:47 crc kubenswrapper[4902]: I1009 14:04:47.860073 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/590e6023-7dbe-499f-a8ea-4b8c3e24f747-combined-ca-bundle\") pod \"memcached-0\" (UID: \"590e6023-7dbe-499f-a8ea-4b8c3e24f747\") " pod="openstack/memcached-0" Oct 09 14:04:47 crc kubenswrapper[4902]: I1009 14:04:47.860131 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/590e6023-7dbe-499f-a8ea-4b8c3e24f747-kolla-config\") pod \"memcached-0\" (UID: \"590e6023-7dbe-499f-a8ea-4b8c3e24f747\") " pod="openstack/memcached-0" Oct 09 14:04:47 crc kubenswrapper[4902]: I1009 14:04:47.860179 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/590e6023-7dbe-499f-a8ea-4b8c3e24f747-config-data\") pod \"memcached-0\" (UID: \"590e6023-7dbe-499f-a8ea-4b8c3e24f747\") " pod="openstack/memcached-0" Oct 09 14:04:47 crc kubenswrapper[4902]: I1009 14:04:47.860236 4902 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h6xht\" (UniqueName: \"kubernetes.io/projected/590e6023-7dbe-499f-a8ea-4b8c3e24f747-kube-api-access-h6xht\") pod \"memcached-0\" (UID: \"590e6023-7dbe-499f-a8ea-4b8c3e24f747\") " pod="openstack/memcached-0" Oct 09 14:04:47 crc kubenswrapper[4902]: I1009 14:04:47.860271 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/590e6023-7dbe-499f-a8ea-4b8c3e24f747-memcached-tls-certs\") pod \"memcached-0\" (UID: \"590e6023-7dbe-499f-a8ea-4b8c3e24f747\") " pod="openstack/memcached-0" Oct 09 14:04:47 crc kubenswrapper[4902]: I1009 14:04:47.961547 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/590e6023-7dbe-499f-a8ea-4b8c3e24f747-combined-ca-bundle\") pod \"memcached-0\" (UID: \"590e6023-7dbe-499f-a8ea-4b8c3e24f747\") " pod="openstack/memcached-0" Oct 09 14:04:47 crc kubenswrapper[4902]: I1009 14:04:47.961595 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/590e6023-7dbe-499f-a8ea-4b8c3e24f747-kolla-config\") pod \"memcached-0\" (UID: \"590e6023-7dbe-499f-a8ea-4b8c3e24f747\") " pod="openstack/memcached-0" Oct 09 14:04:47 crc kubenswrapper[4902]: I1009 14:04:47.961646 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/590e6023-7dbe-499f-a8ea-4b8c3e24f747-config-data\") pod \"memcached-0\" (UID: \"590e6023-7dbe-499f-a8ea-4b8c3e24f747\") " pod="openstack/memcached-0" Oct 09 14:04:47 crc kubenswrapper[4902]: I1009 14:04:47.961695 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h6xht\" (UniqueName: \"kubernetes.io/projected/590e6023-7dbe-499f-a8ea-4b8c3e24f747-kube-api-access-h6xht\") pod \"memcached-0\" (UID: \"590e6023-7dbe-499f-a8ea-4b8c3e24f747\") " pod="openstack/memcached-0" Oct 09 14:04:47 crc kubenswrapper[4902]: I1009 14:04:47.961730 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/590e6023-7dbe-499f-a8ea-4b8c3e24f747-memcached-tls-certs\") pod \"memcached-0\" (UID: \"590e6023-7dbe-499f-a8ea-4b8c3e24f747\") " pod="openstack/memcached-0" Oct 09 14:04:47 crc kubenswrapper[4902]: I1009 14:04:47.963226 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/590e6023-7dbe-499f-a8ea-4b8c3e24f747-config-data\") pod \"memcached-0\" (UID: \"590e6023-7dbe-499f-a8ea-4b8c3e24f747\") " pod="openstack/memcached-0" Oct 09 14:04:47 crc kubenswrapper[4902]: I1009 14:04:47.964039 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/590e6023-7dbe-499f-a8ea-4b8c3e24f747-kolla-config\") pod \"memcached-0\" (UID: \"590e6023-7dbe-499f-a8ea-4b8c3e24f747\") " pod="openstack/memcached-0" Oct 09 14:04:47 crc kubenswrapper[4902]: I1009 14:04:47.971709 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/590e6023-7dbe-499f-a8ea-4b8c3e24f747-memcached-tls-certs\") pod \"memcached-0\" (UID: \"590e6023-7dbe-499f-a8ea-4b8c3e24f747\") " pod="openstack/memcached-0" Oct 09 14:04:47 crc 
kubenswrapper[4902]: I1009 14:04:47.972330 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/590e6023-7dbe-499f-a8ea-4b8c3e24f747-combined-ca-bundle\") pod \"memcached-0\" (UID: \"590e6023-7dbe-499f-a8ea-4b8c3e24f747\") " pod="openstack/memcached-0" Oct 09 14:04:47 crc kubenswrapper[4902]: I1009 14:04:47.982688 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h6xht\" (UniqueName: \"kubernetes.io/projected/590e6023-7dbe-499f-a8ea-4b8c3e24f747-kube-api-access-h6xht\") pod \"memcached-0\" (UID: \"590e6023-7dbe-499f-a8ea-4b8c3e24f747\") " pod="openstack/memcached-0" Oct 09 14:04:48 crc kubenswrapper[4902]: I1009 14:04:48.097568 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Oct 09 14:04:49 crc kubenswrapper[4902]: I1009 14:04:49.575148 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Oct 09 14:04:49 crc kubenswrapper[4902]: I1009 14:04:49.577287 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 09 14:04:49 crc kubenswrapper[4902]: I1009 14:04:49.580190 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-5cgqh" Oct 09 14:04:49 crc kubenswrapper[4902]: I1009 14:04:49.584678 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 09 14:04:49 crc kubenswrapper[4902]: I1009 14:04:49.694818 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2rjwv\" (UniqueName: \"kubernetes.io/projected/9dc74bb1-5a59-4876-83ff-09c1a2cb6042-kube-api-access-2rjwv\") pod \"kube-state-metrics-0\" (UID: \"9dc74bb1-5a59-4876-83ff-09c1a2cb6042\") " pod="openstack/kube-state-metrics-0" Oct 09 14:04:49 crc kubenswrapper[4902]: I1009 14:04:49.796916 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2rjwv\" (UniqueName: \"kubernetes.io/projected/9dc74bb1-5a59-4876-83ff-09c1a2cb6042-kube-api-access-2rjwv\") pod \"kube-state-metrics-0\" (UID: \"9dc74bb1-5a59-4876-83ff-09c1a2cb6042\") " pod="openstack/kube-state-metrics-0" Oct 09 14:04:49 crc kubenswrapper[4902]: I1009 14:04:49.817985 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2rjwv\" (UniqueName: \"kubernetes.io/projected/9dc74bb1-5a59-4876-83ff-09c1a2cb6042-kube-api-access-2rjwv\") pod \"kube-state-metrics-0\" (UID: \"9dc74bb1-5a59-4876-83ff-09c1a2cb6042\") " pod="openstack/kube-state-metrics-0" Oct 09 14:04:49 crc kubenswrapper[4902]: I1009 14:04:49.912075 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 09 14:04:50 crc kubenswrapper[4902]: I1009 14:04:50.512329 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 09 14:04:53 crc kubenswrapper[4902]: I1009 14:04:53.420784 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-79djm"] Oct 09 14:04:53 crc kubenswrapper[4902]: I1009 14:04:53.422017 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-79djm" Oct 09 14:04:53 crc kubenswrapper[4902]: I1009 14:04:53.424436 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Oct 09 14:04:53 crc kubenswrapper[4902]: I1009 14:04:53.424917 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Oct 09 14:04:53 crc kubenswrapper[4902]: I1009 14:04:53.427806 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-glxf8" Oct 09 14:04:53 crc kubenswrapper[4902]: I1009 14:04:53.436469 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-79djm"] Oct 09 14:04:53 crc kubenswrapper[4902]: I1009 14:04:53.480159 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-rlj4x"] Oct 09 14:04:53 crc kubenswrapper[4902]: I1009 14:04:53.482208 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-rlj4x" Oct 09 14:04:53 crc kubenswrapper[4902]: I1009 14:04:53.496052 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-rlj4x"] Oct 09 14:04:53 crc kubenswrapper[4902]: I1009 14:04:53.574547 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cbhkk\" (UniqueName: \"kubernetes.io/projected/007b48e7-2e7a-45e6-bc70-1c86a275d808-kube-api-access-cbhkk\") pod \"ovn-controller-ovs-rlj4x\" (UID: \"007b48e7-2e7a-45e6-bc70-1c86a275d808\") " pod="openstack/ovn-controller-ovs-rlj4x" Oct 09 14:04:53 crc kubenswrapper[4902]: I1009 14:04:53.574672 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f0722a8-eee2-4bb1-a3b4-d14964d35227-combined-ca-bundle\") pod \"ovn-controller-79djm\" (UID: \"7f0722a8-eee2-4bb1-a3b4-d14964d35227\") " pod="openstack/ovn-controller-79djm" Oct 09 14:04:53 crc kubenswrapper[4902]: I1009 14:04:53.574731 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/7f0722a8-eee2-4bb1-a3b4-d14964d35227-var-log-ovn\") pod \"ovn-controller-79djm\" (UID: \"7f0722a8-eee2-4bb1-a3b4-d14964d35227\") " pod="openstack/ovn-controller-79djm" Oct 09 14:04:53 crc kubenswrapper[4902]: I1009 14:04:53.574767 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/007b48e7-2e7a-45e6-bc70-1c86a275d808-var-lib\") pod \"ovn-controller-ovs-rlj4x\" (UID: \"007b48e7-2e7a-45e6-bc70-1c86a275d808\") " pod="openstack/ovn-controller-ovs-rlj4x" Oct 09 14:04:53 crc kubenswrapper[4902]: I1009 14:04:53.574890 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/007b48e7-2e7a-45e6-bc70-1c86a275d808-etc-ovs\") pod \"ovn-controller-ovs-rlj4x\" (UID: \"007b48e7-2e7a-45e6-bc70-1c86a275d808\") " pod="openstack/ovn-controller-ovs-rlj4x" Oct 09 14:04:53 crc kubenswrapper[4902]: I1009 14:04:53.574927 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/7f0722a8-eee2-4bb1-a3b4-d14964d35227-var-run\") pod \"ovn-controller-79djm\" (UID: \"7f0722a8-eee2-4bb1-a3b4-d14964d35227\") " 
pod="openstack/ovn-controller-79djm" Oct 09 14:04:53 crc kubenswrapper[4902]: I1009 14:04:53.574951 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/7f0722a8-eee2-4bb1-a3b4-d14964d35227-ovn-controller-tls-certs\") pod \"ovn-controller-79djm\" (UID: \"7f0722a8-eee2-4bb1-a3b4-d14964d35227\") " pod="openstack/ovn-controller-79djm" Oct 09 14:04:53 crc kubenswrapper[4902]: I1009 14:04:53.575028 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/007b48e7-2e7a-45e6-bc70-1c86a275d808-var-log\") pod \"ovn-controller-ovs-rlj4x\" (UID: \"007b48e7-2e7a-45e6-bc70-1c86a275d808\") " pod="openstack/ovn-controller-ovs-rlj4x" Oct 09 14:04:53 crc kubenswrapper[4902]: I1009 14:04:53.575070 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/007b48e7-2e7a-45e6-bc70-1c86a275d808-var-run\") pod \"ovn-controller-ovs-rlj4x\" (UID: \"007b48e7-2e7a-45e6-bc70-1c86a275d808\") " pod="openstack/ovn-controller-ovs-rlj4x" Oct 09 14:04:53 crc kubenswrapper[4902]: I1009 14:04:53.575131 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/007b48e7-2e7a-45e6-bc70-1c86a275d808-scripts\") pod \"ovn-controller-ovs-rlj4x\" (UID: \"007b48e7-2e7a-45e6-bc70-1c86a275d808\") " pod="openstack/ovn-controller-ovs-rlj4x" Oct 09 14:04:53 crc kubenswrapper[4902]: I1009 14:04:53.575154 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tzcp9\" (UniqueName: \"kubernetes.io/projected/7f0722a8-eee2-4bb1-a3b4-d14964d35227-kube-api-access-tzcp9\") pod \"ovn-controller-79djm\" (UID: \"7f0722a8-eee2-4bb1-a3b4-d14964d35227\") " pod="openstack/ovn-controller-79djm" Oct 09 14:04:53 crc kubenswrapper[4902]: I1009 14:04:53.575199 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7f0722a8-eee2-4bb1-a3b4-d14964d35227-scripts\") pod \"ovn-controller-79djm\" (UID: \"7f0722a8-eee2-4bb1-a3b4-d14964d35227\") " pod="openstack/ovn-controller-79djm" Oct 09 14:04:53 crc kubenswrapper[4902]: I1009 14:04:53.575256 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/7f0722a8-eee2-4bb1-a3b4-d14964d35227-var-run-ovn\") pod \"ovn-controller-79djm\" (UID: \"7f0722a8-eee2-4bb1-a3b4-d14964d35227\") " pod="openstack/ovn-controller-79djm" Oct 09 14:04:53 crc kubenswrapper[4902]: I1009 14:04:53.676850 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/007b48e7-2e7a-45e6-bc70-1c86a275d808-var-log\") pod \"ovn-controller-ovs-rlj4x\" (UID: \"007b48e7-2e7a-45e6-bc70-1c86a275d808\") " pod="openstack/ovn-controller-ovs-rlj4x" Oct 09 14:04:53 crc kubenswrapper[4902]: I1009 14:04:53.676921 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/007b48e7-2e7a-45e6-bc70-1c86a275d808-var-run\") pod \"ovn-controller-ovs-rlj4x\" (UID: \"007b48e7-2e7a-45e6-bc70-1c86a275d808\") " pod="openstack/ovn-controller-ovs-rlj4x" Oct 09 14:04:53 crc kubenswrapper[4902]: I1009 
14:04:53.676979 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tzcp9\" (UniqueName: \"kubernetes.io/projected/7f0722a8-eee2-4bb1-a3b4-d14964d35227-kube-api-access-tzcp9\") pod \"ovn-controller-79djm\" (UID: \"7f0722a8-eee2-4bb1-a3b4-d14964d35227\") " pod="openstack/ovn-controller-79djm" Oct 09 14:04:53 crc kubenswrapper[4902]: I1009 14:04:53.676997 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/007b48e7-2e7a-45e6-bc70-1c86a275d808-scripts\") pod \"ovn-controller-ovs-rlj4x\" (UID: \"007b48e7-2e7a-45e6-bc70-1c86a275d808\") " pod="openstack/ovn-controller-ovs-rlj4x" Oct 09 14:04:53 crc kubenswrapper[4902]: I1009 14:04:53.677027 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7f0722a8-eee2-4bb1-a3b4-d14964d35227-scripts\") pod \"ovn-controller-79djm\" (UID: \"7f0722a8-eee2-4bb1-a3b4-d14964d35227\") " pod="openstack/ovn-controller-79djm" Oct 09 14:04:53 crc kubenswrapper[4902]: I1009 14:04:53.677065 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/7f0722a8-eee2-4bb1-a3b4-d14964d35227-var-run-ovn\") pod \"ovn-controller-79djm\" (UID: \"7f0722a8-eee2-4bb1-a3b4-d14964d35227\") " pod="openstack/ovn-controller-79djm" Oct 09 14:04:53 crc kubenswrapper[4902]: I1009 14:04:53.677086 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cbhkk\" (UniqueName: \"kubernetes.io/projected/007b48e7-2e7a-45e6-bc70-1c86a275d808-kube-api-access-cbhkk\") pod \"ovn-controller-ovs-rlj4x\" (UID: \"007b48e7-2e7a-45e6-bc70-1c86a275d808\") " pod="openstack/ovn-controller-ovs-rlj4x" Oct 09 14:04:53 crc kubenswrapper[4902]: I1009 14:04:53.677106 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f0722a8-eee2-4bb1-a3b4-d14964d35227-combined-ca-bundle\") pod \"ovn-controller-79djm\" (UID: \"7f0722a8-eee2-4bb1-a3b4-d14964d35227\") " pod="openstack/ovn-controller-79djm" Oct 09 14:04:53 crc kubenswrapper[4902]: I1009 14:04:53.677284 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/7f0722a8-eee2-4bb1-a3b4-d14964d35227-var-log-ovn\") pod \"ovn-controller-79djm\" (UID: \"7f0722a8-eee2-4bb1-a3b4-d14964d35227\") " pod="openstack/ovn-controller-79djm" Oct 09 14:04:53 crc kubenswrapper[4902]: I1009 14:04:53.677307 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/007b48e7-2e7a-45e6-bc70-1c86a275d808-var-lib\") pod \"ovn-controller-ovs-rlj4x\" (UID: \"007b48e7-2e7a-45e6-bc70-1c86a275d808\") " pod="openstack/ovn-controller-ovs-rlj4x" Oct 09 14:04:53 crc kubenswrapper[4902]: I1009 14:04:53.677381 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/007b48e7-2e7a-45e6-bc70-1c86a275d808-etc-ovs\") pod \"ovn-controller-ovs-rlj4x\" (UID: \"007b48e7-2e7a-45e6-bc70-1c86a275d808\") " pod="openstack/ovn-controller-ovs-rlj4x" Oct 09 14:04:53 crc kubenswrapper[4902]: I1009 14:04:53.677422 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/7f0722a8-eee2-4bb1-a3b4-d14964d35227-var-run\") pod 
\"ovn-controller-79djm\" (UID: \"7f0722a8-eee2-4bb1-a3b4-d14964d35227\") " pod="openstack/ovn-controller-79djm" Oct 09 14:04:53 crc kubenswrapper[4902]: I1009 14:04:53.677444 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/7f0722a8-eee2-4bb1-a3b4-d14964d35227-ovn-controller-tls-certs\") pod \"ovn-controller-79djm\" (UID: \"7f0722a8-eee2-4bb1-a3b4-d14964d35227\") " pod="openstack/ovn-controller-79djm" Oct 09 14:04:53 crc kubenswrapper[4902]: I1009 14:04:53.678367 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/007b48e7-2e7a-45e6-bc70-1c86a275d808-var-lib\") pod \"ovn-controller-ovs-rlj4x\" (UID: \"007b48e7-2e7a-45e6-bc70-1c86a275d808\") " pod="openstack/ovn-controller-ovs-rlj4x" Oct 09 14:04:53 crc kubenswrapper[4902]: I1009 14:04:53.678387 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/7f0722a8-eee2-4bb1-a3b4-d14964d35227-var-log-ovn\") pod \"ovn-controller-79djm\" (UID: \"7f0722a8-eee2-4bb1-a3b4-d14964d35227\") " pod="openstack/ovn-controller-79djm" Oct 09 14:04:53 crc kubenswrapper[4902]: I1009 14:04:53.678552 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/007b48e7-2e7a-45e6-bc70-1c86a275d808-etc-ovs\") pod \"ovn-controller-ovs-rlj4x\" (UID: \"007b48e7-2e7a-45e6-bc70-1c86a275d808\") " pod="openstack/ovn-controller-ovs-rlj4x" Oct 09 14:04:53 crc kubenswrapper[4902]: I1009 14:04:53.678884 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/007b48e7-2e7a-45e6-bc70-1c86a275d808-var-log\") pod \"ovn-controller-ovs-rlj4x\" (UID: \"007b48e7-2e7a-45e6-bc70-1c86a275d808\") " pod="openstack/ovn-controller-ovs-rlj4x" Oct 09 14:04:53 crc kubenswrapper[4902]: I1009 14:04:53.678999 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/007b48e7-2e7a-45e6-bc70-1c86a275d808-var-run\") pod \"ovn-controller-ovs-rlj4x\" (UID: \"007b48e7-2e7a-45e6-bc70-1c86a275d808\") " pod="openstack/ovn-controller-ovs-rlj4x" Oct 09 14:04:53 crc kubenswrapper[4902]: I1009 14:04:53.679093 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/7f0722a8-eee2-4bb1-a3b4-d14964d35227-var-run-ovn\") pod \"ovn-controller-79djm\" (UID: \"7f0722a8-eee2-4bb1-a3b4-d14964d35227\") " pod="openstack/ovn-controller-79djm" Oct 09 14:04:53 crc kubenswrapper[4902]: I1009 14:04:53.679099 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/7f0722a8-eee2-4bb1-a3b4-d14964d35227-var-run\") pod \"ovn-controller-79djm\" (UID: \"7f0722a8-eee2-4bb1-a3b4-d14964d35227\") " pod="openstack/ovn-controller-79djm" Oct 09 14:04:53 crc kubenswrapper[4902]: I1009 14:04:53.679224 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7f0722a8-eee2-4bb1-a3b4-d14964d35227-scripts\") pod \"ovn-controller-79djm\" (UID: \"7f0722a8-eee2-4bb1-a3b4-d14964d35227\") " pod="openstack/ovn-controller-79djm" Oct 09 14:04:53 crc kubenswrapper[4902]: I1009 14:04:53.679660 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/007b48e7-2e7a-45e6-bc70-1c86a275d808-scripts\") pod \"ovn-controller-ovs-rlj4x\" (UID: \"007b48e7-2e7a-45e6-bc70-1c86a275d808\") " pod="openstack/ovn-controller-ovs-rlj4x" Oct 09 14:04:53 crc kubenswrapper[4902]: I1009 14:04:53.683953 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f0722a8-eee2-4bb1-a3b4-d14964d35227-combined-ca-bundle\") pod \"ovn-controller-79djm\" (UID: \"7f0722a8-eee2-4bb1-a3b4-d14964d35227\") " pod="openstack/ovn-controller-79djm" Oct 09 14:04:53 crc kubenswrapper[4902]: I1009 14:04:53.694177 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/7f0722a8-eee2-4bb1-a3b4-d14964d35227-ovn-controller-tls-certs\") pod \"ovn-controller-79djm\" (UID: \"7f0722a8-eee2-4bb1-a3b4-d14964d35227\") " pod="openstack/ovn-controller-79djm" Oct 09 14:04:53 crc kubenswrapper[4902]: I1009 14:04:53.696661 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tzcp9\" (UniqueName: \"kubernetes.io/projected/7f0722a8-eee2-4bb1-a3b4-d14964d35227-kube-api-access-tzcp9\") pod \"ovn-controller-79djm\" (UID: \"7f0722a8-eee2-4bb1-a3b4-d14964d35227\") " pod="openstack/ovn-controller-79djm" Oct 09 14:04:53 crc kubenswrapper[4902]: I1009 14:04:53.697195 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cbhkk\" (UniqueName: \"kubernetes.io/projected/007b48e7-2e7a-45e6-bc70-1c86a275d808-kube-api-access-cbhkk\") pod \"ovn-controller-ovs-rlj4x\" (UID: \"007b48e7-2e7a-45e6-bc70-1c86a275d808\") " pod="openstack/ovn-controller-ovs-rlj4x" Oct 09 14:04:53 crc kubenswrapper[4902]: I1009 14:04:53.754012 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-79djm" Oct 09 14:04:53 crc kubenswrapper[4902]: I1009 14:04:53.780060 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 09 14:04:53 crc kubenswrapper[4902]: I1009 14:04:53.781759 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Oct 09 14:04:53 crc kubenswrapper[4902]: I1009 14:04:53.785177 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Oct 09 14:04:53 crc kubenswrapper[4902]: I1009 14:04:53.785214 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Oct 09 14:04:53 crc kubenswrapper[4902]: I1009 14:04:53.785234 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Oct 09 14:04:53 crc kubenswrapper[4902]: I1009 14:04:53.785244 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Oct 09 14:04:53 crc kubenswrapper[4902]: I1009 14:04:53.788095 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 09 14:04:53 crc kubenswrapper[4902]: I1009 14:04:53.793234 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-hcjrb" Oct 09 14:04:53 crc kubenswrapper[4902]: I1009 14:04:53.807904 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-rlj4x" Oct 09 14:04:53 crc kubenswrapper[4902]: I1009 14:04:53.880387 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/ea130bce-ed3c-495f-b06b-14278e3133ca-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"ea130bce-ed3c-495f-b06b-14278e3133ca\") " pod="openstack/ovsdbserver-nb-0" Oct 09 14:04:53 crc kubenswrapper[4902]: I1009 14:04:53.880458 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-nb-0\" (UID: \"ea130bce-ed3c-495f-b06b-14278e3133ca\") " pod="openstack/ovsdbserver-nb-0" Oct 09 14:04:53 crc kubenswrapper[4902]: I1009 14:04:53.880495 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ea130bce-ed3c-495f-b06b-14278e3133ca-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"ea130bce-ed3c-495f-b06b-14278e3133ca\") " pod="openstack/ovsdbserver-nb-0" Oct 09 14:04:53 crc kubenswrapper[4902]: I1009 14:04:53.880518 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z2pm7\" (UniqueName: \"kubernetes.io/projected/ea130bce-ed3c-495f-b06b-14278e3133ca-kube-api-access-z2pm7\") pod \"ovsdbserver-nb-0\" (UID: \"ea130bce-ed3c-495f-b06b-14278e3133ca\") " pod="openstack/ovsdbserver-nb-0" Oct 09 14:04:53 crc kubenswrapper[4902]: I1009 14:04:53.880545 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ea130bce-ed3c-495f-b06b-14278e3133ca-config\") pod \"ovsdbserver-nb-0\" (UID: \"ea130bce-ed3c-495f-b06b-14278e3133ca\") " pod="openstack/ovsdbserver-nb-0" Oct 09 14:04:53 crc kubenswrapper[4902]: I1009 14:04:53.880575 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/ea130bce-ed3c-495f-b06b-14278e3133ca-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"ea130bce-ed3c-495f-b06b-14278e3133ca\") " pod="openstack/ovsdbserver-nb-0" Oct 09 14:04:53 crc kubenswrapper[4902]: I1009 14:04:53.880602 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ea130bce-ed3c-495f-b06b-14278e3133ca-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"ea130bce-ed3c-495f-b06b-14278e3133ca\") " pod="openstack/ovsdbserver-nb-0" Oct 09 14:04:53 crc kubenswrapper[4902]: I1009 14:04:53.880631 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea130bce-ed3c-495f-b06b-14278e3133ca-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"ea130bce-ed3c-495f-b06b-14278e3133ca\") " pod="openstack/ovsdbserver-nb-0" Oct 09 14:04:53 crc kubenswrapper[4902]: I1009 14:04:53.982119 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-nb-0\" (UID: \"ea130bce-ed3c-495f-b06b-14278e3133ca\") " pod="openstack/ovsdbserver-nb-0" Oct 09 14:04:53 crc kubenswrapper[4902]: I1009 14:04:53.982167 4902 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/ea130bce-ed3c-495f-b06b-14278e3133ca-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"ea130bce-ed3c-495f-b06b-14278e3133ca\") " pod="openstack/ovsdbserver-nb-0" Oct 09 14:04:53 crc kubenswrapper[4902]: I1009 14:04:53.982208 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ea130bce-ed3c-495f-b06b-14278e3133ca-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"ea130bce-ed3c-495f-b06b-14278e3133ca\") " pod="openstack/ovsdbserver-nb-0" Oct 09 14:04:53 crc kubenswrapper[4902]: I1009 14:04:53.982240 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z2pm7\" (UniqueName: \"kubernetes.io/projected/ea130bce-ed3c-495f-b06b-14278e3133ca-kube-api-access-z2pm7\") pod \"ovsdbserver-nb-0\" (UID: \"ea130bce-ed3c-495f-b06b-14278e3133ca\") " pod="openstack/ovsdbserver-nb-0" Oct 09 14:04:53 crc kubenswrapper[4902]: I1009 14:04:53.982270 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ea130bce-ed3c-495f-b06b-14278e3133ca-config\") pod \"ovsdbserver-nb-0\" (UID: \"ea130bce-ed3c-495f-b06b-14278e3133ca\") " pod="openstack/ovsdbserver-nb-0" Oct 09 14:04:53 crc kubenswrapper[4902]: I1009 14:04:53.982302 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/ea130bce-ed3c-495f-b06b-14278e3133ca-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"ea130bce-ed3c-495f-b06b-14278e3133ca\") " pod="openstack/ovsdbserver-nb-0" Oct 09 14:04:53 crc kubenswrapper[4902]: I1009 14:04:53.982331 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ea130bce-ed3c-495f-b06b-14278e3133ca-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"ea130bce-ed3c-495f-b06b-14278e3133ca\") " pod="openstack/ovsdbserver-nb-0" Oct 09 14:04:53 crc kubenswrapper[4902]: I1009 14:04:53.982363 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea130bce-ed3c-495f-b06b-14278e3133ca-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"ea130bce-ed3c-495f-b06b-14278e3133ca\") " pod="openstack/ovsdbserver-nb-0" Oct 09 14:04:53 crc kubenswrapper[4902]: I1009 14:04:53.982596 4902 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-nb-0\" (UID: \"ea130bce-ed3c-495f-b06b-14278e3133ca\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/ovsdbserver-nb-0" Oct 09 14:04:53 crc kubenswrapper[4902]: I1009 14:04:53.983345 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/ea130bce-ed3c-495f-b06b-14278e3133ca-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"ea130bce-ed3c-495f-b06b-14278e3133ca\") " pod="openstack/ovsdbserver-nb-0" Oct 09 14:04:53 crc kubenswrapper[4902]: I1009 14:04:53.983912 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ea130bce-ed3c-495f-b06b-14278e3133ca-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"ea130bce-ed3c-495f-b06b-14278e3133ca\") " 
pod="openstack/ovsdbserver-nb-0" Oct 09 14:04:53 crc kubenswrapper[4902]: I1009 14:04:53.986046 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ea130bce-ed3c-495f-b06b-14278e3133ca-config\") pod \"ovsdbserver-nb-0\" (UID: \"ea130bce-ed3c-495f-b06b-14278e3133ca\") " pod="openstack/ovsdbserver-nb-0" Oct 09 14:04:53 crc kubenswrapper[4902]: I1009 14:04:53.989616 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/ea130bce-ed3c-495f-b06b-14278e3133ca-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"ea130bce-ed3c-495f-b06b-14278e3133ca\") " pod="openstack/ovsdbserver-nb-0" Oct 09 14:04:53 crc kubenswrapper[4902]: I1009 14:04:53.990269 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ea130bce-ed3c-495f-b06b-14278e3133ca-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"ea130bce-ed3c-495f-b06b-14278e3133ca\") " pod="openstack/ovsdbserver-nb-0" Oct 09 14:04:54 crc kubenswrapper[4902]: I1009 14:04:54.000357 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea130bce-ed3c-495f-b06b-14278e3133ca-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"ea130bce-ed3c-495f-b06b-14278e3133ca\") " pod="openstack/ovsdbserver-nb-0" Oct 09 14:04:54 crc kubenswrapper[4902]: I1009 14:04:54.000848 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z2pm7\" (UniqueName: \"kubernetes.io/projected/ea130bce-ed3c-495f-b06b-14278e3133ca-kube-api-access-z2pm7\") pod \"ovsdbserver-nb-0\" (UID: \"ea130bce-ed3c-495f-b06b-14278e3133ca\") " pod="openstack/ovsdbserver-nb-0" Oct 09 14:04:54 crc kubenswrapper[4902]: I1009 14:04:54.016825 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-nb-0\" (UID: \"ea130bce-ed3c-495f-b06b-14278e3133ca\") " pod="openstack/ovsdbserver-nb-0" Oct 09 14:04:54 crc kubenswrapper[4902]: I1009 14:04:54.111500 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Oct 09 14:04:55 crc kubenswrapper[4902]: I1009 14:04:55.105952 4902 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 09 14:04:55 crc kubenswrapper[4902]: I1009 14:04:55.854491 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"5cdfacc8-b636-448a-bdc9-30b7a851aa8f","Type":"ContainerStarted","Data":"88b16d0cb0aa583376248d4f0c7c1364f71053d6d2971266163d1819d0305376"} Oct 09 14:04:56 crc kubenswrapper[4902]: E1009 14:04:56.039582 4902 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Oct 09 14:04:56 crc kubenswrapper[4902]: E1009 14:04:56.039788 4902 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ttskw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-vhp5f_openstack(8c959e86-9eec-4c81-aae8-b53430ce2695): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 09 14:04:56 crc kubenswrapper[4902]: E1009 14:04:56.041158 4902 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-vhp5f" 
podUID="8c959e86-9eec-4c81-aae8-b53430ce2695" Oct 09 14:04:56 crc kubenswrapper[4902]: E1009 14:04:56.065347 4902 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Oct 09 14:04:56 crc kubenswrapper[4902]: E1009 14:04:56.065855 4902 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ptd9l,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-89v2s_openstack(c44929fa-d053-4d3a-8c52-b4ec6e983a96): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 09 14:04:56 crc kubenswrapper[4902]: E1009 14:04:56.067092 4902 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-675f4bcbfc-89v2s" podUID="c44929fa-d053-4d3a-8c52-b4ec6e983a96" Oct 09 14:04:56 crc kubenswrapper[4902]: I1009 14:04:56.485840 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Oct 09 14:04:56 crc kubenswrapper[4902]: W1009 14:04:56.494449 4902 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod590e6023_7dbe_499f_a8ea_4b8c3e24f747.slice/crio-9881f731c9f5ec0a7d31b7e88eaf101df3c51e3cf676a1472abcb2ca4c80117a WatchSource:0}: Error finding container 9881f731c9f5ec0a7d31b7e88eaf101df3c51e3cf676a1472abcb2ca4c80117a: Status 404 returned error can't find the container with id 
9881f731c9f5ec0a7d31b7e88eaf101df3c51e3cf676a1472abcb2ca4c80117a Oct 09 14:04:56 crc kubenswrapper[4902]: I1009 14:04:56.517371 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-rgtzl"] Oct 09 14:04:56 crc kubenswrapper[4902]: W1009 14:04:56.523978 4902 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod75f07b5d_6f2c_473f_ae50_725bcdf084af.slice/crio-f91533ce29f955c7f932d405082cade5284aad50093ec1c5ef7132353298d937 WatchSource:0}: Error finding container f91533ce29f955c7f932d405082cade5284aad50093ec1c5ef7132353298d937: Status 404 returned error can't find the container with id f91533ce29f955c7f932d405082cade5284aad50093ec1c5ef7132353298d937 Oct 09 14:04:56 crc kubenswrapper[4902]: I1009 14:04:56.672610 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 09 14:04:56 crc kubenswrapper[4902]: I1009 14:04:56.687321 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 09 14:04:56 crc kubenswrapper[4902]: I1009 14:04:56.696394 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Oct 09 14:04:56 crc kubenswrapper[4902]: W1009 14:04:56.697917 4902 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc9c6af38_1605_4d47_bc0c_967053235667.slice/crio-af4b54366c36b58e0338fb3bc8fbd62ae03622c593f03b9533fe5b96162932c4 WatchSource:0}: Error finding container af4b54366c36b58e0338fb3bc8fbd62ae03622c593f03b9533fe5b96162932c4: Status 404 returned error can't find the container with id af4b54366c36b58e0338fb3bc8fbd62ae03622c593f03b9533fe5b96162932c4 Oct 09 14:04:56 crc kubenswrapper[4902]: I1009 14:04:56.868569 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"590e6023-7dbe-499f-a8ea-4b8c3e24f747","Type":"ContainerStarted","Data":"9881f731c9f5ec0a7d31b7e88eaf101df3c51e3cf676a1472abcb2ca4c80117a"} Oct 09 14:04:56 crc kubenswrapper[4902]: I1009 14:04:56.877653 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"a4dec46d-073a-484c-ba80-0ff939025e48","Type":"ContainerStarted","Data":"aff601d70f8fdd66404efa749d0a19ccdf336dcc160ad9df81845eb219453b8a"} Oct 09 14:04:56 crc kubenswrapper[4902]: I1009 14:04:56.880080 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-rgtzl" event={"ID":"75f07b5d-6f2c-473f-ae50-725bcdf084af","Type":"ContainerStarted","Data":"f91533ce29f955c7f932d405082cade5284aad50093ec1c5ef7132353298d937"} Oct 09 14:04:56 crc kubenswrapper[4902]: I1009 14:04:56.882185 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"c9c6af38-1605-4d47-bc0c-967053235667","Type":"ContainerStarted","Data":"af4b54366c36b58e0338fb3bc8fbd62ae03622c593f03b9533fe5b96162932c4"} Oct 09 14:04:56 crc kubenswrapper[4902]: I1009 14:04:56.884448 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 09 14:04:56 crc kubenswrapper[4902]: I1009 14:04:56.887027 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"4bb5cfb3-cfb7-4408-bbe1-3bf59b44614e","Type":"ContainerStarted","Data":"171f48d69baa6e4dc98ec15ebf877617f1705d34cfabcb5b4791dd3759f1560b"} Oct 09 14:04:56 crc kubenswrapper[4902]: I1009 14:04:56.891382 4902 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-vtncl"] Oct 09 14:04:56 crc kubenswrapper[4902]: W1009 14:04:56.897135 4902 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9dc74bb1_5a59_4876_83ff_09c1a2cb6042.slice/crio-9b6277c3f0e3fe9b0e97ab81a0a7b2a936669693767fae17d4b50dc426089721 WatchSource:0}: Error finding container 9b6277c3f0e3fe9b0e97ab81a0a7b2a936669693767fae17d4b50dc426089721: Status 404 returned error can't find the container with id 9b6277c3f0e3fe9b0e97ab81a0a7b2a936669693767fae17d4b50dc426089721 Oct 09 14:04:56 crc kubenswrapper[4902]: I1009 14:04:56.899841 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-79djm"] Oct 09 14:04:56 crc kubenswrapper[4902]: W1009 14:04:56.899853 4902 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7f0722a8_eee2_4bb1_a3b4_d14964d35227.slice/crio-6c6b0a29a4150d8cc47b746bca4d2fdffc80fd4982007ee3cf5ea836385c38f1 WatchSource:0}: Error finding container 6c6b0a29a4150d8cc47b746bca4d2fdffc80fd4982007ee3cf5ea836385c38f1: Status 404 returned error can't find the container with id 6c6b0a29a4150d8cc47b746bca4d2fdffc80fd4982007ee3cf5ea836385c38f1 Oct 09 14:04:56 crc kubenswrapper[4902]: I1009 14:04:56.938744 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 09 14:04:56 crc kubenswrapper[4902]: I1009 14:04:56.941152 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Oct 09 14:04:56 crc kubenswrapper[4902]: I1009 14:04:56.948521 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-szhr6" Oct 09 14:04:56 crc kubenswrapper[4902]: I1009 14:04:56.948655 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Oct 09 14:04:56 crc kubenswrapper[4902]: I1009 14:04:56.948830 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Oct 09 14:04:56 crc kubenswrapper[4902]: I1009 14:04:56.949167 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Oct 09 14:04:56 crc kubenswrapper[4902]: I1009 14:04:56.954098 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 09 14:04:57 crc kubenswrapper[4902]: I1009 14:04:57.043188 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 09 14:04:57 crc kubenswrapper[4902]: I1009 14:04:57.051130 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-sb-0\" (UID: \"1676b099-df2c-477b-a05b-b46d47dc3b05\") " pod="openstack/ovsdbserver-sb-0" Oct 09 14:04:57 crc kubenswrapper[4902]: I1009 14:04:57.051183 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/1676b099-df2c-477b-a05b-b46d47dc3b05-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"1676b099-df2c-477b-a05b-b46d47dc3b05\") " pod="openstack/ovsdbserver-sb-0" Oct 09 14:04:57 crc kubenswrapper[4902]: I1009 14:04:57.051249 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-api-access-2tmkv\" (UniqueName: \"kubernetes.io/projected/1676b099-df2c-477b-a05b-b46d47dc3b05-kube-api-access-2tmkv\") pod \"ovsdbserver-sb-0\" (UID: \"1676b099-df2c-477b-a05b-b46d47dc3b05\") " pod="openstack/ovsdbserver-sb-0" Oct 09 14:04:57 crc kubenswrapper[4902]: I1009 14:04:57.051279 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/1676b099-df2c-477b-a05b-b46d47dc3b05-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"1676b099-df2c-477b-a05b-b46d47dc3b05\") " pod="openstack/ovsdbserver-sb-0" Oct 09 14:04:57 crc kubenswrapper[4902]: I1009 14:04:57.051301 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1676b099-df2c-477b-a05b-b46d47dc3b05-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"1676b099-df2c-477b-a05b-b46d47dc3b05\") " pod="openstack/ovsdbserver-sb-0" Oct 09 14:04:57 crc kubenswrapper[4902]: I1009 14:04:57.051334 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1676b099-df2c-477b-a05b-b46d47dc3b05-config\") pod \"ovsdbserver-sb-0\" (UID: \"1676b099-df2c-477b-a05b-b46d47dc3b05\") " pod="openstack/ovsdbserver-sb-0" Oct 09 14:04:57 crc kubenswrapper[4902]: I1009 14:04:57.051350 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/1676b099-df2c-477b-a05b-b46d47dc3b05-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"1676b099-df2c-477b-a05b-b46d47dc3b05\") " pod="openstack/ovsdbserver-sb-0" Oct 09 14:04:57 crc kubenswrapper[4902]: I1009 14:04:57.051364 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1676b099-df2c-477b-a05b-b46d47dc3b05-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"1676b099-df2c-477b-a05b-b46d47dc3b05\") " pod="openstack/ovsdbserver-sb-0" Oct 09 14:04:57 crc kubenswrapper[4902]: W1009 14:04:57.062532 4902 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podea130bce_ed3c_495f_b06b_14278e3133ca.slice/crio-44842bcdd1ce8660c323878e144123f3c53e00e9d5f884863ca1b33342333872 WatchSource:0}: Error finding container 44842bcdd1ce8660c323878e144123f3c53e00e9d5f884863ca1b33342333872: Status 404 returned error can't find the container with id 44842bcdd1ce8660c323878e144123f3c53e00e9d5f884863ca1b33342333872 Oct 09 14:04:57 crc kubenswrapper[4902]: I1009 14:04:57.152367 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-sb-0\" (UID: \"1676b099-df2c-477b-a05b-b46d47dc3b05\") " pod="openstack/ovsdbserver-sb-0" Oct 09 14:04:57 crc kubenswrapper[4902]: I1009 14:04:57.152843 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/1676b099-df2c-477b-a05b-b46d47dc3b05-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"1676b099-df2c-477b-a05b-b46d47dc3b05\") " pod="openstack/ovsdbserver-sb-0" Oct 09 14:04:57 crc kubenswrapper[4902]: I1009 14:04:57.152948 4902 operation_generator.go:580] "MountVolume.MountDevice succeeded for 
volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-sb-0\" (UID: \"1676b099-df2c-477b-a05b-b46d47dc3b05\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/ovsdbserver-sb-0" Oct 09 14:04:57 crc kubenswrapper[4902]: I1009 14:04:57.154193 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2tmkv\" (UniqueName: \"kubernetes.io/projected/1676b099-df2c-477b-a05b-b46d47dc3b05-kube-api-access-2tmkv\") pod \"ovsdbserver-sb-0\" (UID: \"1676b099-df2c-477b-a05b-b46d47dc3b05\") " pod="openstack/ovsdbserver-sb-0" Oct 09 14:04:57 crc kubenswrapper[4902]: I1009 14:04:57.154243 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/1676b099-df2c-477b-a05b-b46d47dc3b05-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"1676b099-df2c-477b-a05b-b46d47dc3b05\") " pod="openstack/ovsdbserver-sb-0" Oct 09 14:04:57 crc kubenswrapper[4902]: I1009 14:04:57.154272 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1676b099-df2c-477b-a05b-b46d47dc3b05-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"1676b099-df2c-477b-a05b-b46d47dc3b05\") " pod="openstack/ovsdbserver-sb-0" Oct 09 14:04:57 crc kubenswrapper[4902]: I1009 14:04:57.154320 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1676b099-df2c-477b-a05b-b46d47dc3b05-config\") pod \"ovsdbserver-sb-0\" (UID: \"1676b099-df2c-477b-a05b-b46d47dc3b05\") " pod="openstack/ovsdbserver-sb-0" Oct 09 14:04:57 crc kubenswrapper[4902]: I1009 14:04:57.154337 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/1676b099-df2c-477b-a05b-b46d47dc3b05-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"1676b099-df2c-477b-a05b-b46d47dc3b05\") " pod="openstack/ovsdbserver-sb-0" Oct 09 14:04:57 crc kubenswrapper[4902]: I1009 14:04:57.154355 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1676b099-df2c-477b-a05b-b46d47dc3b05-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"1676b099-df2c-477b-a05b-b46d47dc3b05\") " pod="openstack/ovsdbserver-sb-0" Oct 09 14:04:57 crc kubenswrapper[4902]: I1009 14:04:57.157168 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/1676b099-df2c-477b-a05b-b46d47dc3b05-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"1676b099-df2c-477b-a05b-b46d47dc3b05\") " pod="openstack/ovsdbserver-sb-0" Oct 09 14:04:57 crc kubenswrapper[4902]: I1009 14:04:57.157573 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1676b099-df2c-477b-a05b-b46d47dc3b05-config\") pod \"ovsdbserver-sb-0\" (UID: \"1676b099-df2c-477b-a05b-b46d47dc3b05\") " pod="openstack/ovsdbserver-sb-0" Oct 09 14:04:57 crc kubenswrapper[4902]: I1009 14:04:57.160238 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1676b099-df2c-477b-a05b-b46d47dc3b05-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"1676b099-df2c-477b-a05b-b46d47dc3b05\") " pod="openstack/ovsdbserver-sb-0" Oct 09 14:04:57 crc kubenswrapper[4902]: I1009 14:04:57.160600 4902 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/1676b099-df2c-477b-a05b-b46d47dc3b05-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"1676b099-df2c-477b-a05b-b46d47dc3b05\") " pod="openstack/ovsdbserver-sb-0" Oct 09 14:04:57 crc kubenswrapper[4902]: I1009 14:04:57.165422 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/1676b099-df2c-477b-a05b-b46d47dc3b05-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"1676b099-df2c-477b-a05b-b46d47dc3b05\") " pod="openstack/ovsdbserver-sb-0" Oct 09 14:04:57 crc kubenswrapper[4902]: I1009 14:04:57.177333 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2tmkv\" (UniqueName: \"kubernetes.io/projected/1676b099-df2c-477b-a05b-b46d47dc3b05-kube-api-access-2tmkv\") pod \"ovsdbserver-sb-0\" (UID: \"1676b099-df2c-477b-a05b-b46d47dc3b05\") " pod="openstack/ovsdbserver-sb-0" Oct 09 14:04:57 crc kubenswrapper[4902]: I1009 14:04:57.177840 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1676b099-df2c-477b-a05b-b46d47dc3b05-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"1676b099-df2c-477b-a05b-b46d47dc3b05\") " pod="openstack/ovsdbserver-sb-0" Oct 09 14:04:57 crc kubenswrapper[4902]: I1009 14:04:57.194383 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-sb-0\" (UID: \"1676b099-df2c-477b-a05b-b46d47dc3b05\") " pod="openstack/ovsdbserver-sb-0" Oct 09 14:04:57 crc kubenswrapper[4902]: I1009 14:04:57.280802 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Oct 09 14:04:57 crc kubenswrapper[4902]: I1009 14:04:57.291722 4902 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-89v2s" Oct 09 14:04:57 crc kubenswrapper[4902]: I1009 14:04:57.298800 4902 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-vhp5f" Oct 09 14:04:57 crc kubenswrapper[4902]: I1009 14:04:57.357560 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ttskw\" (UniqueName: \"kubernetes.io/projected/8c959e86-9eec-4c81-aae8-b53430ce2695-kube-api-access-ttskw\") pod \"8c959e86-9eec-4c81-aae8-b53430ce2695\" (UID: \"8c959e86-9eec-4c81-aae8-b53430ce2695\") " Oct 09 14:04:57 crc kubenswrapper[4902]: I1009 14:04:57.357681 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8c959e86-9eec-4c81-aae8-b53430ce2695-config\") pod \"8c959e86-9eec-4c81-aae8-b53430ce2695\" (UID: \"8c959e86-9eec-4c81-aae8-b53430ce2695\") " Oct 09 14:04:57 crc kubenswrapper[4902]: I1009 14:04:57.357760 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ptd9l\" (UniqueName: \"kubernetes.io/projected/c44929fa-d053-4d3a-8c52-b4ec6e983a96-kube-api-access-ptd9l\") pod \"c44929fa-d053-4d3a-8c52-b4ec6e983a96\" (UID: \"c44929fa-d053-4d3a-8c52-b4ec6e983a96\") " Oct 09 14:04:57 crc kubenswrapper[4902]: I1009 14:04:57.357818 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8c959e86-9eec-4c81-aae8-b53430ce2695-dns-svc\") pod \"8c959e86-9eec-4c81-aae8-b53430ce2695\" (UID: \"8c959e86-9eec-4c81-aae8-b53430ce2695\") " Oct 09 14:04:57 crc kubenswrapper[4902]: I1009 14:04:57.357859 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c44929fa-d053-4d3a-8c52-b4ec6e983a96-config\") pod \"c44929fa-d053-4d3a-8c52-b4ec6e983a96\" (UID: \"c44929fa-d053-4d3a-8c52-b4ec6e983a96\") " Oct 09 14:04:57 crc kubenswrapper[4902]: I1009 14:04:57.358653 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8c959e86-9eec-4c81-aae8-b53430ce2695-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "8c959e86-9eec-4c81-aae8-b53430ce2695" (UID: "8c959e86-9eec-4c81-aae8-b53430ce2695"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 14:04:57 crc kubenswrapper[4902]: I1009 14:04:57.358685 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8c959e86-9eec-4c81-aae8-b53430ce2695-config" (OuterVolumeSpecName: "config") pod "8c959e86-9eec-4c81-aae8-b53430ce2695" (UID: "8c959e86-9eec-4c81-aae8-b53430ce2695"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 14:04:57 crc kubenswrapper[4902]: I1009 14:04:57.358839 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c44929fa-d053-4d3a-8c52-b4ec6e983a96-config" (OuterVolumeSpecName: "config") pod "c44929fa-d053-4d3a-8c52-b4ec6e983a96" (UID: "c44929fa-d053-4d3a-8c52-b4ec6e983a96"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 14:04:57 crc kubenswrapper[4902]: I1009 14:04:57.362440 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c44929fa-d053-4d3a-8c52-b4ec6e983a96-kube-api-access-ptd9l" (OuterVolumeSpecName: "kube-api-access-ptd9l") pod "c44929fa-d053-4d3a-8c52-b4ec6e983a96" (UID: "c44929fa-d053-4d3a-8c52-b4ec6e983a96"). InnerVolumeSpecName "kube-api-access-ptd9l". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 14:04:57 crc kubenswrapper[4902]: I1009 14:04:57.375039 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c959e86-9eec-4c81-aae8-b53430ce2695-kube-api-access-ttskw" (OuterVolumeSpecName: "kube-api-access-ttskw") pod "8c959e86-9eec-4c81-aae8-b53430ce2695" (UID: "8c959e86-9eec-4c81-aae8-b53430ce2695"). InnerVolumeSpecName "kube-api-access-ttskw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 14:04:57 crc kubenswrapper[4902]: I1009 14:04:57.460571 4902 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8c959e86-9eec-4c81-aae8-b53430ce2695-config\") on node \"crc\" DevicePath \"\"" Oct 09 14:04:57 crc kubenswrapper[4902]: I1009 14:04:57.460864 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ptd9l\" (UniqueName: \"kubernetes.io/projected/c44929fa-d053-4d3a-8c52-b4ec6e983a96-kube-api-access-ptd9l\") on node \"crc\" DevicePath \"\"" Oct 09 14:04:57 crc kubenswrapper[4902]: I1009 14:04:57.460878 4902 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8c959e86-9eec-4c81-aae8-b53430ce2695-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 09 14:04:57 crc kubenswrapper[4902]: I1009 14:04:57.460892 4902 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c44929fa-d053-4d3a-8c52-b4ec6e983a96-config\") on node \"crc\" DevicePath \"\"" Oct 09 14:04:57 crc kubenswrapper[4902]: I1009 14:04:57.460904 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ttskw\" (UniqueName: \"kubernetes.io/projected/8c959e86-9eec-4c81-aae8-b53430ce2695-kube-api-access-ttskw\") on node \"crc\" DevicePath \"\"" Oct 09 14:04:57 crc kubenswrapper[4902]: I1009 14:04:57.865506 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 09 14:04:57 crc kubenswrapper[4902]: I1009 14:04:57.895799 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-89v2s" event={"ID":"c44929fa-d053-4d3a-8c52-b4ec6e983a96","Type":"ContainerDied","Data":"7346a189d5d52443bd4f9ce83ae61351456541f1c2710eb68aefcd7abab3e230"} Oct 09 14:04:57 crc kubenswrapper[4902]: I1009 14:04:57.895833 4902 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-89v2s" Oct 09 14:04:57 crc kubenswrapper[4902]: I1009 14:04:57.899574 4902 generic.go:334] "Generic (PLEG): container finished" podID="75f07b5d-6f2c-473f-ae50-725bcdf084af" containerID="c8b559a2ebfda50821f5b896d48aaeaabd9e840cbcfb1b8c6de690c245f559e8" exitCode=0 Oct 09 14:04:57 crc kubenswrapper[4902]: I1009 14:04:57.899667 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-rgtzl" event={"ID":"75f07b5d-6f2c-473f-ae50-725bcdf084af","Type":"ContainerDied","Data":"c8b559a2ebfda50821f5b896d48aaeaabd9e840cbcfb1b8c6de690c245f559e8"} Oct 09 14:04:57 crc kubenswrapper[4902]: I1009 14:04:57.901292 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"9dc74bb1-5a59-4876-83ff-09c1a2cb6042","Type":"ContainerStarted","Data":"9b6277c3f0e3fe9b0e97ab81a0a7b2a936669693767fae17d4b50dc426089721"} Oct 09 14:04:57 crc kubenswrapper[4902]: I1009 14:04:57.902473 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"ea130bce-ed3c-495f-b06b-14278e3133ca","Type":"ContainerStarted","Data":"44842bcdd1ce8660c323878e144123f3c53e00e9d5f884863ca1b33342333872"} Oct 09 14:04:57 crc kubenswrapper[4902]: I1009 14:04:57.904933 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-79djm" event={"ID":"7f0722a8-eee2-4bb1-a3b4-d14964d35227","Type":"ContainerStarted","Data":"6c6b0a29a4150d8cc47b746bca4d2fdffc80fd4982007ee3cf5ea836385c38f1"} Oct 09 14:04:57 crc kubenswrapper[4902]: I1009 14:04:57.915627 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-vtncl" event={"ID":"ad8edf23-ca42-4f81-a01a-4bd897f23934","Type":"ContainerStarted","Data":"1da187a1d8a4506dcb0ea8da2856bd47aab43e6a02dc078aa036caf218aeec71"} Oct 09 14:04:57 crc kubenswrapper[4902]: I1009 14:04:57.922024 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-vhp5f" event={"ID":"8c959e86-9eec-4c81-aae8-b53430ce2695","Type":"ContainerDied","Data":"2671be6613dfaa85520384400a57774d1162648f0009830a4f4c3ede44562c44"} Oct 09 14:04:57 crc kubenswrapper[4902]: I1009 14:04:57.922119 4902 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-vhp5f" Oct 09 14:04:57 crc kubenswrapper[4902]: I1009 14:04:57.970293 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-89v2s"] Oct 09 14:04:57 crc kubenswrapper[4902]: I1009 14:04:57.976118 4902 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-89v2s"] Oct 09 14:04:58 crc kubenswrapper[4902]: I1009 14:04:58.013052 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-rlj4x"] Oct 09 14:04:58 crc kubenswrapper[4902]: I1009 14:04:58.024226 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-vhp5f"] Oct 09 14:04:58 crc kubenswrapper[4902]: I1009 14:04:58.031190 4902 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-vhp5f"] Oct 09 14:04:59 crc kubenswrapper[4902]: W1009 14:04:59.227205 4902 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1676b099_df2c_477b_a05b_b46d47dc3b05.slice/crio-4f60e34aed0dda00408dbafb6a5c9bf8e8f9c958009cfe9da1dbd0959433d520 WatchSource:0}: Error finding container 4f60e34aed0dda00408dbafb6a5c9bf8e8f9c958009cfe9da1dbd0959433d520: Status 404 returned error can't find the container with id 4f60e34aed0dda00408dbafb6a5c9bf8e8f9c958009cfe9da1dbd0959433d520 Oct 09 14:04:59 crc kubenswrapper[4902]: I1009 14:04:59.525897 4902 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c959e86-9eec-4c81-aae8-b53430ce2695" path="/var/lib/kubelet/pods/8c959e86-9eec-4c81-aae8-b53430ce2695/volumes" Oct 09 14:04:59 crc kubenswrapper[4902]: I1009 14:04:59.526332 4902 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c44929fa-d053-4d3a-8c52-b4ec6e983a96" path="/var/lib/kubelet/pods/c44929fa-d053-4d3a-8c52-b4ec6e983a96/volumes" Oct 09 14:04:59 crc kubenswrapper[4902]: I1009 14:04:59.944589 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-rlj4x" event={"ID":"007b48e7-2e7a-45e6-bc70-1c86a275d808","Type":"ContainerStarted","Data":"fd1e4138c8f2df4551faed4c871b7d73c0646bbd96750ec881f98f7112a3281a"} Oct 09 14:04:59 crc kubenswrapper[4902]: I1009 14:04:59.946300 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"1676b099-df2c-477b-a05b-b46d47dc3b05","Type":"ContainerStarted","Data":"4f60e34aed0dda00408dbafb6a5c9bf8e8f9c958009cfe9da1dbd0959433d520"} Oct 09 14:05:02 crc kubenswrapper[4902]: I1009 14:05:02.969165 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-rgtzl" event={"ID":"75f07b5d-6f2c-473f-ae50-725bcdf084af","Type":"ContainerStarted","Data":"a813e77d238b372ae7d81afb177521ae8af473a236e6912c56737c7d3e16d74c"} Oct 09 14:05:02 crc kubenswrapper[4902]: I1009 14:05:02.970543 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5ccc8479f9-rgtzl" Oct 09 14:05:02 crc kubenswrapper[4902]: I1009 14:05:02.988277 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5ccc8479f9-rgtzl" podStartSLOduration=19.535192464 podStartE2EDuration="19.98825983s" podCreationTimestamp="2025-10-09 14:04:43 +0000 UTC" firstStartedPulling="2025-10-09 14:04:56.528019307 +0000 UTC m=+843.725878371" lastFinishedPulling="2025-10-09 14:04:56.981086683 +0000 UTC m=+844.178945737" observedRunningTime="2025-10-09 14:05:02.986310948 +0000 UTC 
m=+850.184170022" watchObservedRunningTime="2025-10-09 14:05:02.98825983 +0000 UTC m=+850.186118894" Oct 09 14:05:06 crc kubenswrapper[4902]: I1009 14:05:06.007684 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"4bb5cfb3-cfb7-4408-bbe1-3bf59b44614e","Type":"ContainerStarted","Data":"ac04ff204ded85f13ac9baef5b110614f07e362602b1765a8973b4455ce8fb11"} Oct 09 14:05:06 crc kubenswrapper[4902]: I1009 14:05:06.009500 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"590e6023-7dbe-499f-a8ea-4b8c3e24f747","Type":"ContainerStarted","Data":"170a2f54f57c2d8447ba5a840e87c0c65eacee79f3a7702994ccacb7820b9829"} Oct 09 14:05:06 crc kubenswrapper[4902]: I1009 14:05:06.009626 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Oct 09 14:05:06 crc kubenswrapper[4902]: I1009 14:05:06.011115 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"a4dec46d-073a-484c-ba80-0ff939025e48","Type":"ContainerStarted","Data":"416242b1a15ac39e0eed89742483d9b0e76bb6c535dbe29a5abf644c32d8e69e"} Oct 09 14:05:06 crc kubenswrapper[4902]: I1009 14:05:06.072052 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=13.373714988 podStartE2EDuration="19.072037519s" podCreationTimestamp="2025-10-09 14:04:47 +0000 UTC" firstStartedPulling="2025-10-09 14:04:56.497220007 +0000 UTC m=+843.695079081" lastFinishedPulling="2025-10-09 14:05:02.195542548 +0000 UTC m=+849.393401612" observedRunningTime="2025-10-09 14:05:06.071381131 +0000 UTC m=+853.269240205" watchObservedRunningTime="2025-10-09 14:05:06.072037519 +0000 UTC m=+853.269896583" Oct 09 14:05:07 crc kubenswrapper[4902]: I1009 14:05:07.021485 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"1676b099-df2c-477b-a05b-b46d47dc3b05","Type":"ContainerStarted","Data":"f579deccfdee4c841e0b27c9f402b3f13f2c8f049c33905d2bb82505bb44c1b2"} Oct 09 14:05:07 crc kubenswrapper[4902]: I1009 14:05:07.023310 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"5cdfacc8-b636-448a-bdc9-30b7a851aa8f","Type":"ContainerStarted","Data":"804388511a83b9bbcfe911cb67ebf26bcf159b46ac05551143b014d995792cd7"} Oct 09 14:05:07 crc kubenswrapper[4902]: I1009 14:05:07.025810 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"9dc74bb1-5a59-4876-83ff-09c1a2cb6042","Type":"ContainerStarted","Data":"3d9139f332643896c9a5dfb4564992a3f14e8a4a98d85e66efe7c26be95cfe1f"} Oct 09 14:05:07 crc kubenswrapper[4902]: I1009 14:05:07.026320 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Oct 09 14:05:07 crc kubenswrapper[4902]: I1009 14:05:07.027821 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"c9c6af38-1605-4d47-bc0c-967053235667","Type":"ContainerStarted","Data":"c5bda7339b716d82f1eeabbe60fece1807c197c31bdcab9c371b064bef49bbab"} Oct 09 14:05:07 crc kubenswrapper[4902]: I1009 14:05:07.029922 4902 generic.go:334] "Generic (PLEG): container finished" podID="007b48e7-2e7a-45e6-bc70-1c86a275d808" containerID="f62caf0c19f4d4e32940aa47b328029ffcacb0f0d5830b2b3965466b70c006b8" exitCode=0 Oct 09 14:05:07 crc kubenswrapper[4902]: I1009 14:05:07.029994 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ovn-controller-ovs-rlj4x" event={"ID":"007b48e7-2e7a-45e6-bc70-1c86a275d808","Type":"ContainerDied","Data":"f62caf0c19f4d4e32940aa47b328029ffcacb0f0d5830b2b3965466b70c006b8"} Oct 09 14:05:07 crc kubenswrapper[4902]: I1009 14:05:07.033570 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"ea130bce-ed3c-495f-b06b-14278e3133ca","Type":"ContainerStarted","Data":"11baf677a39ae54aa814098216804639eb0efc2cebe5c1f980f9fc6858d49f7f"} Oct 09 14:05:07 crc kubenswrapper[4902]: I1009 14:05:07.036372 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-79djm" event={"ID":"7f0722a8-eee2-4bb1-a3b4-d14964d35227","Type":"ContainerStarted","Data":"e4d035b8aba84be65cb27b19c930f7b18f97a2231b17b0911e5ffa6be95fb5f1"} Oct 09 14:05:07 crc kubenswrapper[4902]: I1009 14:05:07.037357 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-79djm" Oct 09 14:05:07 crc kubenswrapper[4902]: I1009 14:05:07.038809 4902 generic.go:334] "Generic (PLEG): container finished" podID="ad8edf23-ca42-4f81-a01a-4bd897f23934" containerID="b716cb8932cc43dace3456c1b09e41e873eddb43657fdc87e66f8e80486f5d1f" exitCode=0 Oct 09 14:05:07 crc kubenswrapper[4902]: I1009 14:05:07.039528 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-vtncl" event={"ID":"ad8edf23-ca42-4f81-a01a-4bd897f23934","Type":"ContainerDied","Data":"b716cb8932cc43dace3456c1b09e41e873eddb43657fdc87e66f8e80486f5d1f"} Oct 09 14:05:07 crc kubenswrapper[4902]: I1009 14:05:07.085186 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-79djm" podStartSLOduration=6.065158734 podStartE2EDuration="14.085170355s" podCreationTimestamp="2025-10-09 14:04:53 +0000 UTC" firstStartedPulling="2025-10-09 14:04:56.90755212 +0000 UTC m=+844.105411184" lastFinishedPulling="2025-10-09 14:05:04.927563751 +0000 UTC m=+852.125422805" observedRunningTime="2025-10-09 14:05:07.08089665 +0000 UTC m=+854.278755744" watchObservedRunningTime="2025-10-09 14:05:07.085170355 +0000 UTC m=+854.283029409" Oct 09 14:05:07 crc kubenswrapper[4902]: I1009 14:05:07.137307 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=9.579609933 podStartE2EDuration="18.13728474s" podCreationTimestamp="2025-10-09 14:04:49 +0000 UTC" firstStartedPulling="2025-10-09 14:04:56.900150651 +0000 UTC m=+844.098009715" lastFinishedPulling="2025-10-09 14:05:05.457825458 +0000 UTC m=+852.655684522" observedRunningTime="2025-10-09 14:05:07.132948063 +0000 UTC m=+854.330807127" watchObservedRunningTime="2025-10-09 14:05:07.13728474 +0000 UTC m=+854.335143814" Oct 09 14:05:08 crc kubenswrapper[4902]: I1009 14:05:08.051526 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-vtncl" event={"ID":"ad8edf23-ca42-4f81-a01a-4bd897f23934","Type":"ContainerStarted","Data":"49c1a49e4283c9bb8aaf2840f43fc5ee05b89cf0d30d9ecae8b0ba2e9b75880c"} Oct 09 14:05:08 crc kubenswrapper[4902]: I1009 14:05:08.051919 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57d769cc4f-vtncl" Oct 09 14:05:08 crc kubenswrapper[4902]: I1009 14:05:08.057839 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-rlj4x" 
event={"ID":"007b48e7-2e7a-45e6-bc70-1c86a275d808","Type":"ContainerStarted","Data":"a6e3168a43de3331a667bfce5c86650cbe544e8b1f06f1ab38fbbb44e823efd6"} Oct 09 14:05:08 crc kubenswrapper[4902]: I1009 14:05:08.058004 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-rlj4x" event={"ID":"007b48e7-2e7a-45e6-bc70-1c86a275d808","Type":"ContainerStarted","Data":"a3a461d78f618d1643066007c28414871b5dff6188d9c292b7cf9ce66db736c2"} Oct 09 14:05:08 crc kubenswrapper[4902]: I1009 14:05:08.058508 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-rlj4x" Oct 09 14:05:08 crc kubenswrapper[4902]: I1009 14:05:08.058629 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-rlj4x" Oct 09 14:05:08 crc kubenswrapper[4902]: I1009 14:05:08.076391 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57d769cc4f-vtncl" podStartSLOduration=17.710751404 podStartE2EDuration="25.07636832s" podCreationTimestamp="2025-10-09 14:04:43 +0000 UTC" firstStartedPulling="2025-10-09 14:04:56.933631313 +0000 UTC m=+844.131490377" lastFinishedPulling="2025-10-09 14:05:04.299248229 +0000 UTC m=+851.497107293" observedRunningTime="2025-10-09 14:05:08.072348252 +0000 UTC m=+855.270207336" watchObservedRunningTime="2025-10-09 14:05:08.07636832 +0000 UTC m=+855.274227394" Oct 09 14:05:08 crc kubenswrapper[4902]: I1009 14:05:08.098943 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-rlj4x" podStartSLOduration=9.568990676 podStartE2EDuration="15.098917048s" podCreationTimestamp="2025-10-09 14:04:53 +0000 UTC" firstStartedPulling="2025-10-09 14:04:59.228983331 +0000 UTC m=+846.426842395" lastFinishedPulling="2025-10-09 14:05:04.758909683 +0000 UTC m=+851.956768767" observedRunningTime="2025-10-09 14:05:08.094142869 +0000 UTC m=+855.292001943" watchObservedRunningTime="2025-10-09 14:05:08.098917048 +0000 UTC m=+855.296776112" Oct 09 14:05:08 crc kubenswrapper[4902]: I1009 14:05:08.475427 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5ccc8479f9-rgtzl" Oct 09 14:05:09 crc kubenswrapper[4902]: I1009 14:05:09.066859 4902 generic.go:334] "Generic (PLEG): container finished" podID="a4dec46d-073a-484c-ba80-0ff939025e48" containerID="416242b1a15ac39e0eed89742483d9b0e76bb6c535dbe29a5abf644c32d8e69e" exitCode=0 Oct 09 14:05:09 crc kubenswrapper[4902]: I1009 14:05:09.066956 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"a4dec46d-073a-484c-ba80-0ff939025e48","Type":"ContainerDied","Data":"416242b1a15ac39e0eed89742483d9b0e76bb6c535dbe29a5abf644c32d8e69e"} Oct 09 14:05:10 crc kubenswrapper[4902]: I1009 14:05:10.077813 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"ea130bce-ed3c-495f-b06b-14278e3133ca","Type":"ContainerStarted","Data":"693fcd3c5ce0673c2774af29c663fe8dc1a7a4bc0196217ec4621b25b205433c"} Oct 09 14:05:10 crc kubenswrapper[4902]: I1009 14:05:10.081448 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"1676b099-df2c-477b-a05b-b46d47dc3b05","Type":"ContainerStarted","Data":"b3ca783819d50f172b88c80a2f8914428676926db8fafa7be9a7ff224b232376"} Oct 09 14:05:10 crc kubenswrapper[4902]: I1009 14:05:10.083299 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/openstack-cell1-galera-0" event={"ID":"a4dec46d-073a-484c-ba80-0ff939025e48","Type":"ContainerStarted","Data":"bc9a1831b620261253c157a37372f9086205d4c3457ae0658fad3039389c22e0"} Oct 09 14:05:10 crc kubenswrapper[4902]: I1009 14:05:10.106170 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=5.508842922 podStartE2EDuration="18.106154948s" podCreationTimestamp="2025-10-09 14:04:52 +0000 UTC" firstStartedPulling="2025-10-09 14:04:57.065298354 +0000 UTC m=+844.263157418" lastFinishedPulling="2025-10-09 14:05:09.66261038 +0000 UTC m=+856.860469444" observedRunningTime="2025-10-09 14:05:10.102319764 +0000 UTC m=+857.300178838" watchObservedRunningTime="2025-10-09 14:05:10.106154948 +0000 UTC m=+857.304014012" Oct 09 14:05:10 crc kubenswrapper[4902]: I1009 14:05:10.123270 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=18.919135023 podStartE2EDuration="24.123250729s" podCreationTimestamp="2025-10-09 14:04:46 +0000 UTC" firstStartedPulling="2025-10-09 14:04:56.673175651 +0000 UTC m=+843.871034715" lastFinishedPulling="2025-10-09 14:05:01.877291347 +0000 UTC m=+849.075150421" observedRunningTime="2025-10-09 14:05:10.120839674 +0000 UTC m=+857.318698748" watchObservedRunningTime="2025-10-09 14:05:10.123250729 +0000 UTC m=+857.321109793" Oct 09 14:05:10 crc kubenswrapper[4902]: I1009 14:05:10.144212 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=4.710329138 podStartE2EDuration="15.144193073s" podCreationTimestamp="2025-10-09 14:04:55 +0000 UTC" firstStartedPulling="2025-10-09 14:04:59.229866205 +0000 UTC m=+846.427725269" lastFinishedPulling="2025-10-09 14:05:09.66373014 +0000 UTC m=+856.861589204" observedRunningTime="2025-10-09 14:05:10.142277962 +0000 UTC m=+857.340137036" watchObservedRunningTime="2025-10-09 14:05:10.144193073 +0000 UTC m=+857.342052137" Oct 09 14:05:11 crc kubenswrapper[4902]: I1009 14:05:11.097863 4902 generic.go:334] "Generic (PLEG): container finished" podID="4bb5cfb3-cfb7-4408-bbe1-3bf59b44614e" containerID="ac04ff204ded85f13ac9baef5b110614f07e362602b1765a8973b4455ce8fb11" exitCode=0 Oct 09 14:05:11 crc kubenswrapper[4902]: I1009 14:05:11.097913 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"4bb5cfb3-cfb7-4408-bbe1-3bf59b44614e","Type":"ContainerDied","Data":"ac04ff204ded85f13ac9baef5b110614f07e362602b1765a8973b4455ce8fb11"} Oct 09 14:05:12 crc kubenswrapper[4902]: I1009 14:05:12.108832 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"4bb5cfb3-cfb7-4408-bbe1-3bf59b44614e","Type":"ContainerStarted","Data":"5d24ae1ef4ccd955a10e08eb3bd784cc9e1aee6dfdfcfcb5c8e7869cad9a614b"} Oct 09 14:05:12 crc kubenswrapper[4902]: I1009 14:05:12.111678 4902 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Oct 09 14:05:12 crc kubenswrapper[4902]: I1009 14:05:12.147791 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=18.898591949 podStartE2EDuration="26.147771986s" podCreationTimestamp="2025-10-09 14:04:46 +0000 UTC" firstStartedPulling="2025-10-09 14:04:56.697893037 +0000 UTC m=+843.895752101" lastFinishedPulling="2025-10-09 14:05:03.947073064 +0000 UTC m=+851.144932138" observedRunningTime="2025-10-09 
14:05:12.143249894 +0000 UTC m=+859.341108978" watchObservedRunningTime="2025-10-09 14:05:12.147771986 +0000 UTC m=+859.345631050" Oct 09 14:05:12 crc kubenswrapper[4902]: I1009 14:05:12.164456 4902 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Oct 09 14:05:12 crc kubenswrapper[4902]: I1009 14:05:12.281936 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Oct 09 14:05:12 crc kubenswrapper[4902]: I1009 14:05:12.282729 4902 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Oct 09 14:05:12 crc kubenswrapper[4902]: I1009 14:05:12.338072 4902 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Oct 09 14:05:13 crc kubenswrapper[4902]: I1009 14:05:13.099149 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Oct 09 14:05:13 crc kubenswrapper[4902]: I1009 14:05:13.122716 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Oct 09 14:05:13 crc kubenswrapper[4902]: I1009 14:05:13.174138 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Oct 09 14:05:13 crc kubenswrapper[4902]: I1009 14:05:13.175226 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Oct 09 14:05:13 crc kubenswrapper[4902]: I1009 14:05:13.469266 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-vtncl"] Oct 09 14:05:13 crc kubenswrapper[4902]: I1009 14:05:13.469542 4902 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57d769cc4f-vtncl" podUID="ad8edf23-ca42-4f81-a01a-4bd897f23934" containerName="dnsmasq-dns" containerID="cri-o://49c1a49e4283c9bb8aaf2840f43fc5ee05b89cf0d30d9ecae8b0ba2e9b75880c" gracePeriod=10 Oct 09 14:05:13 crc kubenswrapper[4902]: I1009 14:05:13.472672 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-57d769cc4f-vtncl" Oct 09 14:05:13 crc kubenswrapper[4902]: I1009 14:05:13.503944 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-89klq"] Oct 09 14:05:13 crc kubenswrapper[4902]: I1009 14:05:13.505582 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-89klq" Oct 09 14:05:13 crc kubenswrapper[4902]: I1009 14:05:13.508925 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Oct 09 14:05:13 crc kubenswrapper[4902]: I1009 14:05:13.522828 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-89klq"] Oct 09 14:05:13 crc kubenswrapper[4902]: I1009 14:05:13.569471 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-pm8hp"] Oct 09 14:05:13 crc kubenswrapper[4902]: I1009 14:05:13.570843 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-pm8hp" Oct 09 14:05:13 crc kubenswrapper[4902]: I1009 14:05:13.573785 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Oct 09 14:05:13 crc kubenswrapper[4902]: I1009 14:05:13.583288 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wvx95\" (UniqueName: \"kubernetes.io/projected/9114ffa5-d3de-48b6-8bdd-9280a3e1adc8-kube-api-access-wvx95\") pod \"dnsmasq-dns-7fd796d7df-89klq\" (UID: \"9114ffa5-d3de-48b6-8bdd-9280a3e1adc8\") " pod="openstack/dnsmasq-dns-7fd796d7df-89klq" Oct 09 14:05:13 crc kubenswrapper[4902]: I1009 14:05:13.583695 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9114ffa5-d3de-48b6-8bdd-9280a3e1adc8-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-89klq\" (UID: \"9114ffa5-d3de-48b6-8bdd-9280a3e1adc8\") " pod="openstack/dnsmasq-dns-7fd796d7df-89klq" Oct 09 14:05:13 crc kubenswrapper[4902]: I1009 14:05:13.584006 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9114ffa5-d3de-48b6-8bdd-9280a3e1adc8-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-89klq\" (UID: \"9114ffa5-d3de-48b6-8bdd-9280a3e1adc8\") " pod="openstack/dnsmasq-dns-7fd796d7df-89klq" Oct 09 14:05:13 crc kubenswrapper[4902]: I1009 14:05:13.584149 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9114ffa5-d3de-48b6-8bdd-9280a3e1adc8-config\") pod \"dnsmasq-dns-7fd796d7df-89klq\" (UID: \"9114ffa5-d3de-48b6-8bdd-9280a3e1adc8\") " pod="openstack/dnsmasq-dns-7fd796d7df-89klq" Oct 09 14:05:13 crc kubenswrapper[4902]: I1009 14:05:13.593577 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-pm8hp"] Oct 09 14:05:13 crc kubenswrapper[4902]: I1009 14:05:13.680595 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-89klq"] Oct 09 14:05:13 crc kubenswrapper[4902]: E1009 14:05:13.681922 4902 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[config dns-svc kube-api-access-wvx95 ovsdbserver-nb], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/dnsmasq-dns-7fd796d7df-89klq" podUID="9114ffa5-d3de-48b6-8bdd-9280a3e1adc8" Oct 09 14:05:13 crc kubenswrapper[4902]: I1009 14:05:13.689762 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9114ffa5-d3de-48b6-8bdd-9280a3e1adc8-config\") pod \"dnsmasq-dns-7fd796d7df-89klq\" (UID: \"9114ffa5-d3de-48b6-8bdd-9280a3e1adc8\") " pod="openstack/dnsmasq-dns-7fd796d7df-89klq" Oct 09 14:05:13 crc kubenswrapper[4902]: I1009 14:05:13.689850 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/aa516b5f-aa20-4f8a-bf1c-3bba6bb72ffb-ovn-rundir\") pod \"ovn-controller-metrics-pm8hp\" (UID: \"aa516b5f-aa20-4f8a-bf1c-3bba6bb72ffb\") " pod="openstack/ovn-controller-metrics-pm8hp" Oct 09 14:05:13 crc kubenswrapper[4902]: I1009 14:05:13.689902 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vptl7\" (UniqueName: 
\"kubernetes.io/projected/aa516b5f-aa20-4f8a-bf1c-3bba6bb72ffb-kube-api-access-vptl7\") pod \"ovn-controller-metrics-pm8hp\" (UID: \"aa516b5f-aa20-4f8a-bf1c-3bba6bb72ffb\") " pod="openstack/ovn-controller-metrics-pm8hp" Oct 09 14:05:13 crc kubenswrapper[4902]: I1009 14:05:13.689934 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wvx95\" (UniqueName: \"kubernetes.io/projected/9114ffa5-d3de-48b6-8bdd-9280a3e1adc8-kube-api-access-wvx95\") pod \"dnsmasq-dns-7fd796d7df-89klq\" (UID: \"9114ffa5-d3de-48b6-8bdd-9280a3e1adc8\") " pod="openstack/dnsmasq-dns-7fd796d7df-89klq" Oct 09 14:05:13 crc kubenswrapper[4902]: I1009 14:05:13.689960 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9114ffa5-d3de-48b6-8bdd-9280a3e1adc8-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-89klq\" (UID: \"9114ffa5-d3de-48b6-8bdd-9280a3e1adc8\") " pod="openstack/dnsmasq-dns-7fd796d7df-89klq" Oct 09 14:05:13 crc kubenswrapper[4902]: I1009 14:05:13.689983 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa516b5f-aa20-4f8a-bf1c-3bba6bb72ffb-config\") pod \"ovn-controller-metrics-pm8hp\" (UID: \"aa516b5f-aa20-4f8a-bf1c-3bba6bb72ffb\") " pod="openstack/ovn-controller-metrics-pm8hp" Oct 09 14:05:13 crc kubenswrapper[4902]: I1009 14:05:13.690002 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/aa516b5f-aa20-4f8a-bf1c-3bba6bb72ffb-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-pm8hp\" (UID: \"aa516b5f-aa20-4f8a-bf1c-3bba6bb72ffb\") " pod="openstack/ovn-controller-metrics-pm8hp" Oct 09 14:05:13 crc kubenswrapper[4902]: I1009 14:05:13.690031 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/aa516b5f-aa20-4f8a-bf1c-3bba6bb72ffb-ovs-rundir\") pod \"ovn-controller-metrics-pm8hp\" (UID: \"aa516b5f-aa20-4f8a-bf1c-3bba6bb72ffb\") " pod="openstack/ovn-controller-metrics-pm8hp" Oct 09 14:05:13 crc kubenswrapper[4902]: I1009 14:05:13.690061 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa516b5f-aa20-4f8a-bf1c-3bba6bb72ffb-combined-ca-bundle\") pod \"ovn-controller-metrics-pm8hp\" (UID: \"aa516b5f-aa20-4f8a-bf1c-3bba6bb72ffb\") " pod="openstack/ovn-controller-metrics-pm8hp" Oct 09 14:05:13 crc kubenswrapper[4902]: I1009 14:05:13.690096 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9114ffa5-d3de-48b6-8bdd-9280a3e1adc8-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-89klq\" (UID: \"9114ffa5-d3de-48b6-8bdd-9280a3e1adc8\") " pod="openstack/dnsmasq-dns-7fd796d7df-89klq" Oct 09 14:05:13 crc kubenswrapper[4902]: I1009 14:05:13.693355 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9114ffa5-d3de-48b6-8bdd-9280a3e1adc8-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-89klq\" (UID: \"9114ffa5-d3de-48b6-8bdd-9280a3e1adc8\") " pod="openstack/dnsmasq-dns-7fd796d7df-89klq" Oct 09 14:05:13 crc kubenswrapper[4902]: I1009 14:05:13.694261 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/9114ffa5-d3de-48b6-8bdd-9280a3e1adc8-config\") pod \"dnsmasq-dns-7fd796d7df-89klq\" (UID: \"9114ffa5-d3de-48b6-8bdd-9280a3e1adc8\") " pod="openstack/dnsmasq-dns-7fd796d7df-89klq" Oct 09 14:05:13 crc kubenswrapper[4902]: I1009 14:05:13.694845 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9114ffa5-d3de-48b6-8bdd-9280a3e1adc8-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-89klq\" (UID: \"9114ffa5-d3de-48b6-8bdd-9280a3e1adc8\") " pod="openstack/dnsmasq-dns-7fd796d7df-89klq" Oct 09 14:05:13 crc kubenswrapper[4902]: I1009 14:05:13.715564 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Oct 09 14:05:13 crc kubenswrapper[4902]: I1009 14:05:13.717266 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Oct 09 14:05:13 crc kubenswrapper[4902]: I1009 14:05:13.724010 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Oct 09 14:05:13 crc kubenswrapper[4902]: I1009 14:05:13.724060 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-shldt" Oct 09 14:05:13 crc kubenswrapper[4902]: I1009 14:05:13.724091 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Oct 09 14:05:13 crc kubenswrapper[4902]: I1009 14:05:13.724150 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Oct 09 14:05:13 crc kubenswrapper[4902]: I1009 14:05:13.741646 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-rtbbp"] Oct 09 14:05:13 crc kubenswrapper[4902]: I1009 14:05:13.743266 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-rtbbp" Oct 09 14:05:13 crc kubenswrapper[4902]: I1009 14:05:13.752107 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-rtbbp"] Oct 09 14:05:13 crc kubenswrapper[4902]: I1009 14:05:13.757714 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Oct 09 14:05:13 crc kubenswrapper[4902]: I1009 14:05:13.761326 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Oct 09 14:05:13 crc kubenswrapper[4902]: I1009 14:05:13.772443 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wvx95\" (UniqueName: \"kubernetes.io/projected/9114ffa5-d3de-48b6-8bdd-9280a3e1adc8-kube-api-access-wvx95\") pod \"dnsmasq-dns-7fd796d7df-89klq\" (UID: \"9114ffa5-d3de-48b6-8bdd-9280a3e1adc8\") " pod="openstack/dnsmasq-dns-7fd796d7df-89klq" Oct 09 14:05:13 crc kubenswrapper[4902]: I1009 14:05:13.791365 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/22ef2931-973d-462a-ae3a-d05056c72468-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"22ef2931-973d-462a-ae3a-d05056c72468\") " pod="openstack/ovn-northd-0" Oct 09 14:05:13 crc kubenswrapper[4902]: I1009 14:05:13.791463 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t4hj9\" (UniqueName: \"kubernetes.io/projected/22ef2931-973d-462a-ae3a-d05056c72468-kube-api-access-t4hj9\") pod \"ovn-northd-0\" (UID: \"22ef2931-973d-462a-ae3a-d05056c72468\") " pod="openstack/ovn-northd-0" Oct 09 14:05:13 crc kubenswrapper[4902]: I1009 14:05:13.791498 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22ef2931-973d-462a-ae3a-d05056c72468-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"22ef2931-973d-462a-ae3a-d05056c72468\") " pod="openstack/ovn-northd-0" Oct 09 14:05:13 crc kubenswrapper[4902]: I1009 14:05:13.791522 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f645f372-beaf-4a39-9957-18b18a365706-config\") pod \"dnsmasq-dns-86db49b7ff-rtbbp\" (UID: \"f645f372-beaf-4a39-9957-18b18a365706\") " pod="openstack/dnsmasq-dns-86db49b7ff-rtbbp" Oct 09 14:05:13 crc kubenswrapper[4902]: I1009 14:05:13.791546 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/22ef2931-973d-462a-ae3a-d05056c72468-scripts\") pod \"ovn-northd-0\" (UID: \"22ef2931-973d-462a-ae3a-d05056c72468\") " pod="openstack/ovn-northd-0" Oct 09 14:05:13 crc kubenswrapper[4902]: I1009 14:05:13.791580 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/aa516b5f-aa20-4f8a-bf1c-3bba6bb72ffb-ovn-rundir\") pod \"ovn-controller-metrics-pm8hp\" (UID: \"aa516b5f-aa20-4f8a-bf1c-3bba6bb72ffb\") " pod="openstack/ovn-controller-metrics-pm8hp" Oct 09 14:05:13 crc kubenswrapper[4902]: I1009 14:05:13.791603 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wrvgd\" (UniqueName: \"kubernetes.io/projected/f645f372-beaf-4a39-9957-18b18a365706-kube-api-access-wrvgd\") pod 
\"dnsmasq-dns-86db49b7ff-rtbbp\" (UID: \"f645f372-beaf-4a39-9957-18b18a365706\") " pod="openstack/dnsmasq-dns-86db49b7ff-rtbbp" Oct 09 14:05:13 crc kubenswrapper[4902]: I1009 14:05:13.791641 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/22ef2931-973d-462a-ae3a-d05056c72468-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"22ef2931-973d-462a-ae3a-d05056c72468\") " pod="openstack/ovn-northd-0" Oct 09 14:05:13 crc kubenswrapper[4902]: I1009 14:05:13.791685 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vptl7\" (UniqueName: \"kubernetes.io/projected/aa516b5f-aa20-4f8a-bf1c-3bba6bb72ffb-kube-api-access-vptl7\") pod \"ovn-controller-metrics-pm8hp\" (UID: \"aa516b5f-aa20-4f8a-bf1c-3bba6bb72ffb\") " pod="openstack/ovn-controller-metrics-pm8hp" Oct 09 14:05:13 crc kubenswrapper[4902]: I1009 14:05:13.791708 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/22ef2931-973d-462a-ae3a-d05056c72468-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"22ef2931-973d-462a-ae3a-d05056c72468\") " pod="openstack/ovn-northd-0" Oct 09 14:05:13 crc kubenswrapper[4902]: I1009 14:05:13.791737 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f645f372-beaf-4a39-9957-18b18a365706-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-rtbbp\" (UID: \"f645f372-beaf-4a39-9957-18b18a365706\") " pod="openstack/dnsmasq-dns-86db49b7ff-rtbbp" Oct 09 14:05:13 crc kubenswrapper[4902]: I1009 14:05:13.791762 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f645f372-beaf-4a39-9957-18b18a365706-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-rtbbp\" (UID: \"f645f372-beaf-4a39-9957-18b18a365706\") " pod="openstack/dnsmasq-dns-86db49b7ff-rtbbp" Oct 09 14:05:13 crc kubenswrapper[4902]: I1009 14:05:13.791791 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22ef2931-973d-462a-ae3a-d05056c72468-config\") pod \"ovn-northd-0\" (UID: \"22ef2931-973d-462a-ae3a-d05056c72468\") " pod="openstack/ovn-northd-0" Oct 09 14:05:13 crc kubenswrapper[4902]: I1009 14:05:13.791829 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa516b5f-aa20-4f8a-bf1c-3bba6bb72ffb-config\") pod \"ovn-controller-metrics-pm8hp\" (UID: \"aa516b5f-aa20-4f8a-bf1c-3bba6bb72ffb\") " pod="openstack/ovn-controller-metrics-pm8hp" Oct 09 14:05:13 crc kubenswrapper[4902]: I1009 14:05:13.791856 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/aa516b5f-aa20-4f8a-bf1c-3bba6bb72ffb-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-pm8hp\" (UID: \"aa516b5f-aa20-4f8a-bf1c-3bba6bb72ffb\") " pod="openstack/ovn-controller-metrics-pm8hp" Oct 09 14:05:13 crc kubenswrapper[4902]: I1009 14:05:13.791903 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/aa516b5f-aa20-4f8a-bf1c-3bba6bb72ffb-ovs-rundir\") pod \"ovn-controller-metrics-pm8hp\" (UID: 
\"aa516b5f-aa20-4f8a-bf1c-3bba6bb72ffb\") " pod="openstack/ovn-controller-metrics-pm8hp" Oct 09 14:05:13 crc kubenswrapper[4902]: I1009 14:05:13.791932 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f645f372-beaf-4a39-9957-18b18a365706-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-rtbbp\" (UID: \"f645f372-beaf-4a39-9957-18b18a365706\") " pod="openstack/dnsmasq-dns-86db49b7ff-rtbbp" Oct 09 14:05:13 crc kubenswrapper[4902]: I1009 14:05:13.791957 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa516b5f-aa20-4f8a-bf1c-3bba6bb72ffb-combined-ca-bundle\") pod \"ovn-controller-metrics-pm8hp\" (UID: \"aa516b5f-aa20-4f8a-bf1c-3bba6bb72ffb\") " pod="openstack/ovn-controller-metrics-pm8hp" Oct 09 14:05:13 crc kubenswrapper[4902]: I1009 14:05:13.792385 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/aa516b5f-aa20-4f8a-bf1c-3bba6bb72ffb-ovn-rundir\") pod \"ovn-controller-metrics-pm8hp\" (UID: \"aa516b5f-aa20-4f8a-bf1c-3bba6bb72ffb\") " pod="openstack/ovn-controller-metrics-pm8hp" Oct 09 14:05:13 crc kubenswrapper[4902]: I1009 14:05:13.792527 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/aa516b5f-aa20-4f8a-bf1c-3bba6bb72ffb-ovs-rundir\") pod \"ovn-controller-metrics-pm8hp\" (UID: \"aa516b5f-aa20-4f8a-bf1c-3bba6bb72ffb\") " pod="openstack/ovn-controller-metrics-pm8hp" Oct 09 14:05:13 crc kubenswrapper[4902]: I1009 14:05:13.794616 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa516b5f-aa20-4f8a-bf1c-3bba6bb72ffb-config\") pod \"ovn-controller-metrics-pm8hp\" (UID: \"aa516b5f-aa20-4f8a-bf1c-3bba6bb72ffb\") " pod="openstack/ovn-controller-metrics-pm8hp" Oct 09 14:05:13 crc kubenswrapper[4902]: I1009 14:05:13.797783 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa516b5f-aa20-4f8a-bf1c-3bba6bb72ffb-combined-ca-bundle\") pod \"ovn-controller-metrics-pm8hp\" (UID: \"aa516b5f-aa20-4f8a-bf1c-3bba6bb72ffb\") " pod="openstack/ovn-controller-metrics-pm8hp" Oct 09 14:05:13 crc kubenswrapper[4902]: I1009 14:05:13.798192 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/aa516b5f-aa20-4f8a-bf1c-3bba6bb72ffb-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-pm8hp\" (UID: \"aa516b5f-aa20-4f8a-bf1c-3bba6bb72ffb\") " pod="openstack/ovn-controller-metrics-pm8hp" Oct 09 14:05:13 crc kubenswrapper[4902]: I1009 14:05:13.810142 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vptl7\" (UniqueName: \"kubernetes.io/projected/aa516b5f-aa20-4f8a-bf1c-3bba6bb72ffb-kube-api-access-vptl7\") pod \"ovn-controller-metrics-pm8hp\" (UID: \"aa516b5f-aa20-4f8a-bf1c-3bba6bb72ffb\") " pod="openstack/ovn-controller-metrics-pm8hp" Oct 09 14:05:13 crc kubenswrapper[4902]: I1009 14:05:13.830186 4902 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-57d769cc4f-vtncl" podUID="ad8edf23-ca42-4f81-a01a-4bd897f23934" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.100:5353: connect: connection refused" Oct 09 14:05:13 crc kubenswrapper[4902]: I1009 
14:05:13.893505 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/22ef2931-973d-462a-ae3a-d05056c72468-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"22ef2931-973d-462a-ae3a-d05056c72468\") " pod="openstack/ovn-northd-0" Oct 09 14:05:13 crc kubenswrapper[4902]: I1009 14:05:13.893560 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t4hj9\" (UniqueName: \"kubernetes.io/projected/22ef2931-973d-462a-ae3a-d05056c72468-kube-api-access-t4hj9\") pod \"ovn-northd-0\" (UID: \"22ef2931-973d-462a-ae3a-d05056c72468\") " pod="openstack/ovn-northd-0" Oct 09 14:05:13 crc kubenswrapper[4902]: I1009 14:05:13.893582 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22ef2931-973d-462a-ae3a-d05056c72468-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"22ef2931-973d-462a-ae3a-d05056c72468\") " pod="openstack/ovn-northd-0" Oct 09 14:05:13 crc kubenswrapper[4902]: I1009 14:05:13.893603 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f645f372-beaf-4a39-9957-18b18a365706-config\") pod \"dnsmasq-dns-86db49b7ff-rtbbp\" (UID: \"f645f372-beaf-4a39-9957-18b18a365706\") " pod="openstack/dnsmasq-dns-86db49b7ff-rtbbp" Oct 09 14:05:13 crc kubenswrapper[4902]: I1009 14:05:13.893621 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/22ef2931-973d-462a-ae3a-d05056c72468-scripts\") pod \"ovn-northd-0\" (UID: \"22ef2931-973d-462a-ae3a-d05056c72468\") " pod="openstack/ovn-northd-0" Oct 09 14:05:13 crc kubenswrapper[4902]: I1009 14:05:13.893644 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wrvgd\" (UniqueName: \"kubernetes.io/projected/f645f372-beaf-4a39-9957-18b18a365706-kube-api-access-wrvgd\") pod \"dnsmasq-dns-86db49b7ff-rtbbp\" (UID: \"f645f372-beaf-4a39-9957-18b18a365706\") " pod="openstack/dnsmasq-dns-86db49b7ff-rtbbp" Oct 09 14:05:13 crc kubenswrapper[4902]: I1009 14:05:13.893669 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/22ef2931-973d-462a-ae3a-d05056c72468-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"22ef2931-973d-462a-ae3a-d05056c72468\") " pod="openstack/ovn-northd-0" Oct 09 14:05:13 crc kubenswrapper[4902]: I1009 14:05:13.893707 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/22ef2931-973d-462a-ae3a-d05056c72468-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"22ef2931-973d-462a-ae3a-d05056c72468\") " pod="openstack/ovn-northd-0" Oct 09 14:05:13 crc kubenswrapper[4902]: I1009 14:05:13.893725 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f645f372-beaf-4a39-9957-18b18a365706-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-rtbbp\" (UID: \"f645f372-beaf-4a39-9957-18b18a365706\") " pod="openstack/dnsmasq-dns-86db49b7ff-rtbbp" Oct 09 14:05:13 crc kubenswrapper[4902]: I1009 14:05:13.893745 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f645f372-beaf-4a39-9957-18b18a365706-dns-svc\") pod 
\"dnsmasq-dns-86db49b7ff-rtbbp\" (UID: \"f645f372-beaf-4a39-9957-18b18a365706\") " pod="openstack/dnsmasq-dns-86db49b7ff-rtbbp" Oct 09 14:05:13 crc kubenswrapper[4902]: I1009 14:05:13.893772 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22ef2931-973d-462a-ae3a-d05056c72468-config\") pod \"ovn-northd-0\" (UID: \"22ef2931-973d-462a-ae3a-d05056c72468\") " pod="openstack/ovn-northd-0" Oct 09 14:05:13 crc kubenswrapper[4902]: I1009 14:05:13.893819 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f645f372-beaf-4a39-9957-18b18a365706-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-rtbbp\" (UID: \"f645f372-beaf-4a39-9957-18b18a365706\") " pod="openstack/dnsmasq-dns-86db49b7ff-rtbbp" Oct 09 14:05:13 crc kubenswrapper[4902]: I1009 14:05:13.894570 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/22ef2931-973d-462a-ae3a-d05056c72468-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"22ef2931-973d-462a-ae3a-d05056c72468\") " pod="openstack/ovn-northd-0" Oct 09 14:05:13 crc kubenswrapper[4902]: I1009 14:05:13.894947 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/22ef2931-973d-462a-ae3a-d05056c72468-scripts\") pod \"ovn-northd-0\" (UID: \"22ef2931-973d-462a-ae3a-d05056c72468\") " pod="openstack/ovn-northd-0" Oct 09 14:05:13 crc kubenswrapper[4902]: I1009 14:05:13.896292 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f645f372-beaf-4a39-9957-18b18a365706-config\") pod \"dnsmasq-dns-86db49b7ff-rtbbp\" (UID: \"f645f372-beaf-4a39-9957-18b18a365706\") " pod="openstack/dnsmasq-dns-86db49b7ff-rtbbp" Oct 09 14:05:13 crc kubenswrapper[4902]: I1009 14:05:13.896294 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f645f372-beaf-4a39-9957-18b18a365706-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-rtbbp\" (UID: \"f645f372-beaf-4a39-9957-18b18a365706\") " pod="openstack/dnsmasq-dns-86db49b7ff-rtbbp" Oct 09 14:05:13 crc kubenswrapper[4902]: I1009 14:05:13.896507 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f645f372-beaf-4a39-9957-18b18a365706-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-rtbbp\" (UID: \"f645f372-beaf-4a39-9957-18b18a365706\") " pod="openstack/dnsmasq-dns-86db49b7ff-rtbbp" Oct 09 14:05:13 crc kubenswrapper[4902]: I1009 14:05:13.896386 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f645f372-beaf-4a39-9957-18b18a365706-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-rtbbp\" (UID: \"f645f372-beaf-4a39-9957-18b18a365706\") " pod="openstack/dnsmasq-dns-86db49b7ff-rtbbp" Oct 09 14:05:13 crc kubenswrapper[4902]: I1009 14:05:13.899088 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22ef2931-973d-462a-ae3a-d05056c72468-config\") pod \"ovn-northd-0\" (UID: \"22ef2931-973d-462a-ae3a-d05056c72468\") " pod="openstack/ovn-northd-0" Oct 09 14:05:13 crc kubenswrapper[4902]: I1009 14:05:13.899169 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/22ef2931-973d-462a-ae3a-d05056c72468-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"22ef2931-973d-462a-ae3a-d05056c72468\") " pod="openstack/ovn-northd-0" Oct 09 14:05:13 crc kubenswrapper[4902]: I1009 14:05:13.900025 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/22ef2931-973d-462a-ae3a-d05056c72468-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"22ef2931-973d-462a-ae3a-d05056c72468\") " pod="openstack/ovn-northd-0" Oct 09 14:05:13 crc kubenswrapper[4902]: I1009 14:05:13.908962 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22ef2931-973d-462a-ae3a-d05056c72468-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"22ef2931-973d-462a-ae3a-d05056c72468\") " pod="openstack/ovn-northd-0" Oct 09 14:05:13 crc kubenswrapper[4902]: I1009 14:05:13.917011 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wrvgd\" (UniqueName: \"kubernetes.io/projected/f645f372-beaf-4a39-9957-18b18a365706-kube-api-access-wrvgd\") pod \"dnsmasq-dns-86db49b7ff-rtbbp\" (UID: \"f645f372-beaf-4a39-9957-18b18a365706\") " pod="openstack/dnsmasq-dns-86db49b7ff-rtbbp" Oct 09 14:05:13 crc kubenswrapper[4902]: I1009 14:05:13.917023 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t4hj9\" (UniqueName: \"kubernetes.io/projected/22ef2931-973d-462a-ae3a-d05056c72468-kube-api-access-t4hj9\") pod \"ovn-northd-0\" (UID: \"22ef2931-973d-462a-ae3a-d05056c72468\") " pod="openstack/ovn-northd-0" Oct 09 14:05:13 crc kubenswrapper[4902]: I1009 14:05:13.941533 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-pm8hp" Oct 09 14:05:14 crc kubenswrapper[4902]: I1009 14:05:14.130461 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-89klq" Oct 09 14:05:14 crc kubenswrapper[4902]: I1009 14:05:14.139794 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-89klq" Oct 09 14:05:14 crc kubenswrapper[4902]: I1009 14:05:14.152828 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Oct 09 14:05:14 crc kubenswrapper[4902]: I1009 14:05:14.168166 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-rtbbp" Oct 09 14:05:14 crc kubenswrapper[4902]: I1009 14:05:14.203473 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9114ffa5-d3de-48b6-8bdd-9280a3e1adc8-ovsdbserver-nb\") pod \"9114ffa5-d3de-48b6-8bdd-9280a3e1adc8\" (UID: \"9114ffa5-d3de-48b6-8bdd-9280a3e1adc8\") " Oct 09 14:05:14 crc kubenswrapper[4902]: I1009 14:05:14.203621 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9114ffa5-d3de-48b6-8bdd-9280a3e1adc8-config\") pod \"9114ffa5-d3de-48b6-8bdd-9280a3e1adc8\" (UID: \"9114ffa5-d3de-48b6-8bdd-9280a3e1adc8\") " Oct 09 14:05:14 crc kubenswrapper[4902]: I1009 14:05:14.203694 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wvx95\" (UniqueName: \"kubernetes.io/projected/9114ffa5-d3de-48b6-8bdd-9280a3e1adc8-kube-api-access-wvx95\") pod \"9114ffa5-d3de-48b6-8bdd-9280a3e1adc8\" (UID: \"9114ffa5-d3de-48b6-8bdd-9280a3e1adc8\") " Oct 09 14:05:14 crc kubenswrapper[4902]: I1009 14:05:14.203766 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9114ffa5-d3de-48b6-8bdd-9280a3e1adc8-dns-svc\") pod \"9114ffa5-d3de-48b6-8bdd-9280a3e1adc8\" (UID: \"9114ffa5-d3de-48b6-8bdd-9280a3e1adc8\") " Oct 09 14:05:14 crc kubenswrapper[4902]: I1009 14:05:14.203976 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9114ffa5-d3de-48b6-8bdd-9280a3e1adc8-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "9114ffa5-d3de-48b6-8bdd-9280a3e1adc8" (UID: "9114ffa5-d3de-48b6-8bdd-9280a3e1adc8"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 14:05:14 crc kubenswrapper[4902]: I1009 14:05:14.204012 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9114ffa5-d3de-48b6-8bdd-9280a3e1adc8-config" (OuterVolumeSpecName: "config") pod "9114ffa5-d3de-48b6-8bdd-9280a3e1adc8" (UID: "9114ffa5-d3de-48b6-8bdd-9280a3e1adc8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 14:05:14 crc kubenswrapper[4902]: I1009 14:05:14.204388 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9114ffa5-d3de-48b6-8bdd-9280a3e1adc8-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9114ffa5-d3de-48b6-8bdd-9280a3e1adc8" (UID: "9114ffa5-d3de-48b6-8bdd-9280a3e1adc8"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 14:05:14 crc kubenswrapper[4902]: I1009 14:05:14.204598 4902 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9114ffa5-d3de-48b6-8bdd-9280a3e1adc8-config\") on node \"crc\" DevicePath \"\"" Oct 09 14:05:14 crc kubenswrapper[4902]: I1009 14:05:14.205596 4902 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9114ffa5-d3de-48b6-8bdd-9280a3e1adc8-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 09 14:05:14 crc kubenswrapper[4902]: I1009 14:05:14.211625 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9114ffa5-d3de-48b6-8bdd-9280a3e1adc8-kube-api-access-wvx95" (OuterVolumeSpecName: "kube-api-access-wvx95") pod "9114ffa5-d3de-48b6-8bdd-9280a3e1adc8" (UID: "9114ffa5-d3de-48b6-8bdd-9280a3e1adc8"). InnerVolumeSpecName "kube-api-access-wvx95". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 14:05:14 crc kubenswrapper[4902]: I1009 14:05:14.306652 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wvx95\" (UniqueName: \"kubernetes.io/projected/9114ffa5-d3de-48b6-8bdd-9280a3e1adc8-kube-api-access-wvx95\") on node \"crc\" DevicePath \"\"" Oct 09 14:05:14 crc kubenswrapper[4902]: I1009 14:05:14.306685 4902 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9114ffa5-d3de-48b6-8bdd-9280a3e1adc8-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 09 14:05:14 crc kubenswrapper[4902]: I1009 14:05:14.458085 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-pm8hp"] Oct 09 14:05:14 crc kubenswrapper[4902]: W1009 14:05:14.465564 4902 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaa516b5f_aa20_4f8a_bf1c_3bba6bb72ffb.slice/crio-54a2546559bb7bc4bba94b5ab75693cd2dab3af1b847964ae8c410a42d79a259 WatchSource:0}: Error finding container 54a2546559bb7bc4bba94b5ab75693cd2dab3af1b847964ae8c410a42d79a259: Status 404 returned error can't find the container with id 54a2546559bb7bc4bba94b5ab75693cd2dab3af1b847964ae8c410a42d79a259 Oct 09 14:05:14 crc kubenswrapper[4902]: I1009 14:05:14.677649 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Oct 09 14:05:14 crc kubenswrapper[4902]: W1009 14:05:14.680959 4902 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod22ef2931_973d_462a_ae3a_d05056c72468.slice/crio-519e836d6ddb812314612a023ad0f505da77073a180b16663f21bbc5770ce17a WatchSource:0}: Error finding container 519e836d6ddb812314612a023ad0f505da77073a180b16663f21bbc5770ce17a: Status 404 returned error can't find the container with id 519e836d6ddb812314612a023ad0f505da77073a180b16663f21bbc5770ce17a Oct 09 14:05:14 crc kubenswrapper[4902]: I1009 14:05:14.690595 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-rtbbp"] Oct 09 14:05:14 crc kubenswrapper[4902]: W1009 14:05:14.700303 4902 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf645f372_beaf_4a39_9957_18b18a365706.slice/crio-4504df46e88dcb3adf28cc5f0092091947c0cee2b23cbdf3666350c72b0945bc WatchSource:0}: Error finding container 4504df46e88dcb3adf28cc5f0092091947c0cee2b23cbdf3666350c72b0945bc: 
Status 404 returned error can't find the container with id 4504df46e88dcb3adf28cc5f0092091947c0cee2b23cbdf3666350c72b0945bc Oct 09 14:05:15 crc kubenswrapper[4902]: I1009 14:05:15.137970 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-pm8hp" event={"ID":"aa516b5f-aa20-4f8a-bf1c-3bba6bb72ffb","Type":"ContainerStarted","Data":"54a2546559bb7bc4bba94b5ab75693cd2dab3af1b847964ae8c410a42d79a259"} Oct 09 14:05:15 crc kubenswrapper[4902]: I1009 14:05:15.139368 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"22ef2931-973d-462a-ae3a-d05056c72468","Type":"ContainerStarted","Data":"519e836d6ddb812314612a023ad0f505da77073a180b16663f21bbc5770ce17a"} Oct 09 14:05:15 crc kubenswrapper[4902]: I1009 14:05:15.141005 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-rtbbp" event={"ID":"f645f372-beaf-4a39-9957-18b18a365706","Type":"ContainerStarted","Data":"4504df46e88dcb3adf28cc5f0092091947c0cee2b23cbdf3666350c72b0945bc"} Oct 09 14:05:15 crc kubenswrapper[4902]: I1009 14:05:15.142849 4902 generic.go:334] "Generic (PLEG): container finished" podID="ad8edf23-ca42-4f81-a01a-4bd897f23934" containerID="49c1a49e4283c9bb8aaf2840f43fc5ee05b89cf0d30d9ecae8b0ba2e9b75880c" exitCode=0 Oct 09 14:05:15 crc kubenswrapper[4902]: I1009 14:05:15.142983 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-vtncl" event={"ID":"ad8edf23-ca42-4f81-a01a-4bd897f23934","Type":"ContainerDied","Data":"49c1a49e4283c9bb8aaf2840f43fc5ee05b89cf0d30d9ecae8b0ba2e9b75880c"} Oct 09 14:05:15 crc kubenswrapper[4902]: I1009 14:05:15.143010 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-89klq" Oct 09 14:05:15 crc kubenswrapper[4902]: I1009 14:05:15.196822 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-89klq"] Oct 09 14:05:15 crc kubenswrapper[4902]: I1009 14:05:15.202236 4902 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-89klq"] Oct 09 14:05:15 crc kubenswrapper[4902]: I1009 14:05:15.525317 4902 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9114ffa5-d3de-48b6-8bdd-9280a3e1adc8" path="/var/lib/kubelet/pods/9114ffa5-d3de-48b6-8bdd-9280a3e1adc8/volumes" Oct 09 14:05:17 crc kubenswrapper[4902]: I1009 14:05:17.464257 4902 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Oct 09 14:05:17 crc kubenswrapper[4902]: I1009 14:05:17.464794 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Oct 09 14:05:17 crc kubenswrapper[4902]: I1009 14:05:17.564042 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Oct 09 14:05:17 crc kubenswrapper[4902]: I1009 14:05:17.564102 4902 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Oct 09 14:05:18 crc kubenswrapper[4902]: I1009 14:05:18.829593 4902 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-57d769cc4f-vtncl" podUID="ad8edf23-ca42-4f81-a01a-4bd897f23934" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.100:5353: connect: connection refused" Oct 09 14:05:19 crc kubenswrapper[4902]: I1009 14:05:19.930746 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/dnsmasq-dns-86db49b7ff-rtbbp"] Oct 09 14:05:19 crc kubenswrapper[4902]: I1009 14:05:19.935546 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Oct 09 14:05:19 crc kubenswrapper[4902]: I1009 14:05:19.993725 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-698758b865-wmxc2"] Oct 09 14:05:19 crc kubenswrapper[4902]: I1009 14:05:19.995164 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-wmxc2" Oct 09 14:05:20 crc kubenswrapper[4902]: I1009 14:05:20.011548 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3ede4964-25d2-4a68-bfb7-84a9cbf633c6-config\") pod \"dnsmasq-dns-698758b865-wmxc2\" (UID: \"3ede4964-25d2-4a68-bfb7-84a9cbf633c6\") " pod="openstack/dnsmasq-dns-698758b865-wmxc2" Oct 09 14:05:20 crc kubenswrapper[4902]: I1009 14:05:20.011643 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3ede4964-25d2-4a68-bfb7-84a9cbf633c6-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-wmxc2\" (UID: \"3ede4964-25d2-4a68-bfb7-84a9cbf633c6\") " pod="openstack/dnsmasq-dns-698758b865-wmxc2" Oct 09 14:05:20 crc kubenswrapper[4902]: I1009 14:05:20.011735 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3ede4964-25d2-4a68-bfb7-84a9cbf633c6-dns-svc\") pod \"dnsmasq-dns-698758b865-wmxc2\" (UID: \"3ede4964-25d2-4a68-bfb7-84a9cbf633c6\") " pod="openstack/dnsmasq-dns-698758b865-wmxc2" Oct 09 14:05:20 crc kubenswrapper[4902]: I1009 14:05:20.011786 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cxmk7\" (UniqueName: \"kubernetes.io/projected/3ede4964-25d2-4a68-bfb7-84a9cbf633c6-kube-api-access-cxmk7\") pod \"dnsmasq-dns-698758b865-wmxc2\" (UID: \"3ede4964-25d2-4a68-bfb7-84a9cbf633c6\") " pod="openstack/dnsmasq-dns-698758b865-wmxc2" Oct 09 14:05:20 crc kubenswrapper[4902]: I1009 14:05:20.011810 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3ede4964-25d2-4a68-bfb7-84a9cbf633c6-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-wmxc2\" (UID: \"3ede4964-25d2-4a68-bfb7-84a9cbf633c6\") " pod="openstack/dnsmasq-dns-698758b865-wmxc2" Oct 09 14:05:20 crc kubenswrapper[4902]: I1009 14:05:20.037774 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-wmxc2"] Oct 09 14:05:20 crc kubenswrapper[4902]: I1009 14:05:20.116291 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3ede4964-25d2-4a68-bfb7-84a9cbf633c6-dns-svc\") pod \"dnsmasq-dns-698758b865-wmxc2\" (UID: \"3ede4964-25d2-4a68-bfb7-84a9cbf633c6\") " pod="openstack/dnsmasq-dns-698758b865-wmxc2" Oct 09 14:05:20 crc kubenswrapper[4902]: I1009 14:05:20.116361 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cxmk7\" (UniqueName: \"kubernetes.io/projected/3ede4964-25d2-4a68-bfb7-84a9cbf633c6-kube-api-access-cxmk7\") pod \"dnsmasq-dns-698758b865-wmxc2\" (UID: \"3ede4964-25d2-4a68-bfb7-84a9cbf633c6\") " pod="openstack/dnsmasq-dns-698758b865-wmxc2" Oct 09 14:05:20 crc 
kubenswrapper[4902]: I1009 14:05:20.116384 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3ede4964-25d2-4a68-bfb7-84a9cbf633c6-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-wmxc2\" (UID: \"3ede4964-25d2-4a68-bfb7-84a9cbf633c6\") " pod="openstack/dnsmasq-dns-698758b865-wmxc2" Oct 09 14:05:20 crc kubenswrapper[4902]: I1009 14:05:20.116448 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3ede4964-25d2-4a68-bfb7-84a9cbf633c6-config\") pod \"dnsmasq-dns-698758b865-wmxc2\" (UID: \"3ede4964-25d2-4a68-bfb7-84a9cbf633c6\") " pod="openstack/dnsmasq-dns-698758b865-wmxc2" Oct 09 14:05:20 crc kubenswrapper[4902]: I1009 14:05:20.116489 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3ede4964-25d2-4a68-bfb7-84a9cbf633c6-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-wmxc2\" (UID: \"3ede4964-25d2-4a68-bfb7-84a9cbf633c6\") " pod="openstack/dnsmasq-dns-698758b865-wmxc2" Oct 09 14:05:20 crc kubenswrapper[4902]: I1009 14:05:20.117502 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3ede4964-25d2-4a68-bfb7-84a9cbf633c6-dns-svc\") pod \"dnsmasq-dns-698758b865-wmxc2\" (UID: \"3ede4964-25d2-4a68-bfb7-84a9cbf633c6\") " pod="openstack/dnsmasq-dns-698758b865-wmxc2" Oct 09 14:05:20 crc kubenswrapper[4902]: I1009 14:05:20.121036 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3ede4964-25d2-4a68-bfb7-84a9cbf633c6-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-wmxc2\" (UID: \"3ede4964-25d2-4a68-bfb7-84a9cbf633c6\") " pod="openstack/dnsmasq-dns-698758b865-wmxc2" Oct 09 14:05:20 crc kubenswrapper[4902]: I1009 14:05:20.121152 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3ede4964-25d2-4a68-bfb7-84a9cbf633c6-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-wmxc2\" (UID: \"3ede4964-25d2-4a68-bfb7-84a9cbf633c6\") " pod="openstack/dnsmasq-dns-698758b865-wmxc2" Oct 09 14:05:20 crc kubenswrapper[4902]: I1009 14:05:20.122030 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3ede4964-25d2-4a68-bfb7-84a9cbf633c6-config\") pod \"dnsmasq-dns-698758b865-wmxc2\" (UID: \"3ede4964-25d2-4a68-bfb7-84a9cbf633c6\") " pod="openstack/dnsmasq-dns-698758b865-wmxc2" Oct 09 14:05:20 crc kubenswrapper[4902]: I1009 14:05:20.159611 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cxmk7\" (UniqueName: \"kubernetes.io/projected/3ede4964-25d2-4a68-bfb7-84a9cbf633c6-kube-api-access-cxmk7\") pod \"dnsmasq-dns-698758b865-wmxc2\" (UID: \"3ede4964-25d2-4a68-bfb7-84a9cbf633c6\") " pod="openstack/dnsmasq-dns-698758b865-wmxc2" Oct 09 14:05:20 crc kubenswrapper[4902]: I1009 14:05:20.321286 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-wmxc2" Oct 09 14:05:20 crc kubenswrapper[4902]: I1009 14:05:20.776908 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-wmxc2"] Oct 09 14:05:21 crc kubenswrapper[4902]: I1009 14:05:21.209104 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-wmxc2" event={"ID":"3ede4964-25d2-4a68-bfb7-84a9cbf633c6","Type":"ContainerStarted","Data":"a516b8c9fee0f0eb5dcb0da9c294b3296a407efe061e72972121f5c0f8a74957"} Oct 09 14:05:21 crc kubenswrapper[4902]: I1009 14:05:21.240827 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Oct 09 14:05:21 crc kubenswrapper[4902]: I1009 14:05:21.245748 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Oct 09 14:05:21 crc kubenswrapper[4902]: I1009 14:05:21.247450 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-mncrj" Oct 09 14:05:21 crc kubenswrapper[4902]: I1009 14:05:21.247646 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Oct 09 14:05:21 crc kubenswrapper[4902]: I1009 14:05:21.247785 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Oct 09 14:05:21 crc kubenswrapper[4902]: I1009 14:05:21.248001 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Oct 09 14:05:21 crc kubenswrapper[4902]: I1009 14:05:21.282530 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Oct 09 14:05:21 crc kubenswrapper[4902]: I1009 14:05:21.337421 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/fa80b7ed-e420-455b-a918-d474c0453547-etc-swift\") pod \"swift-storage-0\" (UID: \"fa80b7ed-e420-455b-a918-d474c0453547\") " pod="openstack/swift-storage-0" Oct 09 14:05:21 crc kubenswrapper[4902]: I1009 14:05:21.337464 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"swift-storage-0\" (UID: \"fa80b7ed-e420-455b-a918-d474c0453547\") " pod="openstack/swift-storage-0" Oct 09 14:05:21 crc kubenswrapper[4902]: I1009 14:05:21.337511 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hx8qf\" (UniqueName: \"kubernetes.io/projected/fa80b7ed-e420-455b-a918-d474c0453547-kube-api-access-hx8qf\") pod \"swift-storage-0\" (UID: \"fa80b7ed-e420-455b-a918-d474c0453547\") " pod="openstack/swift-storage-0" Oct 09 14:05:21 crc kubenswrapper[4902]: I1009 14:05:21.337547 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/fa80b7ed-e420-455b-a918-d474c0453547-cache\") pod \"swift-storage-0\" (UID: \"fa80b7ed-e420-455b-a918-d474c0453547\") " pod="openstack/swift-storage-0" Oct 09 14:05:21 crc kubenswrapper[4902]: I1009 14:05:21.337605 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/fa80b7ed-e420-455b-a918-d474c0453547-lock\") pod \"swift-storage-0\" (UID: \"fa80b7ed-e420-455b-a918-d474c0453547\") " pod="openstack/swift-storage-0" Oct 09 14:05:21 crc 
kubenswrapper[4902]: I1009 14:05:21.439103 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/fa80b7ed-e420-455b-a918-d474c0453547-etc-swift\") pod \"swift-storage-0\" (UID: \"fa80b7ed-e420-455b-a918-d474c0453547\") " pod="openstack/swift-storage-0" Oct 09 14:05:21 crc kubenswrapper[4902]: I1009 14:05:21.439177 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"swift-storage-0\" (UID: \"fa80b7ed-e420-455b-a918-d474c0453547\") " pod="openstack/swift-storage-0" Oct 09 14:05:21 crc kubenswrapper[4902]: I1009 14:05:21.439246 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hx8qf\" (UniqueName: \"kubernetes.io/projected/fa80b7ed-e420-455b-a918-d474c0453547-kube-api-access-hx8qf\") pod \"swift-storage-0\" (UID: \"fa80b7ed-e420-455b-a918-d474c0453547\") " pod="openstack/swift-storage-0" Oct 09 14:05:21 crc kubenswrapper[4902]: I1009 14:05:21.439302 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/fa80b7ed-e420-455b-a918-d474c0453547-cache\") pod \"swift-storage-0\" (UID: \"fa80b7ed-e420-455b-a918-d474c0453547\") " pod="openstack/swift-storage-0" Oct 09 14:05:21 crc kubenswrapper[4902]: I1009 14:05:21.439347 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/fa80b7ed-e420-455b-a918-d474c0453547-lock\") pod \"swift-storage-0\" (UID: \"fa80b7ed-e420-455b-a918-d474c0453547\") " pod="openstack/swift-storage-0" Oct 09 14:05:21 crc kubenswrapper[4902]: E1009 14:05:21.439343 4902 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 09 14:05:21 crc kubenswrapper[4902]: E1009 14:05:21.439393 4902 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 09 14:05:21 crc kubenswrapper[4902]: E1009 14:05:21.439537 4902 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fa80b7ed-e420-455b-a918-d474c0453547-etc-swift podName:fa80b7ed-e420-455b-a918-d474c0453547 nodeName:}" failed. No retries permitted until 2025-10-09 14:05:21.939511103 +0000 UTC m=+869.137370237 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/fa80b7ed-e420-455b-a918-d474c0453547-etc-swift") pod "swift-storage-0" (UID: "fa80b7ed-e420-455b-a918-d474c0453547") : configmap "swift-ring-files" not found Oct 09 14:05:21 crc kubenswrapper[4902]: I1009 14:05:21.439968 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/fa80b7ed-e420-455b-a918-d474c0453547-lock\") pod \"swift-storage-0\" (UID: \"fa80b7ed-e420-455b-a918-d474c0453547\") " pod="openstack/swift-storage-0" Oct 09 14:05:21 crc kubenswrapper[4902]: I1009 14:05:21.440077 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/fa80b7ed-e420-455b-a918-d474c0453547-cache\") pod \"swift-storage-0\" (UID: \"fa80b7ed-e420-455b-a918-d474c0453547\") " pod="openstack/swift-storage-0" Oct 09 14:05:21 crc kubenswrapper[4902]: I1009 14:05:21.440123 4902 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"swift-storage-0\" (UID: \"fa80b7ed-e420-455b-a918-d474c0453547\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/swift-storage-0" Oct 09 14:05:21 crc kubenswrapper[4902]: I1009 14:05:21.457816 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hx8qf\" (UniqueName: \"kubernetes.io/projected/fa80b7ed-e420-455b-a918-d474c0453547-kube-api-access-hx8qf\") pod \"swift-storage-0\" (UID: \"fa80b7ed-e420-455b-a918-d474c0453547\") " pod="openstack/swift-storage-0" Oct 09 14:05:21 crc kubenswrapper[4902]: I1009 14:05:21.464012 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"swift-storage-0\" (UID: \"fa80b7ed-e420-455b-a918-d474c0453547\") " pod="openstack/swift-storage-0" Oct 09 14:05:21 crc kubenswrapper[4902]: I1009 14:05:21.735853 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-khn8g"] Oct 09 14:05:21 crc kubenswrapper[4902]: I1009 14:05:21.737111 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-khn8g" Oct 09 14:05:21 crc kubenswrapper[4902]: I1009 14:05:21.738833 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Oct 09 14:05:21 crc kubenswrapper[4902]: I1009 14:05:21.739175 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Oct 09 14:05:21 crc kubenswrapper[4902]: I1009 14:05:21.740296 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Oct 09 14:05:21 crc kubenswrapper[4902]: I1009 14:05:21.748578 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-khn8g"] Oct 09 14:05:21 crc kubenswrapper[4902]: I1009 14:05:21.844906 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/014a8355-9817-424e-ae75-b786043b2a4c-dispersionconf\") pod \"swift-ring-rebalance-khn8g\" (UID: \"014a8355-9817-424e-ae75-b786043b2a4c\") " pod="openstack/swift-ring-rebalance-khn8g" Oct 09 14:05:21 crc kubenswrapper[4902]: I1009 14:05:21.844974 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/014a8355-9817-424e-ae75-b786043b2a4c-combined-ca-bundle\") pod \"swift-ring-rebalance-khn8g\" (UID: \"014a8355-9817-424e-ae75-b786043b2a4c\") " pod="openstack/swift-ring-rebalance-khn8g" Oct 09 14:05:21 crc kubenswrapper[4902]: I1009 14:05:21.845055 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v8jr9\" (UniqueName: \"kubernetes.io/projected/014a8355-9817-424e-ae75-b786043b2a4c-kube-api-access-v8jr9\") pod \"swift-ring-rebalance-khn8g\" (UID: \"014a8355-9817-424e-ae75-b786043b2a4c\") " pod="openstack/swift-ring-rebalance-khn8g" Oct 09 14:05:21 crc kubenswrapper[4902]: I1009 14:05:21.845148 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/014a8355-9817-424e-ae75-b786043b2a4c-scripts\") pod \"swift-ring-rebalance-khn8g\" (UID: \"014a8355-9817-424e-ae75-b786043b2a4c\") " pod="openstack/swift-ring-rebalance-khn8g" Oct 09 14:05:21 crc kubenswrapper[4902]: I1009 14:05:21.845170 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/014a8355-9817-424e-ae75-b786043b2a4c-etc-swift\") pod \"swift-ring-rebalance-khn8g\" (UID: \"014a8355-9817-424e-ae75-b786043b2a4c\") " pod="openstack/swift-ring-rebalance-khn8g" Oct 09 14:05:21 crc kubenswrapper[4902]: I1009 14:05:21.845214 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/014a8355-9817-424e-ae75-b786043b2a4c-ring-data-devices\") pod \"swift-ring-rebalance-khn8g\" (UID: \"014a8355-9817-424e-ae75-b786043b2a4c\") " pod="openstack/swift-ring-rebalance-khn8g" Oct 09 14:05:21 crc kubenswrapper[4902]: I1009 14:05:21.845380 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/014a8355-9817-424e-ae75-b786043b2a4c-swiftconf\") pod \"swift-ring-rebalance-khn8g\" (UID: \"014a8355-9817-424e-ae75-b786043b2a4c\") " pod="openstack/swift-ring-rebalance-khn8g" Oct 09 
14:05:21 crc kubenswrapper[4902]: I1009 14:05:21.946420 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/014a8355-9817-424e-ae75-b786043b2a4c-ring-data-devices\") pod \"swift-ring-rebalance-khn8g\" (UID: \"014a8355-9817-424e-ae75-b786043b2a4c\") " pod="openstack/swift-ring-rebalance-khn8g" Oct 09 14:05:21 crc kubenswrapper[4902]: I1009 14:05:21.946687 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/fa80b7ed-e420-455b-a918-d474c0453547-etc-swift\") pod \"swift-storage-0\" (UID: \"fa80b7ed-e420-455b-a918-d474c0453547\") " pod="openstack/swift-storage-0" Oct 09 14:05:21 crc kubenswrapper[4902]: I1009 14:05:21.946820 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/014a8355-9817-424e-ae75-b786043b2a4c-swiftconf\") pod \"swift-ring-rebalance-khn8g\" (UID: \"014a8355-9817-424e-ae75-b786043b2a4c\") " pod="openstack/swift-ring-rebalance-khn8g" Oct 09 14:05:21 crc kubenswrapper[4902]: E1009 14:05:21.946890 4902 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 09 14:05:21 crc kubenswrapper[4902]: E1009 14:05:21.946937 4902 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 09 14:05:21 crc kubenswrapper[4902]: I1009 14:05:21.947029 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/014a8355-9817-424e-ae75-b786043b2a4c-dispersionconf\") pod \"swift-ring-rebalance-khn8g\" (UID: \"014a8355-9817-424e-ae75-b786043b2a4c\") " pod="openstack/swift-ring-rebalance-khn8g" Oct 09 14:05:21 crc kubenswrapper[4902]: E1009 14:05:21.947127 4902 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fa80b7ed-e420-455b-a918-d474c0453547-etc-swift podName:fa80b7ed-e420-455b-a918-d474c0453547 nodeName:}" failed. No retries permitted until 2025-10-09 14:05:22.947085439 +0000 UTC m=+870.144944503 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/fa80b7ed-e420-455b-a918-d474c0453547-etc-swift") pod "swift-storage-0" (UID: "fa80b7ed-e420-455b-a918-d474c0453547") : configmap "swift-ring-files" not found Oct 09 14:05:21 crc kubenswrapper[4902]: I1009 14:05:21.947198 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/014a8355-9817-424e-ae75-b786043b2a4c-combined-ca-bundle\") pod \"swift-ring-rebalance-khn8g\" (UID: \"014a8355-9817-424e-ae75-b786043b2a4c\") " pod="openstack/swift-ring-rebalance-khn8g" Oct 09 14:05:21 crc kubenswrapper[4902]: I1009 14:05:21.947319 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v8jr9\" (UniqueName: \"kubernetes.io/projected/014a8355-9817-424e-ae75-b786043b2a4c-kube-api-access-v8jr9\") pod \"swift-ring-rebalance-khn8g\" (UID: \"014a8355-9817-424e-ae75-b786043b2a4c\") " pod="openstack/swift-ring-rebalance-khn8g" Oct 09 14:05:21 crc kubenswrapper[4902]: I1009 14:05:21.947504 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/014a8355-9817-424e-ae75-b786043b2a4c-scripts\") pod \"swift-ring-rebalance-khn8g\" (UID: \"014a8355-9817-424e-ae75-b786043b2a4c\") " pod="openstack/swift-ring-rebalance-khn8g" Oct 09 14:05:21 crc kubenswrapper[4902]: I1009 14:05:21.947529 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/014a8355-9817-424e-ae75-b786043b2a4c-etc-swift\") pod \"swift-ring-rebalance-khn8g\" (UID: \"014a8355-9817-424e-ae75-b786043b2a4c\") " pod="openstack/swift-ring-rebalance-khn8g" Oct 09 14:05:21 crc kubenswrapper[4902]: I1009 14:05:21.948115 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/014a8355-9817-424e-ae75-b786043b2a4c-etc-swift\") pod \"swift-ring-rebalance-khn8g\" (UID: \"014a8355-9817-424e-ae75-b786043b2a4c\") " pod="openstack/swift-ring-rebalance-khn8g" Oct 09 14:05:21 crc kubenswrapper[4902]: I1009 14:05:21.948152 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/014a8355-9817-424e-ae75-b786043b2a4c-ring-data-devices\") pod \"swift-ring-rebalance-khn8g\" (UID: \"014a8355-9817-424e-ae75-b786043b2a4c\") " pod="openstack/swift-ring-rebalance-khn8g" Oct 09 14:05:21 crc kubenswrapper[4902]: I1009 14:05:21.948521 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/014a8355-9817-424e-ae75-b786043b2a4c-scripts\") pod \"swift-ring-rebalance-khn8g\" (UID: \"014a8355-9817-424e-ae75-b786043b2a4c\") " pod="openstack/swift-ring-rebalance-khn8g" Oct 09 14:05:21 crc kubenswrapper[4902]: I1009 14:05:21.951687 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/014a8355-9817-424e-ae75-b786043b2a4c-dispersionconf\") pod \"swift-ring-rebalance-khn8g\" (UID: \"014a8355-9817-424e-ae75-b786043b2a4c\") " pod="openstack/swift-ring-rebalance-khn8g" Oct 09 14:05:21 crc kubenswrapper[4902]: I1009 14:05:21.951913 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/014a8355-9817-424e-ae75-b786043b2a4c-swiftconf\") pod \"swift-ring-rebalance-khn8g\" (UID: 
\"014a8355-9817-424e-ae75-b786043b2a4c\") " pod="openstack/swift-ring-rebalance-khn8g" Oct 09 14:05:21 crc kubenswrapper[4902]: I1009 14:05:21.955597 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/014a8355-9817-424e-ae75-b786043b2a4c-combined-ca-bundle\") pod \"swift-ring-rebalance-khn8g\" (UID: \"014a8355-9817-424e-ae75-b786043b2a4c\") " pod="openstack/swift-ring-rebalance-khn8g" Oct 09 14:05:21 crc kubenswrapper[4902]: I1009 14:05:21.965519 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v8jr9\" (UniqueName: \"kubernetes.io/projected/014a8355-9817-424e-ae75-b786043b2a4c-kube-api-access-v8jr9\") pod \"swift-ring-rebalance-khn8g\" (UID: \"014a8355-9817-424e-ae75-b786043b2a4c\") " pod="openstack/swift-ring-rebalance-khn8g" Oct 09 14:05:22 crc kubenswrapper[4902]: I1009 14:05:22.056364 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-khn8g" Oct 09 14:05:22 crc kubenswrapper[4902]: I1009 14:05:22.217152 4902 generic.go:334] "Generic (PLEG): container finished" podID="3ede4964-25d2-4a68-bfb7-84a9cbf633c6" containerID="0c755799b9c3517eeee06229684e9846911aa4df66cae67bfef29f4501546772" exitCode=0 Oct 09 14:05:22 crc kubenswrapper[4902]: I1009 14:05:22.217202 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-wmxc2" event={"ID":"3ede4964-25d2-4a68-bfb7-84a9cbf633c6","Type":"ContainerDied","Data":"0c755799b9c3517eeee06229684e9846911aa4df66cae67bfef29f4501546772"} Oct 09 14:05:22 crc kubenswrapper[4902]: I1009 14:05:22.310883 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-khn8g"] Oct 09 14:05:22 crc kubenswrapper[4902]: I1009 14:05:22.977834 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/fa80b7ed-e420-455b-a918-d474c0453547-etc-swift\") pod \"swift-storage-0\" (UID: \"fa80b7ed-e420-455b-a918-d474c0453547\") " pod="openstack/swift-storage-0" Oct 09 14:05:22 crc kubenswrapper[4902]: E1009 14:05:22.978031 4902 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 09 14:05:22 crc kubenswrapper[4902]: E1009 14:05:22.978054 4902 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 09 14:05:22 crc kubenswrapper[4902]: E1009 14:05:22.978109 4902 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fa80b7ed-e420-455b-a918-d474c0453547-etc-swift podName:fa80b7ed-e420-455b-a918-d474c0453547 nodeName:}" failed. No retries permitted until 2025-10-09 14:05:24.978094518 +0000 UTC m=+872.175953582 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/fa80b7ed-e420-455b-a918-d474c0453547-etc-swift") pod "swift-storage-0" (UID: "fa80b7ed-e420-455b-a918-d474c0453547") : configmap "swift-ring-files" not found Oct 09 14:05:23 crc kubenswrapper[4902]: I1009 14:05:23.010159 4902 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Oct 09 14:05:23 crc kubenswrapper[4902]: I1009 14:05:23.062114 4902 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-cell1-galera-0" podUID="a4dec46d-073a-484c-ba80-0ff939025e48" containerName="galera" probeResult="failure" output=< Oct 09 14:05:23 crc kubenswrapper[4902]: wsrep_local_state_comment (Joined) differs from Synced Oct 09 14:05:23 crc kubenswrapper[4902]: > Oct 09 14:05:23 crc kubenswrapper[4902]: I1009 14:05:23.226646 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-wmxc2" event={"ID":"3ede4964-25d2-4a68-bfb7-84a9cbf633c6","Type":"ContainerStarted","Data":"daf0bc33851e3c2bf26b5abf9ac72b5d98000f91eb1b31bd7e64127a8682505f"} Oct 09 14:05:23 crc kubenswrapper[4902]: I1009 14:05:23.227017 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-698758b865-wmxc2" Oct 09 14:05:23 crc kubenswrapper[4902]: I1009 14:05:23.228038 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-khn8g" event={"ID":"014a8355-9817-424e-ae75-b786043b2a4c","Type":"ContainerStarted","Data":"387c81eb98f365803f5f3f301211c670f510118ef020384b9e4a4ec9d3732036"} Oct 09 14:05:23 crc kubenswrapper[4902]: I1009 14:05:23.250052 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-698758b865-wmxc2" podStartSLOduration=4.25003052 podStartE2EDuration="4.25003052s" podCreationTimestamp="2025-10-09 14:05:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 14:05:23.245231941 +0000 UTC m=+870.443091015" watchObservedRunningTime="2025-10-09 14:05:23.25003052 +0000 UTC m=+870.447889584" Oct 09 14:05:23 crc kubenswrapper[4902]: I1009 14:05:23.830385 4902 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-57d769cc4f-vtncl" podUID="ad8edf23-ca42-4f81-a01a-4bd897f23934" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.100:5353: connect: connection refused" Oct 09 14:05:23 crc kubenswrapper[4902]: I1009 14:05:23.830923 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57d769cc4f-vtncl" Oct 09 14:05:25 crc kubenswrapper[4902]: I1009 14:05:25.012109 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/fa80b7ed-e420-455b-a918-d474c0453547-etc-swift\") pod \"swift-storage-0\" (UID: \"fa80b7ed-e420-455b-a918-d474c0453547\") " pod="openstack/swift-storage-0" Oct 09 14:05:25 crc kubenswrapper[4902]: E1009 14:05:25.012333 4902 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 09 14:05:25 crc kubenswrapper[4902]: E1009 14:05:25.012385 4902 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 09 14:05:25 crc kubenswrapper[4902]: E1009 14:05:25.012472 4902 nestedpendingoperations.go:348] Operation 
for "{volumeName:kubernetes.io/projected/fa80b7ed-e420-455b-a918-d474c0453547-etc-swift podName:fa80b7ed-e420-455b-a918-d474c0453547 nodeName:}" failed. No retries permitted until 2025-10-09 14:05:29.012451029 +0000 UTC m=+876.210310093 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/fa80b7ed-e420-455b-a918-d474c0453547-etc-swift") pod "swift-storage-0" (UID: "fa80b7ed-e420-455b-a918-d474c0453547") : configmap "swift-ring-files" not found Oct 09 14:05:25 crc kubenswrapper[4902]: E1009 14:05:25.163598 4902 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = writing blob: storing blob to file \"/var/tmp/container_images_storage3889733373/1\": happened during read: context canceled" image="quay.io/podified-antelope-centos9/openstack-ovn-northd:current-podified" Oct 09 14:05:25 crc kubenswrapper[4902]: E1009 14:05:25.163889 4902 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ovn-northd,Image:quay.io/podified-antelope-centos9/openstack-ovn-northd:current-podified,Command:[/usr/bin/ovn-northd],Args:[-vfile:off -vconsole:info --n-threads=1 --ovnnb-db=ssl:ovsdbserver-nb-0.openstack.svc.cluster.local:6641 --ovnsb-db=ssl:ovsdbserver-sb-0.openstack.svc.cluster.local:6642 --certificate=/etc/pki/tls/certs/ovndb.crt --private-key=/etc/pki/tls/private/ovndb.key --ca-cert=/etc/pki/tls/certs/ovndbca.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:OVN_RUNDIR,Value:/tmp,ValueFrom:nil,},EnvVar{Name:certs,Value:n6bh558h55h649hc8h6h646h5d9h589h678h6h5d9h688h5d7h56hd6h5bdh659h574h64fh5d6h665h5dh55fh9ch5d4h57fhbch98h5b5h5fh684q,ValueFrom:nil,},EnvVar{Name:ovnnorthd-config,Value:n5c8h7ch56bh8dh8hc4h5dch9dh68h6bhb7h598h549h5dbh66fh6bh5b4h5cch5d6h55ch57fhfch588h89h5ddh5d6h65bh65bh8dhc4h67dh569q,ValueFrom:nil,},EnvVar{Name:ovnnorthd-scripts,Value:n664hd8h66ch58dh64hc9h66bhd4h558h697h67bh557hdch664h567h669h555h696h556h556h5fh5bh569hbh665h9dh4h9bh564hc8h5b7h5c4q,ValueFrom:nil,},EnvVar{Name:tls-ca-bundle.pem,Value:n7fh74h598h5d7h64ch557h5ffh5c6h5f7hb4h6dh7bh7dh685h694h599h5cdh5b4h667h567h586h57bh598h85h666h684h68fh5c9h544h5dh669h65bq,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovn-rundir,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovn-northd-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/certs/ovndb.crt,SubPath:tls.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovn-northd-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/private/ovndb.key,SubPath:tls.key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovn-northd-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/certs/ovndbca.crt,SubPath:ca.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-t4hj9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Com
mand:[/usr/local/bin/container-scripts/status_check.sh],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:1,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/local/bin/container-scripts/status_check.sh],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:1,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-northd-0_openstack(22ef2931-973d-462a-ae3a-d05056c72468): ErrImagePull: rpc error: code = Canceled desc = writing blob: storing blob to file \"/var/tmp/container_images_storage3889733373/1\": happened during read: context canceled" logger="UnhandledError" Oct 09 14:05:25 crc kubenswrapper[4902]: I1009 14:05:25.243689 4902 generic.go:334] "Generic (PLEG): container finished" podID="f645f372-beaf-4a39-9957-18b18a365706" containerID="a2d4f01be9e4c3ed645bae50e06975c168a423d91a47ba1579d9d7df8ee5a594" exitCode=0 Oct 09 14:05:25 crc kubenswrapper[4902]: I1009 14:05:25.243831 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-rtbbp" event={"ID":"f645f372-beaf-4a39-9957-18b18a365706","Type":"ContainerDied","Data":"a2d4f01be9e4c3ed645bae50e06975c168a423d91a47ba1579d9d7df8ee5a594"} Oct 09 14:05:25 crc kubenswrapper[4902]: I1009 14:05:25.246034 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-pm8hp" event={"ID":"aa516b5f-aa20-4f8a-bf1c-3bba6bb72ffb","Type":"ContainerStarted","Data":"3e17309a3603c5eeaca85f06d4098d827d72df9c789de3489dfd5ededadefb43"} Oct 09 14:05:25 crc kubenswrapper[4902]: I1009 14:05:25.304586 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-pm8hp" podStartSLOduration=12.304563925 podStartE2EDuration="12.304563925s" podCreationTimestamp="2025-10-09 14:05:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 14:05:25.297031592 +0000 UTC m=+872.494890656" watchObservedRunningTime="2025-10-09 14:05:25.304563925 +0000 UTC m=+872.502422989" Oct 09 14:05:25 crc kubenswrapper[4902]: E1009 14:05:25.515531 4902 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovn-northd\" with ErrImagePull: \"rpc error: code = Canceled desc = writing blob: storing blob to file \\\"/var/tmp/container_images_storage3889733373/1\\\": happened during read: context canceled\"" pod="openstack/ovn-northd-0" podUID="22ef2931-973d-462a-ae3a-d05056c72468" Oct 09 14:05:25 crc kubenswrapper[4902]: I1009 14:05:25.582044 4902 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-vtncl" Oct 09 14:05:25 crc kubenswrapper[4902]: I1009 14:05:25.593005 4902 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Oct 09 14:05:25 crc kubenswrapper[4902]: I1009 14:05:25.645853 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Oct 09 14:05:25 crc kubenswrapper[4902]: I1009 14:05:25.656376 4902 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-rtbbp" Oct 09 14:05:25 crc kubenswrapper[4902]: I1009 14:05:25.725039 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z96fc\" (UniqueName: \"kubernetes.io/projected/ad8edf23-ca42-4f81-a01a-4bd897f23934-kube-api-access-z96fc\") pod \"ad8edf23-ca42-4f81-a01a-4bd897f23934\" (UID: \"ad8edf23-ca42-4f81-a01a-4bd897f23934\") " Oct 09 14:05:25 crc kubenswrapper[4902]: I1009 14:05:25.725239 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ad8edf23-ca42-4f81-a01a-4bd897f23934-config\") pod \"ad8edf23-ca42-4f81-a01a-4bd897f23934\" (UID: \"ad8edf23-ca42-4f81-a01a-4bd897f23934\") " Oct 09 14:05:25 crc kubenswrapper[4902]: I1009 14:05:25.725290 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ad8edf23-ca42-4f81-a01a-4bd897f23934-dns-svc\") pod \"ad8edf23-ca42-4f81-a01a-4bd897f23934\" (UID: \"ad8edf23-ca42-4f81-a01a-4bd897f23934\") " Oct 09 14:05:25 crc kubenswrapper[4902]: I1009 14:05:25.731061 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad8edf23-ca42-4f81-a01a-4bd897f23934-kube-api-access-z96fc" (OuterVolumeSpecName: "kube-api-access-z96fc") pod "ad8edf23-ca42-4f81-a01a-4bd897f23934" (UID: "ad8edf23-ca42-4f81-a01a-4bd897f23934"). InnerVolumeSpecName "kube-api-access-z96fc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 14:05:25 crc kubenswrapper[4902]: I1009 14:05:25.767266 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ad8edf23-ca42-4f81-a01a-4bd897f23934-config" (OuterVolumeSpecName: "config") pod "ad8edf23-ca42-4f81-a01a-4bd897f23934" (UID: "ad8edf23-ca42-4f81-a01a-4bd897f23934"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 14:05:25 crc kubenswrapper[4902]: I1009 14:05:25.770827 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ad8edf23-ca42-4f81-a01a-4bd897f23934-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ad8edf23-ca42-4f81-a01a-4bd897f23934" (UID: "ad8edf23-ca42-4f81-a01a-4bd897f23934"). InnerVolumeSpecName "dns-svc". 
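The pull failure above leaves ovn-northd-0 with its container stuck in a waiting state: ErrImagePull from the canceled blob write, then ImagePullBackOff on the following sync attempts. A hypothetical client-go sketch (assuming a kubeconfig at the default location and access to the cluster) that reads the waiting reason straight off the pod status; the same detail is what oc describe pod ovn-northd-0 -n openstack surfaces under the container state:

package main

import (
	"context"
	"fmt"

	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	// Load the default kubeconfig and print why each container of ovn-northd-0 is still waiting.
	cfg, err := clientcmd.BuildConfigFromFlags("", clientcmd.RecommendedHomeFile)
	if err != nil {
		panic(err)
	}
	cs, err := kubernetes.NewForConfig(cfg)
	if err != nil {
		panic(err)
	}
	pod, err := cs.CoreV1().Pods("openstack").Get(context.TODO(), "ovn-northd-0", metav1.GetOptions{})
	if err != nil {
		panic(err)
	}
	for _, st := range pod.Status.ContainerStatuses {
		if st.State.Waiting != nil {
			fmt.Printf("%s: %s (%s)\n", st.Name, st.State.Waiting.Reason, st.State.Waiting.Message)
		}
	}
}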
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 14:05:25 crc kubenswrapper[4902]: I1009 14:05:25.829170 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f645f372-beaf-4a39-9957-18b18a365706-dns-svc\") pod \"f645f372-beaf-4a39-9957-18b18a365706\" (UID: \"f645f372-beaf-4a39-9957-18b18a365706\") " Oct 09 14:05:25 crc kubenswrapper[4902]: I1009 14:05:25.829248 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f645f372-beaf-4a39-9957-18b18a365706-ovsdbserver-nb\") pod \"f645f372-beaf-4a39-9957-18b18a365706\" (UID: \"f645f372-beaf-4a39-9957-18b18a365706\") " Oct 09 14:05:25 crc kubenswrapper[4902]: I1009 14:05:25.829347 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f645f372-beaf-4a39-9957-18b18a365706-ovsdbserver-sb\") pod \"f645f372-beaf-4a39-9957-18b18a365706\" (UID: \"f645f372-beaf-4a39-9957-18b18a365706\") " Oct 09 14:05:25 crc kubenswrapper[4902]: I1009 14:05:25.829513 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wrvgd\" (UniqueName: \"kubernetes.io/projected/f645f372-beaf-4a39-9957-18b18a365706-kube-api-access-wrvgd\") pod \"f645f372-beaf-4a39-9957-18b18a365706\" (UID: \"f645f372-beaf-4a39-9957-18b18a365706\") " Oct 09 14:05:25 crc kubenswrapper[4902]: I1009 14:05:25.829563 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f645f372-beaf-4a39-9957-18b18a365706-config\") pod \"f645f372-beaf-4a39-9957-18b18a365706\" (UID: \"f645f372-beaf-4a39-9957-18b18a365706\") " Oct 09 14:05:25 crc kubenswrapper[4902]: I1009 14:05:25.830023 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z96fc\" (UniqueName: \"kubernetes.io/projected/ad8edf23-ca42-4f81-a01a-4bd897f23934-kube-api-access-z96fc\") on node \"crc\" DevicePath \"\"" Oct 09 14:05:25 crc kubenswrapper[4902]: I1009 14:05:25.830040 4902 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ad8edf23-ca42-4f81-a01a-4bd897f23934-config\") on node \"crc\" DevicePath \"\"" Oct 09 14:05:25 crc kubenswrapper[4902]: I1009 14:05:25.830052 4902 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ad8edf23-ca42-4f81-a01a-4bd897f23934-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 09 14:05:25 crc kubenswrapper[4902]: I1009 14:05:25.833107 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f645f372-beaf-4a39-9957-18b18a365706-kube-api-access-wrvgd" (OuterVolumeSpecName: "kube-api-access-wrvgd") pod "f645f372-beaf-4a39-9957-18b18a365706" (UID: "f645f372-beaf-4a39-9957-18b18a365706"). InnerVolumeSpecName "kube-api-access-wrvgd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 14:05:25 crc kubenswrapper[4902]: I1009 14:05:25.849256 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f645f372-beaf-4a39-9957-18b18a365706-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f645f372-beaf-4a39-9957-18b18a365706" (UID: "f645f372-beaf-4a39-9957-18b18a365706"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 14:05:25 crc kubenswrapper[4902]: I1009 14:05:25.850777 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f645f372-beaf-4a39-9957-18b18a365706-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f645f372-beaf-4a39-9957-18b18a365706" (UID: "f645f372-beaf-4a39-9957-18b18a365706"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 14:05:25 crc kubenswrapper[4902]: I1009 14:05:25.857329 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f645f372-beaf-4a39-9957-18b18a365706-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f645f372-beaf-4a39-9957-18b18a365706" (UID: "f645f372-beaf-4a39-9957-18b18a365706"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 14:05:25 crc kubenswrapper[4902]: I1009 14:05:25.858692 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f645f372-beaf-4a39-9957-18b18a365706-config" (OuterVolumeSpecName: "config") pod "f645f372-beaf-4a39-9957-18b18a365706" (UID: "f645f372-beaf-4a39-9957-18b18a365706"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 14:05:25 crc kubenswrapper[4902]: I1009 14:05:25.932348 4902 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f645f372-beaf-4a39-9957-18b18a365706-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 09 14:05:25 crc kubenswrapper[4902]: I1009 14:05:25.932402 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wrvgd\" (UniqueName: \"kubernetes.io/projected/f645f372-beaf-4a39-9957-18b18a365706-kube-api-access-wrvgd\") on node \"crc\" DevicePath \"\"" Oct 09 14:05:25 crc kubenswrapper[4902]: I1009 14:05:25.932430 4902 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f645f372-beaf-4a39-9957-18b18a365706-config\") on node \"crc\" DevicePath \"\"" Oct 09 14:05:25 crc kubenswrapper[4902]: I1009 14:05:25.932440 4902 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f645f372-beaf-4a39-9957-18b18a365706-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 09 14:05:25 crc kubenswrapper[4902]: I1009 14:05:25.932448 4902 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f645f372-beaf-4a39-9957-18b18a365706-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 09 14:05:26 crc kubenswrapper[4902]: I1009 14:05:26.254961 4902 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-vtncl" Oct 09 14:05:26 crc kubenswrapper[4902]: I1009 14:05:26.254957 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-vtncl" event={"ID":"ad8edf23-ca42-4f81-a01a-4bd897f23934","Type":"ContainerDied","Data":"1da187a1d8a4506dcb0ea8da2856bd47aab43e6a02dc078aa036caf218aeec71"} Oct 09 14:05:26 crc kubenswrapper[4902]: I1009 14:05:26.255094 4902 scope.go:117] "RemoveContainer" containerID="49c1a49e4283c9bb8aaf2840f43fc5ee05b89cf0d30d9ecae8b0ba2e9b75880c" Oct 09 14:05:26 crc kubenswrapper[4902]: I1009 14:05:26.257774 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"22ef2931-973d-462a-ae3a-d05056c72468","Type":"ContainerStarted","Data":"2097da024f6a402a085e9e8f6c875be8501652d092c74fa4b5bea8585aacb43d"} Oct 09 14:05:26 crc kubenswrapper[4902]: E1009 14:05:26.259039 4902 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovn-northd\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-ovn-northd:current-podified\\\"\"" pod="openstack/ovn-northd-0" podUID="22ef2931-973d-462a-ae3a-d05056c72468" Oct 09 14:05:26 crc kubenswrapper[4902]: I1009 14:05:26.262040 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-rtbbp" event={"ID":"f645f372-beaf-4a39-9957-18b18a365706","Type":"ContainerDied","Data":"4504df46e88dcb3adf28cc5f0092091947c0cee2b23cbdf3666350c72b0945bc"} Oct 09 14:05:26 crc kubenswrapper[4902]: I1009 14:05:26.262113 4902 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-rtbbp" Oct 09 14:05:26 crc kubenswrapper[4902]: I1009 14:05:26.327722 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-vtncl"] Oct 09 14:05:26 crc kubenswrapper[4902]: I1009 14:05:26.334707 4902 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-vtncl"] Oct 09 14:05:26 crc kubenswrapper[4902]: I1009 14:05:26.351870 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-rtbbp"] Oct 09 14:05:26 crc kubenswrapper[4902]: I1009 14:05:26.358901 4902 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-rtbbp"] Oct 09 14:05:26 crc kubenswrapper[4902]: I1009 14:05:26.981512 4902 scope.go:117] "RemoveContainer" containerID="b716cb8932cc43dace3456c1b09e41e873eddb43657fdc87e66f8e80486f5d1f" Oct 09 14:05:27 crc kubenswrapper[4902]: E1009 14:05:27.273794 4902 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovn-northd\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-ovn-northd:current-podified\\\"\"" pod="openstack/ovn-northd-0" podUID="22ef2931-973d-462a-ae3a-d05056c72468" Oct 09 14:05:27 crc kubenswrapper[4902]: I1009 14:05:27.523660 4902 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ad8edf23-ca42-4f81-a01a-4bd897f23934" path="/var/lib/kubelet/pods/ad8edf23-ca42-4f81-a01a-4bd897f23934/volumes" Oct 09 14:05:27 crc kubenswrapper[4902]: I1009 14:05:27.524509 4902 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f645f372-beaf-4a39-9957-18b18a365706" path="/var/lib/kubelet/pods/f645f372-beaf-4a39-9957-18b18a365706/volumes" Oct 09 14:05:27 crc kubenswrapper[4902]: I1009 14:05:27.605514 4902 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Oct 09 14:05:27 crc kubenswrapper[4902]: I1009 14:05:27.786558 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-sg4gd"] Oct 09 14:05:27 crc kubenswrapper[4902]: E1009 14:05:27.786974 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad8edf23-ca42-4f81-a01a-4bd897f23934" containerName="init" Oct 09 14:05:27 crc kubenswrapper[4902]: I1009 14:05:27.786992 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad8edf23-ca42-4f81-a01a-4bd897f23934" containerName="init" Oct 09 14:05:27 crc kubenswrapper[4902]: E1009 14:05:27.787024 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad8edf23-ca42-4f81-a01a-4bd897f23934" containerName="dnsmasq-dns" Oct 09 14:05:27 crc kubenswrapper[4902]: I1009 14:05:27.787034 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad8edf23-ca42-4f81-a01a-4bd897f23934" containerName="dnsmasq-dns" Oct 09 14:05:27 crc kubenswrapper[4902]: E1009 14:05:27.787047 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f645f372-beaf-4a39-9957-18b18a365706" containerName="init" Oct 09 14:05:27 crc kubenswrapper[4902]: I1009 14:05:27.787054 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="f645f372-beaf-4a39-9957-18b18a365706" containerName="init" Oct 09 14:05:27 crc kubenswrapper[4902]: I1009 14:05:27.787244 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="f645f372-beaf-4a39-9957-18b18a365706" containerName="init" Oct 09 14:05:27 crc kubenswrapper[4902]: I1009 14:05:27.787272 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad8edf23-ca42-4f81-a01a-4bd897f23934" containerName="dnsmasq-dns" Oct 09 14:05:27 crc kubenswrapper[4902]: I1009 14:05:27.787969 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-sg4gd" Oct 09 14:05:27 crc kubenswrapper[4902]: I1009 14:05:27.801937 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-sg4gd"] Oct 09 14:05:27 crc kubenswrapper[4902]: I1009 14:05:27.979167 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ptt94\" (UniqueName: \"kubernetes.io/projected/53238bce-0e1a-4c67-b5eb-ecc5387d41cf-kube-api-access-ptt94\") pod \"keystone-db-create-sg4gd\" (UID: \"53238bce-0e1a-4c67-b5eb-ecc5387d41cf\") " pod="openstack/keystone-db-create-sg4gd" Oct 09 14:05:27 crc kubenswrapper[4902]: I1009 14:05:27.981257 4902 scope.go:117] "RemoveContainer" containerID="a2d4f01be9e4c3ed645bae50e06975c168a423d91a47ba1579d9d7df8ee5a594" Oct 09 14:05:27 crc kubenswrapper[4902]: I1009 14:05:27.987893 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-kw64n"] Oct 09 14:05:27 crc kubenswrapper[4902]: I1009 14:05:27.989160 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-kw64n" Oct 09 14:05:28 crc kubenswrapper[4902]: I1009 14:05:27.999254 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-kw64n"] Oct 09 14:05:28 crc kubenswrapper[4902]: I1009 14:05:28.080777 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ptt94\" (UniqueName: \"kubernetes.io/projected/53238bce-0e1a-4c67-b5eb-ecc5387d41cf-kube-api-access-ptt94\") pod \"keystone-db-create-sg4gd\" (UID: \"53238bce-0e1a-4c67-b5eb-ecc5387d41cf\") " pod="openstack/keystone-db-create-sg4gd" Oct 09 14:05:28 crc kubenswrapper[4902]: I1009 14:05:28.102323 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ptt94\" (UniqueName: \"kubernetes.io/projected/53238bce-0e1a-4c67-b5eb-ecc5387d41cf-kube-api-access-ptt94\") pod \"keystone-db-create-sg4gd\" (UID: \"53238bce-0e1a-4c67-b5eb-ecc5387d41cf\") " pod="openstack/keystone-db-create-sg4gd" Oct 09 14:05:28 crc kubenswrapper[4902]: I1009 14:05:28.110378 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-sg4gd" Oct 09 14:05:28 crc kubenswrapper[4902]: I1009 14:05:28.182191 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2drsz\" (UniqueName: \"kubernetes.io/projected/e9ffdab1-dd2a-40c5-b12c-e5018656324a-kube-api-access-2drsz\") pod \"placement-db-create-kw64n\" (UID: \"e9ffdab1-dd2a-40c5-b12c-e5018656324a\") " pod="openstack/placement-db-create-kw64n" Oct 09 14:05:28 crc kubenswrapper[4902]: I1009 14:05:28.286337 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2drsz\" (UniqueName: \"kubernetes.io/projected/e9ffdab1-dd2a-40c5-b12c-e5018656324a-kube-api-access-2drsz\") pod \"placement-db-create-kw64n\" (UID: \"e9ffdab1-dd2a-40c5-b12c-e5018656324a\") " pod="openstack/placement-db-create-kw64n" Oct 09 14:05:28 crc kubenswrapper[4902]: I1009 14:05:28.291062 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-khn8g" event={"ID":"014a8355-9817-424e-ae75-b786043b2a4c","Type":"ContainerStarted","Data":"3d22d59cf13256e7c9fd4b4582de39b42050e52d6f3afd36d727ae5d2c2127de"} Oct 09 14:05:28 crc kubenswrapper[4902]: I1009 14:05:28.310179 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2drsz\" (UniqueName: \"kubernetes.io/projected/e9ffdab1-dd2a-40c5-b12c-e5018656324a-kube-api-access-2drsz\") pod \"placement-db-create-kw64n\" (UID: \"e9ffdab1-dd2a-40c5-b12c-e5018656324a\") " pod="openstack/placement-db-create-kw64n" Oct 09 14:05:28 crc kubenswrapper[4902]: I1009 14:05:28.468877 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-kw64n" Oct 09 14:05:28 crc kubenswrapper[4902]: I1009 14:05:28.583724 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-khn8g" podStartSLOduration=1.8946918780000002 podStartE2EDuration="7.583677417s" podCreationTimestamp="2025-10-09 14:05:21 +0000 UTC" firstStartedPulling="2025-10-09 14:05:22.33137781 +0000 UTC m=+869.529236874" lastFinishedPulling="2025-10-09 14:05:28.020363339 +0000 UTC m=+875.218222413" observedRunningTime="2025-10-09 14:05:28.341740314 +0000 UTC m=+875.539599378" watchObservedRunningTime="2025-10-09 14:05:28.583677417 +0000 UTC m=+875.781536481" Oct 09 14:05:28 crc kubenswrapper[4902]: I1009 14:05:28.584551 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-sg4gd"] Oct 09 14:05:28 crc kubenswrapper[4902]: W1009 14:05:28.591055 4902 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod53238bce_0e1a_4c67_b5eb_ecc5387d41cf.slice/crio-4fdf213ad895c02922183918d27c3e9021d495a4b1c29048cf2c411bfab578e4 WatchSource:0}: Error finding container 4fdf213ad895c02922183918d27c3e9021d495a4b1c29048cf2c411bfab578e4: Status 404 returned error can't find the container with id 4fdf213ad895c02922183918d27c3e9021d495a4b1c29048cf2c411bfab578e4 Oct 09 14:05:28 crc kubenswrapper[4902]: I1009 14:05:28.881039 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-kw64n"] Oct 09 14:05:28 crc kubenswrapper[4902]: W1009 14:05:28.882180 4902 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode9ffdab1_dd2a_40c5_b12c_e5018656324a.slice/crio-2dae98bdad6cadaf4e3c81c983565cee1cfb59130c39fb386d2557b8aa44034b WatchSource:0}: Error finding container 2dae98bdad6cadaf4e3c81c983565cee1cfb59130c39fb386d2557b8aa44034b: Status 404 returned error can't find the container with id 2dae98bdad6cadaf4e3c81c983565cee1cfb59130c39fb386d2557b8aa44034b Oct 09 14:05:29 crc kubenswrapper[4902]: I1009 14:05:29.105074 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/fa80b7ed-e420-455b-a918-d474c0453547-etc-swift\") pod \"swift-storage-0\" (UID: \"fa80b7ed-e420-455b-a918-d474c0453547\") " pod="openstack/swift-storage-0" Oct 09 14:05:29 crc kubenswrapper[4902]: E1009 14:05:29.105204 4902 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 09 14:05:29 crc kubenswrapper[4902]: E1009 14:05:29.105401 4902 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 09 14:05:29 crc kubenswrapper[4902]: E1009 14:05:29.105497 4902 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fa80b7ed-e420-455b-a918-d474c0453547-etc-swift podName:fa80b7ed-e420-455b-a918-d474c0453547 nodeName:}" failed. No retries permitted until 2025-10-09 14:05:37.105475486 +0000 UTC m=+884.303334550 (durationBeforeRetry 8s). 
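The startup-latency record above for swift-ring-rebalance-khn8g shows how the two durations relate: podStartE2EDuration is observedRunningTime minus podCreationTimestamp, while podStartSLOduration additionally discounts the image-pull window between firstStartedPulling and lastFinishedPulling; with the logged timestamps that is roughly 7.58s minus 5.69s, about 1.89s, matching the recorded value up to rounding. A small Go sketch of that arithmetic using durations read off the record (an illustration, not the tracker's code):

package main

import (
	"fmt"
	"time"
)

func main() {
	// Durations taken from the swift-ring-rebalance-khn8g record above.
	e2e := 7583677417 * time.Nanosecond  // observedRunningTime - podCreationTimestamp
	pull := 5688985529 * time.Nanosecond // lastFinishedPulling - firstStartedPulling
	fmt.Println(e2e - pull)              // ~1.894691888s, in line with the logged podStartSLOduration
}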
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/fa80b7ed-e420-455b-a918-d474c0453547-etc-swift") pod "swift-storage-0" (UID: "fa80b7ed-e420-455b-a918-d474c0453547") : configmap "swift-ring-files" not found Oct 09 14:05:29 crc kubenswrapper[4902]: I1009 14:05:29.313522 4902 generic.go:334] "Generic (PLEG): container finished" podID="e9ffdab1-dd2a-40c5-b12c-e5018656324a" containerID="d3d45dde663256231b9c482a7f7f17a7468937d327a40c16ae673923a4da10ec" exitCode=0 Oct 09 14:05:29 crc kubenswrapper[4902]: I1009 14:05:29.313703 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-kw64n" event={"ID":"e9ffdab1-dd2a-40c5-b12c-e5018656324a","Type":"ContainerDied","Data":"d3d45dde663256231b9c482a7f7f17a7468937d327a40c16ae673923a4da10ec"} Oct 09 14:05:29 crc kubenswrapper[4902]: I1009 14:05:29.314793 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-kw64n" event={"ID":"e9ffdab1-dd2a-40c5-b12c-e5018656324a","Type":"ContainerStarted","Data":"2dae98bdad6cadaf4e3c81c983565cee1cfb59130c39fb386d2557b8aa44034b"} Oct 09 14:05:29 crc kubenswrapper[4902]: I1009 14:05:29.317355 4902 generic.go:334] "Generic (PLEG): container finished" podID="53238bce-0e1a-4c67-b5eb-ecc5387d41cf" containerID="1377593959dcd4fb8c1f19f71b7c020997bdc7f2a2df909808f0076f1a0248f4" exitCode=0 Oct 09 14:05:29 crc kubenswrapper[4902]: I1009 14:05:29.317490 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-sg4gd" event={"ID":"53238bce-0e1a-4c67-b5eb-ecc5387d41cf","Type":"ContainerDied","Data":"1377593959dcd4fb8c1f19f71b7c020997bdc7f2a2df909808f0076f1a0248f4"} Oct 09 14:05:29 crc kubenswrapper[4902]: I1009 14:05:29.317520 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-sg4gd" event={"ID":"53238bce-0e1a-4c67-b5eb-ecc5387d41cf","Type":"ContainerStarted","Data":"4fdf213ad895c02922183918d27c3e9021d495a4b1c29048cf2c411bfab578e4"} Oct 09 14:05:30 crc kubenswrapper[4902]: I1009 14:05:30.323619 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-698758b865-wmxc2" Oct 09 14:05:30 crc kubenswrapper[4902]: I1009 14:05:30.386512 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-rgtzl"] Oct 09 14:05:30 crc kubenswrapper[4902]: I1009 14:05:30.387569 4902 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5ccc8479f9-rgtzl" podUID="75f07b5d-6f2c-473f-ae50-725bcdf084af" containerName="dnsmasq-dns" containerID="cri-o://a813e77d238b372ae7d81afb177521ae8af473a236e6912c56737c7d3e16d74c" gracePeriod=10 Oct 09 14:05:30 crc kubenswrapper[4902]: I1009 14:05:30.691587 4902 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-sg4gd" Oct 09 14:05:30 crc kubenswrapper[4902]: I1009 14:05:30.801383 4902 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-kw64n" Oct 09 14:05:30 crc kubenswrapper[4902]: I1009 14:05:30.835190 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ptt94\" (UniqueName: \"kubernetes.io/projected/53238bce-0e1a-4c67-b5eb-ecc5387d41cf-kube-api-access-ptt94\") pod \"53238bce-0e1a-4c67-b5eb-ecc5387d41cf\" (UID: \"53238bce-0e1a-4c67-b5eb-ecc5387d41cf\") " Oct 09 14:05:30 crc kubenswrapper[4902]: I1009 14:05:30.843365 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53238bce-0e1a-4c67-b5eb-ecc5387d41cf-kube-api-access-ptt94" (OuterVolumeSpecName: "kube-api-access-ptt94") pod "53238bce-0e1a-4c67-b5eb-ecc5387d41cf" (UID: "53238bce-0e1a-4c67-b5eb-ecc5387d41cf"). InnerVolumeSpecName "kube-api-access-ptt94". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 14:05:30 crc kubenswrapper[4902]: I1009 14:05:30.887615 4902 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-rgtzl" Oct 09 14:05:30 crc kubenswrapper[4902]: I1009 14:05:30.936391 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2drsz\" (UniqueName: \"kubernetes.io/projected/e9ffdab1-dd2a-40c5-b12c-e5018656324a-kube-api-access-2drsz\") pod \"e9ffdab1-dd2a-40c5-b12c-e5018656324a\" (UID: \"e9ffdab1-dd2a-40c5-b12c-e5018656324a\") " Oct 09 14:05:30 crc kubenswrapper[4902]: I1009 14:05:30.936859 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ptt94\" (UniqueName: \"kubernetes.io/projected/53238bce-0e1a-4c67-b5eb-ecc5387d41cf-kube-api-access-ptt94\") on node \"crc\" DevicePath \"\"" Oct 09 14:05:30 crc kubenswrapper[4902]: I1009 14:05:30.944704 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e9ffdab1-dd2a-40c5-b12c-e5018656324a-kube-api-access-2drsz" (OuterVolumeSpecName: "kube-api-access-2drsz") pod "e9ffdab1-dd2a-40c5-b12c-e5018656324a" (UID: "e9ffdab1-dd2a-40c5-b12c-e5018656324a"). InnerVolumeSpecName "kube-api-access-2drsz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 14:05:31 crc kubenswrapper[4902]: I1009 14:05:31.037921 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/75f07b5d-6f2c-473f-ae50-725bcdf084af-dns-svc\") pod \"75f07b5d-6f2c-473f-ae50-725bcdf084af\" (UID: \"75f07b5d-6f2c-473f-ae50-725bcdf084af\") " Oct 09 14:05:31 crc kubenswrapper[4902]: I1009 14:05:31.037967 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bjh76\" (UniqueName: \"kubernetes.io/projected/75f07b5d-6f2c-473f-ae50-725bcdf084af-kube-api-access-bjh76\") pod \"75f07b5d-6f2c-473f-ae50-725bcdf084af\" (UID: \"75f07b5d-6f2c-473f-ae50-725bcdf084af\") " Oct 09 14:05:31 crc kubenswrapper[4902]: I1009 14:05:31.038111 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/75f07b5d-6f2c-473f-ae50-725bcdf084af-config\") pod \"75f07b5d-6f2c-473f-ae50-725bcdf084af\" (UID: \"75f07b5d-6f2c-473f-ae50-725bcdf084af\") " Oct 09 14:05:31 crc kubenswrapper[4902]: I1009 14:05:31.038652 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2drsz\" (UniqueName: \"kubernetes.io/projected/e9ffdab1-dd2a-40c5-b12c-e5018656324a-kube-api-access-2drsz\") on node \"crc\" DevicePath \"\"" Oct 09 14:05:31 crc kubenswrapper[4902]: I1009 14:05:31.041473 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75f07b5d-6f2c-473f-ae50-725bcdf084af-kube-api-access-bjh76" (OuterVolumeSpecName: "kube-api-access-bjh76") pod "75f07b5d-6f2c-473f-ae50-725bcdf084af" (UID: "75f07b5d-6f2c-473f-ae50-725bcdf084af"). InnerVolumeSpecName "kube-api-access-bjh76". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 14:05:31 crc kubenswrapper[4902]: I1009 14:05:31.079560 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75f07b5d-6f2c-473f-ae50-725bcdf084af-config" (OuterVolumeSpecName: "config") pod "75f07b5d-6f2c-473f-ae50-725bcdf084af" (UID: "75f07b5d-6f2c-473f-ae50-725bcdf084af"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 14:05:31 crc kubenswrapper[4902]: I1009 14:05:31.081537 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75f07b5d-6f2c-473f-ae50-725bcdf084af-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "75f07b5d-6f2c-473f-ae50-725bcdf084af" (UID: "75f07b5d-6f2c-473f-ae50-725bcdf084af"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 14:05:31 crc kubenswrapper[4902]: I1009 14:05:31.139663 4902 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/75f07b5d-6f2c-473f-ae50-725bcdf084af-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 09 14:05:31 crc kubenswrapper[4902]: I1009 14:05:31.139696 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bjh76\" (UniqueName: \"kubernetes.io/projected/75f07b5d-6f2c-473f-ae50-725bcdf084af-kube-api-access-bjh76\") on node \"crc\" DevicePath \"\"" Oct 09 14:05:31 crc kubenswrapper[4902]: I1009 14:05:31.139707 4902 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/75f07b5d-6f2c-473f-ae50-725bcdf084af-config\") on node \"crc\" DevicePath \"\"" Oct 09 14:05:31 crc kubenswrapper[4902]: I1009 14:05:31.334010 4902 generic.go:334] "Generic (PLEG): container finished" podID="75f07b5d-6f2c-473f-ae50-725bcdf084af" containerID="a813e77d238b372ae7d81afb177521ae8af473a236e6912c56737c7d3e16d74c" exitCode=0 Oct 09 14:05:31 crc kubenswrapper[4902]: I1009 14:05:31.334092 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-rgtzl" event={"ID":"75f07b5d-6f2c-473f-ae50-725bcdf084af","Type":"ContainerDied","Data":"a813e77d238b372ae7d81afb177521ae8af473a236e6912c56737c7d3e16d74c"} Oct 09 14:05:31 crc kubenswrapper[4902]: I1009 14:05:31.334127 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-rgtzl" event={"ID":"75f07b5d-6f2c-473f-ae50-725bcdf084af","Type":"ContainerDied","Data":"f91533ce29f955c7f932d405082cade5284aad50093ec1c5ef7132353298d937"} Oct 09 14:05:31 crc kubenswrapper[4902]: I1009 14:05:31.334147 4902 scope.go:117] "RemoveContainer" containerID="a813e77d238b372ae7d81afb177521ae8af473a236e6912c56737c7d3e16d74c" Oct 09 14:05:31 crc kubenswrapper[4902]: I1009 14:05:31.334285 4902 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-rgtzl" Oct 09 14:05:31 crc kubenswrapper[4902]: I1009 14:05:31.338571 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-kw64n" event={"ID":"e9ffdab1-dd2a-40c5-b12c-e5018656324a","Type":"ContainerDied","Data":"2dae98bdad6cadaf4e3c81c983565cee1cfb59130c39fb386d2557b8aa44034b"} Oct 09 14:05:31 crc kubenswrapper[4902]: I1009 14:05:31.338611 4902 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2dae98bdad6cadaf4e3c81c983565cee1cfb59130c39fb386d2557b8aa44034b" Oct 09 14:05:31 crc kubenswrapper[4902]: I1009 14:05:31.338666 4902 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-kw64n" Oct 09 14:05:31 crc kubenswrapper[4902]: I1009 14:05:31.347482 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-sg4gd" event={"ID":"53238bce-0e1a-4c67-b5eb-ecc5387d41cf","Type":"ContainerDied","Data":"4fdf213ad895c02922183918d27c3e9021d495a4b1c29048cf2c411bfab578e4"} Oct 09 14:05:31 crc kubenswrapper[4902]: I1009 14:05:31.347532 4902 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4fdf213ad895c02922183918d27c3e9021d495a4b1c29048cf2c411bfab578e4" Oct 09 14:05:31 crc kubenswrapper[4902]: I1009 14:05:31.347592 4902 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-sg4gd" Oct 09 14:05:31 crc kubenswrapper[4902]: I1009 14:05:31.354873 4902 scope.go:117] "RemoveContainer" containerID="c8b559a2ebfda50821f5b896d48aaeaabd9e840cbcfb1b8c6de690c245f559e8" Oct 09 14:05:31 crc kubenswrapper[4902]: I1009 14:05:31.376842 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-rgtzl"] Oct 09 14:05:31 crc kubenswrapper[4902]: I1009 14:05:31.381629 4902 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-rgtzl"] Oct 09 14:05:31 crc kubenswrapper[4902]: I1009 14:05:31.384174 4902 scope.go:117] "RemoveContainer" containerID="a813e77d238b372ae7d81afb177521ae8af473a236e6912c56737c7d3e16d74c" Oct 09 14:05:31 crc kubenswrapper[4902]: E1009 14:05:31.386444 4902 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a813e77d238b372ae7d81afb177521ae8af473a236e6912c56737c7d3e16d74c\": container with ID starting with a813e77d238b372ae7d81afb177521ae8af473a236e6912c56737c7d3e16d74c not found: ID does not exist" containerID="a813e77d238b372ae7d81afb177521ae8af473a236e6912c56737c7d3e16d74c" Oct 09 14:05:31 crc kubenswrapper[4902]: I1009 14:05:31.386502 4902 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a813e77d238b372ae7d81afb177521ae8af473a236e6912c56737c7d3e16d74c"} err="failed to get container status \"a813e77d238b372ae7d81afb177521ae8af473a236e6912c56737c7d3e16d74c\": rpc error: code = NotFound desc = could not find container \"a813e77d238b372ae7d81afb177521ae8af473a236e6912c56737c7d3e16d74c\": container with ID starting with a813e77d238b372ae7d81afb177521ae8af473a236e6912c56737c7d3e16d74c not found: ID does not exist" Oct 09 14:05:31 crc kubenswrapper[4902]: I1009 14:05:31.386535 4902 scope.go:117] "RemoveContainer" containerID="c8b559a2ebfda50821f5b896d48aaeaabd9e840cbcfb1b8c6de690c245f559e8" Oct 09 14:05:31 crc kubenswrapper[4902]: E1009 14:05:31.386900 4902 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c8b559a2ebfda50821f5b896d48aaeaabd9e840cbcfb1b8c6de690c245f559e8\": container with ID starting with c8b559a2ebfda50821f5b896d48aaeaabd9e840cbcfb1b8c6de690c245f559e8 not found: ID does not exist" containerID="c8b559a2ebfda50821f5b896d48aaeaabd9e840cbcfb1b8c6de690c245f559e8" Oct 09 14:05:31 crc kubenswrapper[4902]: I1009 14:05:31.386921 4902 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c8b559a2ebfda50821f5b896d48aaeaabd9e840cbcfb1b8c6de690c245f559e8"} err="failed to get container status \"c8b559a2ebfda50821f5b896d48aaeaabd9e840cbcfb1b8c6de690c245f559e8\": rpc error: code = NotFound desc = could not find container \"c8b559a2ebfda50821f5b896d48aaeaabd9e840cbcfb1b8c6de690c245f559e8\": container with ID starting with c8b559a2ebfda50821f5b896d48aaeaabd9e840cbcfb1b8c6de690c245f559e8 not found: ID does not exist" Oct 09 14:05:31 crc kubenswrapper[4902]: I1009 14:05:31.523915 4902 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="75f07b5d-6f2c-473f-ae50-725bcdf084af" path="/var/lib/kubelet/pods/75f07b5d-6f2c-473f-ae50-725bcdf084af/volumes" Oct 09 14:05:33 crc kubenswrapper[4902]: I1009 14:05:33.279428 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-n2lrz"] Oct 09 14:05:33 crc kubenswrapper[4902]: E1009 14:05:33.279791 4902 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="53238bce-0e1a-4c67-b5eb-ecc5387d41cf" containerName="mariadb-database-create" Oct 09 14:05:33 crc kubenswrapper[4902]: I1009 14:05:33.279806 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="53238bce-0e1a-4c67-b5eb-ecc5387d41cf" containerName="mariadb-database-create" Oct 09 14:05:33 crc kubenswrapper[4902]: E1009 14:05:33.279816 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9ffdab1-dd2a-40c5-b12c-e5018656324a" containerName="mariadb-database-create" Oct 09 14:05:33 crc kubenswrapper[4902]: I1009 14:05:33.279822 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9ffdab1-dd2a-40c5-b12c-e5018656324a" containerName="mariadb-database-create" Oct 09 14:05:33 crc kubenswrapper[4902]: E1009 14:05:33.279839 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75f07b5d-6f2c-473f-ae50-725bcdf084af" containerName="init" Oct 09 14:05:33 crc kubenswrapper[4902]: I1009 14:05:33.279845 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="75f07b5d-6f2c-473f-ae50-725bcdf084af" containerName="init" Oct 09 14:05:33 crc kubenswrapper[4902]: E1009 14:05:33.279857 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75f07b5d-6f2c-473f-ae50-725bcdf084af" containerName="dnsmasq-dns" Oct 09 14:05:33 crc kubenswrapper[4902]: I1009 14:05:33.279863 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="75f07b5d-6f2c-473f-ae50-725bcdf084af" containerName="dnsmasq-dns" Oct 09 14:05:33 crc kubenswrapper[4902]: I1009 14:05:33.280035 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="53238bce-0e1a-4c67-b5eb-ecc5387d41cf" containerName="mariadb-database-create" Oct 09 14:05:33 crc kubenswrapper[4902]: I1009 14:05:33.280049 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9ffdab1-dd2a-40c5-b12c-e5018656324a" containerName="mariadb-database-create" Oct 09 14:05:33 crc kubenswrapper[4902]: I1009 14:05:33.280056 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="75f07b5d-6f2c-473f-ae50-725bcdf084af" containerName="dnsmasq-dns" Oct 09 14:05:33 crc kubenswrapper[4902]: I1009 14:05:33.280654 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-n2lrz" Oct 09 14:05:33 crc kubenswrapper[4902]: I1009 14:05:33.309203 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-n2lrz"] Oct 09 14:05:33 crc kubenswrapper[4902]: I1009 14:05:33.387006 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gglfb\" (UniqueName: \"kubernetes.io/projected/f2bd9da0-075a-4827-90bc-72c879d60820-kube-api-access-gglfb\") pod \"glance-db-create-n2lrz\" (UID: \"f2bd9da0-075a-4827-90bc-72c879d60820\") " pod="openstack/glance-db-create-n2lrz" Oct 09 14:05:33 crc kubenswrapper[4902]: I1009 14:05:33.489004 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gglfb\" (UniqueName: \"kubernetes.io/projected/f2bd9da0-075a-4827-90bc-72c879d60820-kube-api-access-gglfb\") pod \"glance-db-create-n2lrz\" (UID: \"f2bd9da0-075a-4827-90bc-72c879d60820\") " pod="openstack/glance-db-create-n2lrz" Oct 09 14:05:33 crc kubenswrapper[4902]: I1009 14:05:33.538203 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gglfb\" (UniqueName: \"kubernetes.io/projected/f2bd9da0-075a-4827-90bc-72c879d60820-kube-api-access-gglfb\") pod \"glance-db-create-n2lrz\" (UID: \"f2bd9da0-075a-4827-90bc-72c879d60820\") " pod="openstack/glance-db-create-n2lrz" Oct 09 14:05:33 crc kubenswrapper[4902]: I1009 14:05:33.604117 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-n2lrz" Oct 09 14:05:34 crc kubenswrapper[4902]: I1009 14:05:34.025990 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-n2lrz"] Oct 09 14:05:34 crc kubenswrapper[4902]: I1009 14:05:34.371848 4902 generic.go:334] "Generic (PLEG): container finished" podID="f2bd9da0-075a-4827-90bc-72c879d60820" containerID="9a1fd6740cd26069d16c76e18dde3d6c92ef84e9731d12bd6aa3e2ddb25e3e55" exitCode=0 Oct 09 14:05:34 crc kubenswrapper[4902]: I1009 14:05:34.371905 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-n2lrz" event={"ID":"f2bd9da0-075a-4827-90bc-72c879d60820","Type":"ContainerDied","Data":"9a1fd6740cd26069d16c76e18dde3d6c92ef84e9731d12bd6aa3e2ddb25e3e55"} Oct 09 14:05:34 crc kubenswrapper[4902]: I1009 14:05:34.371940 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-n2lrz" event={"ID":"f2bd9da0-075a-4827-90bc-72c879d60820","Type":"ContainerStarted","Data":"5b84670bd410e5baadbd706abad3deb61284215e99d151cd898bd73fe3c0d8f7"} Oct 09 14:05:35 crc kubenswrapper[4902]: I1009 14:05:35.378567 4902 generic.go:334] "Generic (PLEG): container finished" podID="014a8355-9817-424e-ae75-b786043b2a4c" containerID="3d22d59cf13256e7c9fd4b4582de39b42050e52d6f3afd36d727ae5d2c2127de" exitCode=0 Oct 09 14:05:35 crc kubenswrapper[4902]: I1009 14:05:35.378647 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-khn8g" event={"ID":"014a8355-9817-424e-ae75-b786043b2a4c","Type":"ContainerDied","Data":"3d22d59cf13256e7c9fd4b4582de39b42050e52d6f3afd36d727ae5d2c2127de"} Oct 09 14:05:35 crc kubenswrapper[4902]: I1009 14:05:35.651756 4902 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-n2lrz" Oct 09 14:05:35 crc kubenswrapper[4902]: I1009 14:05:35.845167 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gglfb\" (UniqueName: \"kubernetes.io/projected/f2bd9da0-075a-4827-90bc-72c879d60820-kube-api-access-gglfb\") pod \"f2bd9da0-075a-4827-90bc-72c879d60820\" (UID: \"f2bd9da0-075a-4827-90bc-72c879d60820\") " Oct 09 14:05:35 crc kubenswrapper[4902]: I1009 14:05:35.853133 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f2bd9da0-075a-4827-90bc-72c879d60820-kube-api-access-gglfb" (OuterVolumeSpecName: "kube-api-access-gglfb") pod "f2bd9da0-075a-4827-90bc-72c879d60820" (UID: "f2bd9da0-075a-4827-90bc-72c879d60820"). InnerVolumeSpecName "kube-api-access-gglfb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 14:05:35 crc kubenswrapper[4902]: I1009 14:05:35.947094 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gglfb\" (UniqueName: \"kubernetes.io/projected/f2bd9da0-075a-4827-90bc-72c879d60820-kube-api-access-gglfb\") on node \"crc\" DevicePath \"\"" Oct 09 14:05:36 crc kubenswrapper[4902]: I1009 14:05:36.387584 4902 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-n2lrz" Oct 09 14:05:36 crc kubenswrapper[4902]: I1009 14:05:36.387588 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-n2lrz" event={"ID":"f2bd9da0-075a-4827-90bc-72c879d60820","Type":"ContainerDied","Data":"5b84670bd410e5baadbd706abad3deb61284215e99d151cd898bd73fe3c0d8f7"} Oct 09 14:05:36 crc kubenswrapper[4902]: I1009 14:05:36.387634 4902 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5b84670bd410e5baadbd706abad3deb61284215e99d151cd898bd73fe3c0d8f7" Oct 09 14:05:36 crc kubenswrapper[4902]: I1009 14:05:36.702741 4902 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-khn8g" Oct 09 14:05:36 crc kubenswrapper[4902]: I1009 14:05:36.861669 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/014a8355-9817-424e-ae75-b786043b2a4c-swiftconf\") pod \"014a8355-9817-424e-ae75-b786043b2a4c\" (UID: \"014a8355-9817-424e-ae75-b786043b2a4c\") " Oct 09 14:05:36 crc kubenswrapper[4902]: I1009 14:05:36.862111 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/014a8355-9817-424e-ae75-b786043b2a4c-ring-data-devices\") pod \"014a8355-9817-424e-ae75-b786043b2a4c\" (UID: \"014a8355-9817-424e-ae75-b786043b2a4c\") " Oct 09 14:05:36 crc kubenswrapper[4902]: I1009 14:05:36.862355 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/014a8355-9817-424e-ae75-b786043b2a4c-etc-swift\") pod \"014a8355-9817-424e-ae75-b786043b2a4c\" (UID: \"014a8355-9817-424e-ae75-b786043b2a4c\") " Oct 09 14:05:36 crc kubenswrapper[4902]: I1009 14:05:36.862720 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/014a8355-9817-424e-ae75-b786043b2a4c-dispersionconf\") pod \"014a8355-9817-424e-ae75-b786043b2a4c\" (UID: \"014a8355-9817-424e-ae75-b786043b2a4c\") " Oct 09 14:05:36 crc kubenswrapper[4902]: I1009 14:05:36.862908 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/014a8355-9817-424e-ae75-b786043b2a4c-combined-ca-bundle\") pod \"014a8355-9817-424e-ae75-b786043b2a4c\" (UID: \"014a8355-9817-424e-ae75-b786043b2a4c\") " Oct 09 14:05:36 crc kubenswrapper[4902]: I1009 14:05:36.862966 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/014a8355-9817-424e-ae75-b786043b2a4c-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "014a8355-9817-424e-ae75-b786043b2a4c" (UID: "014a8355-9817-424e-ae75-b786043b2a4c"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 14:05:36 crc kubenswrapper[4902]: I1009 14:05:36.863159 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v8jr9\" (UniqueName: \"kubernetes.io/projected/014a8355-9817-424e-ae75-b786043b2a4c-kube-api-access-v8jr9\") pod \"014a8355-9817-424e-ae75-b786043b2a4c\" (UID: \"014a8355-9817-424e-ae75-b786043b2a4c\") " Oct 09 14:05:36 crc kubenswrapper[4902]: I1009 14:05:36.863262 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/014a8355-9817-424e-ae75-b786043b2a4c-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "014a8355-9817-424e-ae75-b786043b2a4c" (UID: "014a8355-9817-424e-ae75-b786043b2a4c"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 14:05:36 crc kubenswrapper[4902]: I1009 14:05:36.863314 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/014a8355-9817-424e-ae75-b786043b2a4c-scripts\") pod \"014a8355-9817-424e-ae75-b786043b2a4c\" (UID: \"014a8355-9817-424e-ae75-b786043b2a4c\") " Oct 09 14:05:36 crc kubenswrapper[4902]: I1009 14:05:36.863766 4902 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/014a8355-9817-424e-ae75-b786043b2a4c-ring-data-devices\") on node \"crc\" DevicePath \"\"" Oct 09 14:05:36 crc kubenswrapper[4902]: I1009 14:05:36.863791 4902 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/014a8355-9817-424e-ae75-b786043b2a4c-etc-swift\") on node \"crc\" DevicePath \"\"" Oct 09 14:05:36 crc kubenswrapper[4902]: I1009 14:05:36.871971 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/014a8355-9817-424e-ae75-b786043b2a4c-kube-api-access-v8jr9" (OuterVolumeSpecName: "kube-api-access-v8jr9") pod "014a8355-9817-424e-ae75-b786043b2a4c" (UID: "014a8355-9817-424e-ae75-b786043b2a4c"). InnerVolumeSpecName "kube-api-access-v8jr9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 14:05:36 crc kubenswrapper[4902]: I1009 14:05:36.874238 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/014a8355-9817-424e-ae75-b786043b2a4c-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "014a8355-9817-424e-ae75-b786043b2a4c" (UID: "014a8355-9817-424e-ae75-b786043b2a4c"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:05:36 crc kubenswrapper[4902]: I1009 14:05:36.881532 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/014a8355-9817-424e-ae75-b786043b2a4c-scripts" (OuterVolumeSpecName: "scripts") pod "014a8355-9817-424e-ae75-b786043b2a4c" (UID: "014a8355-9817-424e-ae75-b786043b2a4c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 14:05:36 crc kubenswrapper[4902]: I1009 14:05:36.884704 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/014a8355-9817-424e-ae75-b786043b2a4c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "014a8355-9817-424e-ae75-b786043b2a4c" (UID: "014a8355-9817-424e-ae75-b786043b2a4c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:05:36 crc kubenswrapper[4902]: I1009 14:05:36.888193 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/014a8355-9817-424e-ae75-b786043b2a4c-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "014a8355-9817-424e-ae75-b786043b2a4c" (UID: "014a8355-9817-424e-ae75-b786043b2a4c"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:05:36 crc kubenswrapper[4902]: I1009 14:05:36.966878 4902 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/014a8355-9817-424e-ae75-b786043b2a4c-swiftconf\") on node \"crc\" DevicePath \"\"" Oct 09 14:05:36 crc kubenswrapper[4902]: I1009 14:05:36.966934 4902 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/014a8355-9817-424e-ae75-b786043b2a4c-dispersionconf\") on node \"crc\" DevicePath \"\"" Oct 09 14:05:36 crc kubenswrapper[4902]: I1009 14:05:36.966953 4902 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/014a8355-9817-424e-ae75-b786043b2a4c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 14:05:36 crc kubenswrapper[4902]: I1009 14:05:36.966972 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v8jr9\" (UniqueName: \"kubernetes.io/projected/014a8355-9817-424e-ae75-b786043b2a4c-kube-api-access-v8jr9\") on node \"crc\" DevicePath \"\"" Oct 09 14:05:36 crc kubenswrapper[4902]: I1009 14:05:36.966991 4902 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/014a8355-9817-424e-ae75-b786043b2a4c-scripts\") on node \"crc\" DevicePath \"\"" Oct 09 14:05:37 crc kubenswrapper[4902]: I1009 14:05:37.171025 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/fa80b7ed-e420-455b-a918-d474c0453547-etc-swift\") pod \"swift-storage-0\" (UID: \"fa80b7ed-e420-455b-a918-d474c0453547\") " pod="openstack/swift-storage-0" Oct 09 14:05:37 crc kubenswrapper[4902]: I1009 14:05:37.178779 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/fa80b7ed-e420-455b-a918-d474c0453547-etc-swift\") pod \"swift-storage-0\" (UID: \"fa80b7ed-e420-455b-a918-d474c0453547\") " pod="openstack/swift-storage-0" Oct 09 14:05:37 crc kubenswrapper[4902]: I1009 14:05:37.185331 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Oct 09 14:05:37 crc kubenswrapper[4902]: I1009 14:05:37.397347 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-khn8g" event={"ID":"014a8355-9817-424e-ae75-b786043b2a4c","Type":"ContainerDied","Data":"387c81eb98f365803f5f3f301211c670f510118ef020384b9e4a4ec9d3732036"} Oct 09 14:05:37 crc kubenswrapper[4902]: I1009 14:05:37.397631 4902 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="387c81eb98f365803f5f3f301211c670f510118ef020384b9e4a4ec9d3732036" Oct 09 14:05:37 crc kubenswrapper[4902]: I1009 14:05:37.397436 4902 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-khn8g" Oct 09 14:05:37 crc kubenswrapper[4902]: I1009 14:05:37.735605 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Oct 09 14:05:37 crc kubenswrapper[4902]: I1009 14:05:37.818357 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-07af-account-create-xjlvw"] Oct 09 14:05:37 crc kubenswrapper[4902]: E1009 14:05:37.818746 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="014a8355-9817-424e-ae75-b786043b2a4c" containerName="swift-ring-rebalance" Oct 09 14:05:37 crc kubenswrapper[4902]: I1009 14:05:37.818770 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="014a8355-9817-424e-ae75-b786043b2a4c" containerName="swift-ring-rebalance" Oct 09 14:05:37 crc kubenswrapper[4902]: E1009 14:05:37.818784 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2bd9da0-075a-4827-90bc-72c879d60820" containerName="mariadb-database-create" Oct 09 14:05:37 crc kubenswrapper[4902]: I1009 14:05:37.818792 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2bd9da0-075a-4827-90bc-72c879d60820" containerName="mariadb-database-create" Oct 09 14:05:37 crc kubenswrapper[4902]: I1009 14:05:37.818994 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="014a8355-9817-424e-ae75-b786043b2a4c" containerName="swift-ring-rebalance" Oct 09 14:05:37 crc kubenswrapper[4902]: I1009 14:05:37.819014 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="f2bd9da0-075a-4827-90bc-72c879d60820" containerName="mariadb-database-create" Oct 09 14:05:37 crc kubenswrapper[4902]: I1009 14:05:37.819651 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-07af-account-create-xjlvw" Oct 09 14:05:37 crc kubenswrapper[4902]: I1009 14:05:37.828335 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-07af-account-create-xjlvw"] Oct 09 14:05:37 crc kubenswrapper[4902]: I1009 14:05:37.829348 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Oct 09 14:05:37 crc kubenswrapper[4902]: I1009 14:05:37.980815 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cbhfd\" (UniqueName: \"kubernetes.io/projected/ed26f964-2a36-43d7-80f5-7d7e60a32e4f-kube-api-access-cbhfd\") pod \"keystone-07af-account-create-xjlvw\" (UID: \"ed26f964-2a36-43d7-80f5-7d7e60a32e4f\") " pod="openstack/keystone-07af-account-create-xjlvw" Oct 09 14:05:38 crc kubenswrapper[4902]: I1009 14:05:38.082767 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cbhfd\" (UniqueName: \"kubernetes.io/projected/ed26f964-2a36-43d7-80f5-7d7e60a32e4f-kube-api-access-cbhfd\") pod \"keystone-07af-account-create-xjlvw\" (UID: \"ed26f964-2a36-43d7-80f5-7d7e60a32e4f\") " pod="openstack/keystone-07af-account-create-xjlvw" Oct 09 14:05:38 crc kubenswrapper[4902]: I1009 14:05:38.100620 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-9532-account-create-sznbs"] Oct 09 14:05:38 crc kubenswrapper[4902]: I1009 14:05:38.102702 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-9532-account-create-sznbs" Oct 09 14:05:38 crc kubenswrapper[4902]: I1009 14:05:38.105699 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Oct 09 14:05:38 crc kubenswrapper[4902]: I1009 14:05:38.122137 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cbhfd\" (UniqueName: \"kubernetes.io/projected/ed26f964-2a36-43d7-80f5-7d7e60a32e4f-kube-api-access-cbhfd\") pod \"keystone-07af-account-create-xjlvw\" (UID: \"ed26f964-2a36-43d7-80f5-7d7e60a32e4f\") " pod="openstack/keystone-07af-account-create-xjlvw" Oct 09 14:05:38 crc kubenswrapper[4902]: I1009 14:05:38.124656 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-9532-account-create-sznbs"] Oct 09 14:05:38 crc kubenswrapper[4902]: I1009 14:05:38.147953 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-07af-account-create-xjlvw" Oct 09 14:05:38 crc kubenswrapper[4902]: I1009 14:05:38.285839 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tlnz6\" (UniqueName: \"kubernetes.io/projected/81276c5a-6547-4625-b10e-edb12dc107b9-kube-api-access-tlnz6\") pod \"placement-9532-account-create-sznbs\" (UID: \"81276c5a-6547-4625-b10e-edb12dc107b9\") " pod="openstack/placement-9532-account-create-sznbs" Oct 09 14:05:38 crc kubenswrapper[4902]: I1009 14:05:38.387793 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tlnz6\" (UniqueName: \"kubernetes.io/projected/81276c5a-6547-4625-b10e-edb12dc107b9-kube-api-access-tlnz6\") pod \"placement-9532-account-create-sznbs\" (UID: \"81276c5a-6547-4625-b10e-edb12dc107b9\") " pod="openstack/placement-9532-account-create-sznbs" Oct 09 14:05:38 crc kubenswrapper[4902]: I1009 14:05:38.406788 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tlnz6\" (UniqueName: \"kubernetes.io/projected/81276c5a-6547-4625-b10e-edb12dc107b9-kube-api-access-tlnz6\") pod \"placement-9532-account-create-sznbs\" (UID: \"81276c5a-6547-4625-b10e-edb12dc107b9\") " pod="openstack/placement-9532-account-create-sznbs" Oct 09 14:05:38 crc kubenswrapper[4902]: I1009 14:05:38.410884 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"fa80b7ed-e420-455b-a918-d474c0453547","Type":"ContainerStarted","Data":"f6bca2a1adde1867fb5607598ac95e5ac452224eb33d11ca0cc4c204c0ee593f"} Oct 09 14:05:38 crc kubenswrapper[4902]: I1009 14:05:38.454203 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-9532-account-create-sznbs" Oct 09 14:05:38 crc kubenswrapper[4902]: I1009 14:05:38.569835 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-07af-account-create-xjlvw"] Oct 09 14:05:38 crc kubenswrapper[4902]: W1009 14:05:38.572822 4902 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poded26f964_2a36_43d7_80f5_7d7e60a32e4f.slice/crio-416d290a7ec4bb48cb6e3bb0b26144c8511d288a84977f80758c37f8b1ddc14e WatchSource:0}: Error finding container 416d290a7ec4bb48cb6e3bb0b26144c8511d288a84977f80758c37f8b1ddc14e: Status 404 returned error can't find the container with id 416d290a7ec4bb48cb6e3bb0b26144c8511d288a84977f80758c37f8b1ddc14e Oct 09 14:05:38 crc kubenswrapper[4902]: I1009 14:05:38.715129 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-9532-account-create-sznbs"] Oct 09 14:05:38 crc kubenswrapper[4902]: W1009 14:05:38.718497 4902 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod81276c5a_6547_4625_b10e_edb12dc107b9.slice/crio-7cf45f5a41d4abf8ee835d883d709f0dcc217a00f7d79c22ff8afa62dde430d6 WatchSource:0}: Error finding container 7cf45f5a41d4abf8ee835d883d709f0dcc217a00f7d79c22ff8afa62dde430d6: Status 404 returned error can't find the container with id 7cf45f5a41d4abf8ee835d883d709f0dcc217a00f7d79c22ff8afa62dde430d6 Oct 09 14:05:38 crc kubenswrapper[4902]: I1009 14:05:38.788325 4902 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-79djm" podUID="7f0722a8-eee2-4bb1-a3b4-d14964d35227" containerName="ovn-controller" probeResult="failure" output=< Oct 09 14:05:38 crc kubenswrapper[4902]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Oct 09 14:05:38 crc kubenswrapper[4902]: > Oct 09 14:05:38 crc kubenswrapper[4902]: I1009 14:05:38.850706 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-rlj4x" Oct 09 14:05:38 crc kubenswrapper[4902]: I1009 14:05:38.852242 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-rlj4x" Oct 09 14:05:39 crc kubenswrapper[4902]: I1009 14:05:39.082067 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-79djm-config-ts5sx"] Oct 09 14:05:39 crc kubenswrapper[4902]: I1009 14:05:39.083918 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-79djm-config-ts5sx" Oct 09 14:05:39 crc kubenswrapper[4902]: I1009 14:05:39.087037 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Oct 09 14:05:39 crc kubenswrapper[4902]: I1009 14:05:39.092085 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-79djm-config-ts5sx"] Oct 09 14:05:39 crc kubenswrapper[4902]: I1009 14:05:39.202687 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4lv6m\" (UniqueName: \"kubernetes.io/projected/462a635a-0dd6-4203-b2b9-9279aa13d40d-kube-api-access-4lv6m\") pod \"ovn-controller-79djm-config-ts5sx\" (UID: \"462a635a-0dd6-4203-b2b9-9279aa13d40d\") " pod="openstack/ovn-controller-79djm-config-ts5sx" Oct 09 14:05:39 crc kubenswrapper[4902]: I1009 14:05:39.202736 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/462a635a-0dd6-4203-b2b9-9279aa13d40d-additional-scripts\") pod \"ovn-controller-79djm-config-ts5sx\" (UID: \"462a635a-0dd6-4203-b2b9-9279aa13d40d\") " pod="openstack/ovn-controller-79djm-config-ts5sx" Oct 09 14:05:39 crc kubenswrapper[4902]: I1009 14:05:39.202763 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/462a635a-0dd6-4203-b2b9-9279aa13d40d-var-run\") pod \"ovn-controller-79djm-config-ts5sx\" (UID: \"462a635a-0dd6-4203-b2b9-9279aa13d40d\") " pod="openstack/ovn-controller-79djm-config-ts5sx" Oct 09 14:05:39 crc kubenswrapper[4902]: I1009 14:05:39.202826 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/462a635a-0dd6-4203-b2b9-9279aa13d40d-scripts\") pod \"ovn-controller-79djm-config-ts5sx\" (UID: \"462a635a-0dd6-4203-b2b9-9279aa13d40d\") " pod="openstack/ovn-controller-79djm-config-ts5sx" Oct 09 14:05:39 crc kubenswrapper[4902]: I1009 14:05:39.202849 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/462a635a-0dd6-4203-b2b9-9279aa13d40d-var-log-ovn\") pod \"ovn-controller-79djm-config-ts5sx\" (UID: \"462a635a-0dd6-4203-b2b9-9279aa13d40d\") " pod="openstack/ovn-controller-79djm-config-ts5sx" Oct 09 14:05:39 crc kubenswrapper[4902]: I1009 14:05:39.202870 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/462a635a-0dd6-4203-b2b9-9279aa13d40d-var-run-ovn\") pod \"ovn-controller-79djm-config-ts5sx\" (UID: \"462a635a-0dd6-4203-b2b9-9279aa13d40d\") " pod="openstack/ovn-controller-79djm-config-ts5sx" Oct 09 14:05:39 crc kubenswrapper[4902]: I1009 14:05:39.304301 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4lv6m\" (UniqueName: \"kubernetes.io/projected/462a635a-0dd6-4203-b2b9-9279aa13d40d-kube-api-access-4lv6m\") pod \"ovn-controller-79djm-config-ts5sx\" (UID: \"462a635a-0dd6-4203-b2b9-9279aa13d40d\") " pod="openstack/ovn-controller-79djm-config-ts5sx" Oct 09 14:05:39 crc kubenswrapper[4902]: I1009 14:05:39.304383 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: 
\"kubernetes.io/configmap/462a635a-0dd6-4203-b2b9-9279aa13d40d-additional-scripts\") pod \"ovn-controller-79djm-config-ts5sx\" (UID: \"462a635a-0dd6-4203-b2b9-9279aa13d40d\") " pod="openstack/ovn-controller-79djm-config-ts5sx" Oct 09 14:05:39 crc kubenswrapper[4902]: I1009 14:05:39.304696 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/462a635a-0dd6-4203-b2b9-9279aa13d40d-var-run\") pod \"ovn-controller-79djm-config-ts5sx\" (UID: \"462a635a-0dd6-4203-b2b9-9279aa13d40d\") " pod="openstack/ovn-controller-79djm-config-ts5sx" Oct 09 14:05:39 crc kubenswrapper[4902]: I1009 14:05:39.304758 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/462a635a-0dd6-4203-b2b9-9279aa13d40d-scripts\") pod \"ovn-controller-79djm-config-ts5sx\" (UID: \"462a635a-0dd6-4203-b2b9-9279aa13d40d\") " pod="openstack/ovn-controller-79djm-config-ts5sx" Oct 09 14:05:39 crc kubenswrapper[4902]: I1009 14:05:39.304788 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/462a635a-0dd6-4203-b2b9-9279aa13d40d-var-log-ovn\") pod \"ovn-controller-79djm-config-ts5sx\" (UID: \"462a635a-0dd6-4203-b2b9-9279aa13d40d\") " pod="openstack/ovn-controller-79djm-config-ts5sx" Oct 09 14:05:39 crc kubenswrapper[4902]: I1009 14:05:39.304810 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/462a635a-0dd6-4203-b2b9-9279aa13d40d-var-run-ovn\") pod \"ovn-controller-79djm-config-ts5sx\" (UID: \"462a635a-0dd6-4203-b2b9-9279aa13d40d\") " pod="openstack/ovn-controller-79djm-config-ts5sx" Oct 09 14:05:39 crc kubenswrapper[4902]: I1009 14:05:39.305061 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/462a635a-0dd6-4203-b2b9-9279aa13d40d-var-run\") pod \"ovn-controller-79djm-config-ts5sx\" (UID: \"462a635a-0dd6-4203-b2b9-9279aa13d40d\") " pod="openstack/ovn-controller-79djm-config-ts5sx" Oct 09 14:05:39 crc kubenswrapper[4902]: I1009 14:05:39.305081 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/462a635a-0dd6-4203-b2b9-9279aa13d40d-var-run-ovn\") pod \"ovn-controller-79djm-config-ts5sx\" (UID: \"462a635a-0dd6-4203-b2b9-9279aa13d40d\") " pod="openstack/ovn-controller-79djm-config-ts5sx" Oct 09 14:05:39 crc kubenswrapper[4902]: I1009 14:05:39.305122 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/462a635a-0dd6-4203-b2b9-9279aa13d40d-var-log-ovn\") pod \"ovn-controller-79djm-config-ts5sx\" (UID: \"462a635a-0dd6-4203-b2b9-9279aa13d40d\") " pod="openstack/ovn-controller-79djm-config-ts5sx" Oct 09 14:05:39 crc kubenswrapper[4902]: I1009 14:05:39.305570 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/462a635a-0dd6-4203-b2b9-9279aa13d40d-additional-scripts\") pod \"ovn-controller-79djm-config-ts5sx\" (UID: \"462a635a-0dd6-4203-b2b9-9279aa13d40d\") " pod="openstack/ovn-controller-79djm-config-ts5sx" Oct 09 14:05:39 crc kubenswrapper[4902]: I1009 14:05:39.306901 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/462a635a-0dd6-4203-b2b9-9279aa13d40d-scripts\") pod \"ovn-controller-79djm-config-ts5sx\" (UID: \"462a635a-0dd6-4203-b2b9-9279aa13d40d\") " pod="openstack/ovn-controller-79djm-config-ts5sx" Oct 09 14:05:39 crc kubenswrapper[4902]: I1009 14:05:39.325966 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4lv6m\" (UniqueName: \"kubernetes.io/projected/462a635a-0dd6-4203-b2b9-9279aa13d40d-kube-api-access-4lv6m\") pod \"ovn-controller-79djm-config-ts5sx\" (UID: \"462a635a-0dd6-4203-b2b9-9279aa13d40d\") " pod="openstack/ovn-controller-79djm-config-ts5sx" Oct 09 14:05:39 crc kubenswrapper[4902]: I1009 14:05:39.412197 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-79djm-config-ts5sx" Oct 09 14:05:39 crc kubenswrapper[4902]: I1009 14:05:39.421168 4902 generic.go:334] "Generic (PLEG): container finished" podID="ed26f964-2a36-43d7-80f5-7d7e60a32e4f" containerID="06973ced0cf4365c382c831e1110cc9681973907c0defd082b9536f8146bd686" exitCode=0 Oct 09 14:05:39 crc kubenswrapper[4902]: I1009 14:05:39.421212 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-07af-account-create-xjlvw" event={"ID":"ed26f964-2a36-43d7-80f5-7d7e60a32e4f","Type":"ContainerDied","Data":"06973ced0cf4365c382c831e1110cc9681973907c0defd082b9536f8146bd686"} Oct 09 14:05:39 crc kubenswrapper[4902]: I1009 14:05:39.421257 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-07af-account-create-xjlvw" event={"ID":"ed26f964-2a36-43d7-80f5-7d7e60a32e4f","Type":"ContainerStarted","Data":"416d290a7ec4bb48cb6e3bb0b26144c8511d288a84977f80758c37f8b1ddc14e"} Oct 09 14:05:39 crc kubenswrapper[4902]: I1009 14:05:39.422905 4902 generic.go:334] "Generic (PLEG): container finished" podID="5cdfacc8-b636-448a-bdc9-30b7a851aa8f" containerID="804388511a83b9bbcfe911cb67ebf26bcf159b46ac05551143b014d995792cd7" exitCode=0 Oct 09 14:05:39 crc kubenswrapper[4902]: I1009 14:05:39.423692 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"5cdfacc8-b636-448a-bdc9-30b7a851aa8f","Type":"ContainerDied","Data":"804388511a83b9bbcfe911cb67ebf26bcf159b46ac05551143b014d995792cd7"} Oct 09 14:05:39 crc kubenswrapper[4902]: I1009 14:05:39.429510 4902 generic.go:334] "Generic (PLEG): container finished" podID="c9c6af38-1605-4d47-bc0c-967053235667" containerID="c5bda7339b716d82f1eeabbe60fece1807c197c31bdcab9c371b064bef49bbab" exitCode=0 Oct 09 14:05:39 crc kubenswrapper[4902]: I1009 14:05:39.429615 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"c9c6af38-1605-4d47-bc0c-967053235667","Type":"ContainerDied","Data":"c5bda7339b716d82f1eeabbe60fece1807c197c31bdcab9c371b064bef49bbab"} Oct 09 14:05:39 crc kubenswrapper[4902]: I1009 14:05:39.445264 4902 generic.go:334] "Generic (PLEG): container finished" podID="81276c5a-6547-4625-b10e-edb12dc107b9" containerID="ee60c0a5698fea52999638b4ce5cd607a71c0c72a9b72fa6ea62ebd499ba3540" exitCode=0 Oct 09 14:05:39 crc kubenswrapper[4902]: I1009 14:05:39.445552 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-9532-account-create-sznbs" event={"ID":"81276c5a-6547-4625-b10e-edb12dc107b9","Type":"ContainerDied","Data":"ee60c0a5698fea52999638b4ce5cd607a71c0c72a9b72fa6ea62ebd499ba3540"} Oct 09 14:05:39 crc kubenswrapper[4902]: I1009 14:05:39.445620 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/placement-9532-account-create-sznbs" event={"ID":"81276c5a-6547-4625-b10e-edb12dc107b9","Type":"ContainerStarted","Data":"7cf45f5a41d4abf8ee835d883d709f0dcc217a00f7d79c22ff8afa62dde430d6"} Oct 09 14:05:39 crc kubenswrapper[4902]: I1009 14:05:39.874442 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-79djm-config-ts5sx"] Oct 09 14:05:40 crc kubenswrapper[4902]: I1009 14:05:40.471154 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"5cdfacc8-b636-448a-bdc9-30b7a851aa8f","Type":"ContainerStarted","Data":"d4e54c02be4739d4a0a45ec73f982c9c598bfcc7dbb11b79b2c4d20783e30929"} Oct 09 14:05:40 crc kubenswrapper[4902]: I1009 14:05:40.471444 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Oct 09 14:05:40 crc kubenswrapper[4902]: I1009 14:05:40.476590 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"c9c6af38-1605-4d47-bc0c-967053235667","Type":"ContainerStarted","Data":"b4d25f2956983a3a7abc8912c25271e17745a4725d4f8eac3fd07ba50e390c39"} Oct 09 14:05:40 crc kubenswrapper[4902]: I1009 14:05:40.476951 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Oct 09 14:05:40 crc kubenswrapper[4902]: I1009 14:05:40.479047 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"fa80b7ed-e420-455b-a918-d474c0453547","Type":"ContainerStarted","Data":"0c54edaeadd1f83cd617bdd503256a7846491dfd2f89898e9894f1162e767a5d"} Oct 09 14:05:40 crc kubenswrapper[4902]: I1009 14:05:40.479154 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"fa80b7ed-e420-455b-a918-d474c0453547","Type":"ContainerStarted","Data":"a1949fbadf368440ce28520e3195365ec68c43fdae3dd7c6907d74fa8ad5bfca"} Oct 09 14:05:40 crc kubenswrapper[4902]: I1009 14:05:40.505385 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=50.861398509 podStartE2EDuration="57.505368986s" podCreationTimestamp="2025-10-09 14:04:43 +0000 UTC" firstStartedPulling="2025-10-09 14:04:55.105628187 +0000 UTC m=+842.303487251" lastFinishedPulling="2025-10-09 14:05:01.749598664 +0000 UTC m=+848.947457728" observedRunningTime="2025-10-09 14:05:40.503902546 +0000 UTC m=+887.701761630" watchObservedRunningTime="2025-10-09 14:05:40.505368986 +0000 UTC m=+887.703228060" Oct 09 14:05:40 crc kubenswrapper[4902]: I1009 14:05:40.530270 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=52.037335205 podStartE2EDuration="57.530218116s" podCreationTimestamp="2025-10-09 14:04:43 +0000 UTC" firstStartedPulling="2025-10-09 14:04:56.702616905 +0000 UTC m=+843.900475969" lastFinishedPulling="2025-10-09 14:05:02.195499816 +0000 UTC m=+849.393358880" observedRunningTime="2025-10-09 14:05:40.525695414 +0000 UTC m=+887.723554478" watchObservedRunningTime="2025-10-09 14:05:40.530218116 +0000 UTC m=+887.728077190" Oct 09 14:05:40 crc kubenswrapper[4902]: W1009 14:05:40.816453 4902 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod462a635a_0dd6_4203_b2b9_9279aa13d40d.slice/crio-c8a8ac929f70ec3dc194bd236646c39b6b74dc23e8dacee7c628527c11412847 WatchSource:0}: Error finding container 
c8a8ac929f70ec3dc194bd236646c39b6b74dc23e8dacee7c628527c11412847: Status 404 returned error can't find the container with id c8a8ac929f70ec3dc194bd236646c39b6b74dc23e8dacee7c628527c11412847 Oct 09 14:05:40 crc kubenswrapper[4902]: I1009 14:05:40.991005 4902 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-07af-account-create-xjlvw" Oct 09 14:05:41 crc kubenswrapper[4902]: I1009 14:05:41.033174 4902 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-9532-account-create-sznbs" Oct 09 14:05:41 crc kubenswrapper[4902]: I1009 14:05:41.139779 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tlnz6\" (UniqueName: \"kubernetes.io/projected/81276c5a-6547-4625-b10e-edb12dc107b9-kube-api-access-tlnz6\") pod \"81276c5a-6547-4625-b10e-edb12dc107b9\" (UID: \"81276c5a-6547-4625-b10e-edb12dc107b9\") " Oct 09 14:05:41 crc kubenswrapper[4902]: I1009 14:05:41.139859 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cbhfd\" (UniqueName: \"kubernetes.io/projected/ed26f964-2a36-43d7-80f5-7d7e60a32e4f-kube-api-access-cbhfd\") pod \"ed26f964-2a36-43d7-80f5-7d7e60a32e4f\" (UID: \"ed26f964-2a36-43d7-80f5-7d7e60a32e4f\") " Oct 09 14:05:41 crc kubenswrapper[4902]: I1009 14:05:41.147047 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed26f964-2a36-43d7-80f5-7d7e60a32e4f-kube-api-access-cbhfd" (OuterVolumeSpecName: "kube-api-access-cbhfd") pod "ed26f964-2a36-43d7-80f5-7d7e60a32e4f" (UID: "ed26f964-2a36-43d7-80f5-7d7e60a32e4f"). InnerVolumeSpecName "kube-api-access-cbhfd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 14:05:41 crc kubenswrapper[4902]: I1009 14:05:41.147103 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81276c5a-6547-4625-b10e-edb12dc107b9-kube-api-access-tlnz6" (OuterVolumeSpecName: "kube-api-access-tlnz6") pod "81276c5a-6547-4625-b10e-edb12dc107b9" (UID: "81276c5a-6547-4625-b10e-edb12dc107b9"). InnerVolumeSpecName "kube-api-access-tlnz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 14:05:41 crc kubenswrapper[4902]: I1009 14:05:41.242172 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tlnz6\" (UniqueName: \"kubernetes.io/projected/81276c5a-6547-4625-b10e-edb12dc107b9-kube-api-access-tlnz6\") on node \"crc\" DevicePath \"\"" Oct 09 14:05:41 crc kubenswrapper[4902]: I1009 14:05:41.242211 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cbhfd\" (UniqueName: \"kubernetes.io/projected/ed26f964-2a36-43d7-80f5-7d7e60a32e4f-kube-api-access-cbhfd\") on node \"crc\" DevicePath \"\"" Oct 09 14:05:41 crc kubenswrapper[4902]: I1009 14:05:41.485958 4902 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-9532-account-create-sznbs" Oct 09 14:05:41 crc kubenswrapper[4902]: I1009 14:05:41.485952 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-9532-account-create-sznbs" event={"ID":"81276c5a-6547-4625-b10e-edb12dc107b9","Type":"ContainerDied","Data":"7cf45f5a41d4abf8ee835d883d709f0dcc217a00f7d79c22ff8afa62dde430d6"} Oct 09 14:05:41 crc kubenswrapper[4902]: I1009 14:05:41.486303 4902 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7cf45f5a41d4abf8ee835d883d709f0dcc217a00f7d79c22ff8afa62dde430d6" Oct 09 14:05:41 crc kubenswrapper[4902]: I1009 14:05:41.486975 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-07af-account-create-xjlvw" event={"ID":"ed26f964-2a36-43d7-80f5-7d7e60a32e4f","Type":"ContainerDied","Data":"416d290a7ec4bb48cb6e3bb0b26144c8511d288a84977f80758c37f8b1ddc14e"} Oct 09 14:05:41 crc kubenswrapper[4902]: I1009 14:05:41.487027 4902 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="416d290a7ec4bb48cb6e3bb0b26144c8511d288a84977f80758c37f8b1ddc14e" Oct 09 14:05:41 crc kubenswrapper[4902]: I1009 14:05:41.487088 4902 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-07af-account-create-xjlvw" Oct 09 14:05:41 crc kubenswrapper[4902]: I1009 14:05:41.491573 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"22ef2931-973d-462a-ae3a-d05056c72468","Type":"ContainerStarted","Data":"c3379f5c27086f372bc50455b4f7e95f0a13b2bddae71ce864140cc470485403"} Oct 09 14:05:41 crc kubenswrapper[4902]: I1009 14:05:41.491882 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Oct 09 14:05:41 crc kubenswrapper[4902]: I1009 14:05:41.493060 4902 generic.go:334] "Generic (PLEG): container finished" podID="462a635a-0dd6-4203-b2b9-9279aa13d40d" containerID="94da1deaed671e28bce767f1e58bd75e41d9ad66cb40063d63651bf975d4bfb3" exitCode=0 Oct 09 14:05:41 crc kubenswrapper[4902]: I1009 14:05:41.493107 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-79djm-config-ts5sx" event={"ID":"462a635a-0dd6-4203-b2b9-9279aa13d40d","Type":"ContainerDied","Data":"94da1deaed671e28bce767f1e58bd75e41d9ad66cb40063d63651bf975d4bfb3"} Oct 09 14:05:41 crc kubenswrapper[4902]: I1009 14:05:41.493147 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-79djm-config-ts5sx" event={"ID":"462a635a-0dd6-4203-b2b9-9279aa13d40d","Type":"ContainerStarted","Data":"c8a8ac929f70ec3dc194bd236646c39b6b74dc23e8dacee7c628527c11412847"} Oct 09 14:05:41 crc kubenswrapper[4902]: I1009 14:05:41.495255 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"fa80b7ed-e420-455b-a918-d474c0453547","Type":"ContainerStarted","Data":"e51f51e8483e7ed92767f4b029e1e4f18d88a4f957f9dd2c955d07be31532c51"} Oct 09 14:05:41 crc kubenswrapper[4902]: I1009 14:05:41.495287 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"fa80b7ed-e420-455b-a918-d474c0453547","Type":"ContainerStarted","Data":"5386975d9a8ac3a75d3bef7ddff0765e6242937727d1e5e34af4dd9f88101235"} Oct 09 14:05:41 crc kubenswrapper[4902]: I1009 14:05:41.519698 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.308969649 podStartE2EDuration="28.519677324s" 
podCreationTimestamp="2025-10-09 14:05:13 +0000 UTC" firstStartedPulling="2025-10-09 14:05:14.682793616 +0000 UTC m=+861.880652680" lastFinishedPulling="2025-10-09 14:05:40.893501281 +0000 UTC m=+888.091360355" observedRunningTime="2025-10-09 14:05:41.513830126 +0000 UTC m=+888.711689210" watchObservedRunningTime="2025-10-09 14:05:41.519677324 +0000 UTC m=+888.717536408" Oct 09 14:05:43 crc kubenswrapper[4902]: I1009 14:05:43.278241 4902 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-79djm-config-ts5sx" Oct 09 14:05:43 crc kubenswrapper[4902]: I1009 14:05:43.376314 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/462a635a-0dd6-4203-b2b9-9279aa13d40d-scripts\") pod \"462a635a-0dd6-4203-b2b9-9279aa13d40d\" (UID: \"462a635a-0dd6-4203-b2b9-9279aa13d40d\") " Oct 09 14:05:43 crc kubenswrapper[4902]: I1009 14:05:43.376473 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/462a635a-0dd6-4203-b2b9-9279aa13d40d-var-run\") pod \"462a635a-0dd6-4203-b2b9-9279aa13d40d\" (UID: \"462a635a-0dd6-4203-b2b9-9279aa13d40d\") " Oct 09 14:05:43 crc kubenswrapper[4902]: I1009 14:05:43.376495 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/462a635a-0dd6-4203-b2b9-9279aa13d40d-var-log-ovn\") pod \"462a635a-0dd6-4203-b2b9-9279aa13d40d\" (UID: \"462a635a-0dd6-4203-b2b9-9279aa13d40d\") " Oct 09 14:05:43 crc kubenswrapper[4902]: I1009 14:05:43.376516 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/462a635a-0dd6-4203-b2b9-9279aa13d40d-var-run-ovn\") pod \"462a635a-0dd6-4203-b2b9-9279aa13d40d\" (UID: \"462a635a-0dd6-4203-b2b9-9279aa13d40d\") " Oct 09 14:05:43 crc kubenswrapper[4902]: I1009 14:05:43.376604 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4lv6m\" (UniqueName: \"kubernetes.io/projected/462a635a-0dd6-4203-b2b9-9279aa13d40d-kube-api-access-4lv6m\") pod \"462a635a-0dd6-4203-b2b9-9279aa13d40d\" (UID: \"462a635a-0dd6-4203-b2b9-9279aa13d40d\") " Oct 09 14:05:43 crc kubenswrapper[4902]: I1009 14:05:43.376647 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/462a635a-0dd6-4203-b2b9-9279aa13d40d-additional-scripts\") pod \"462a635a-0dd6-4203-b2b9-9279aa13d40d\" (UID: \"462a635a-0dd6-4203-b2b9-9279aa13d40d\") " Oct 09 14:05:43 crc kubenswrapper[4902]: I1009 14:05:43.376682 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/462a635a-0dd6-4203-b2b9-9279aa13d40d-var-run" (OuterVolumeSpecName: "var-run") pod "462a635a-0dd6-4203-b2b9-9279aa13d40d" (UID: "462a635a-0dd6-4203-b2b9-9279aa13d40d"). InnerVolumeSpecName "var-run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 09 14:05:43 crc kubenswrapper[4902]: I1009 14:05:43.376977 4902 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/462a635a-0dd6-4203-b2b9-9279aa13d40d-var-run\") on node \"crc\" DevicePath \"\"" Oct 09 14:05:43 crc kubenswrapper[4902]: I1009 14:05:43.377451 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/462a635a-0dd6-4203-b2b9-9279aa13d40d-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "462a635a-0dd6-4203-b2b9-9279aa13d40d" (UID: "462a635a-0dd6-4203-b2b9-9279aa13d40d"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 14:05:43 crc kubenswrapper[4902]: I1009 14:05:43.377484 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/462a635a-0dd6-4203-b2b9-9279aa13d40d-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "462a635a-0dd6-4203-b2b9-9279aa13d40d" (UID: "462a635a-0dd6-4203-b2b9-9279aa13d40d"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 09 14:05:43 crc kubenswrapper[4902]: I1009 14:05:43.377499 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/462a635a-0dd6-4203-b2b9-9279aa13d40d-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "462a635a-0dd6-4203-b2b9-9279aa13d40d" (UID: "462a635a-0dd6-4203-b2b9-9279aa13d40d"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 09 14:05:43 crc kubenswrapper[4902]: I1009 14:05:43.378914 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/462a635a-0dd6-4203-b2b9-9279aa13d40d-scripts" (OuterVolumeSpecName: "scripts") pod "462a635a-0dd6-4203-b2b9-9279aa13d40d" (UID: "462a635a-0dd6-4203-b2b9-9279aa13d40d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 14:05:43 crc kubenswrapper[4902]: I1009 14:05:43.397734 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/462a635a-0dd6-4203-b2b9-9279aa13d40d-kube-api-access-4lv6m" (OuterVolumeSpecName: "kube-api-access-4lv6m") pod "462a635a-0dd6-4203-b2b9-9279aa13d40d" (UID: "462a635a-0dd6-4203-b2b9-9279aa13d40d"). InnerVolumeSpecName "kube-api-access-4lv6m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 14:05:43 crc kubenswrapper[4902]: I1009 14:05:43.413120 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-104a-account-create-zfqcp"] Oct 09 14:05:43 crc kubenswrapper[4902]: E1009 14:05:43.413637 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81276c5a-6547-4625-b10e-edb12dc107b9" containerName="mariadb-account-create" Oct 09 14:05:43 crc kubenswrapper[4902]: I1009 14:05:43.413738 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="81276c5a-6547-4625-b10e-edb12dc107b9" containerName="mariadb-account-create" Oct 09 14:05:43 crc kubenswrapper[4902]: E1009 14:05:43.413824 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="462a635a-0dd6-4203-b2b9-9279aa13d40d" containerName="ovn-config" Oct 09 14:05:43 crc kubenswrapper[4902]: I1009 14:05:43.413893 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="462a635a-0dd6-4203-b2b9-9279aa13d40d" containerName="ovn-config" Oct 09 14:05:43 crc kubenswrapper[4902]: E1009 14:05:43.413962 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed26f964-2a36-43d7-80f5-7d7e60a32e4f" containerName="mariadb-account-create" Oct 09 14:05:43 crc kubenswrapper[4902]: I1009 14:05:43.414036 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed26f964-2a36-43d7-80f5-7d7e60a32e4f" containerName="mariadb-account-create" Oct 09 14:05:43 crc kubenswrapper[4902]: I1009 14:05:43.414313 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed26f964-2a36-43d7-80f5-7d7e60a32e4f" containerName="mariadb-account-create" Oct 09 14:05:43 crc kubenswrapper[4902]: I1009 14:05:43.414446 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="81276c5a-6547-4625-b10e-edb12dc107b9" containerName="mariadb-account-create" Oct 09 14:05:43 crc kubenswrapper[4902]: I1009 14:05:43.414534 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="462a635a-0dd6-4203-b2b9-9279aa13d40d" containerName="ovn-config" Oct 09 14:05:43 crc kubenswrapper[4902]: I1009 14:05:43.415163 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-104a-account-create-zfqcp" Oct 09 14:05:43 crc kubenswrapper[4902]: I1009 14:05:43.417330 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Oct 09 14:05:43 crc kubenswrapper[4902]: I1009 14:05:43.428538 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-104a-account-create-zfqcp"] Oct 09 14:05:43 crc kubenswrapper[4902]: I1009 14:05:43.478159 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mg5lw\" (UniqueName: \"kubernetes.io/projected/48eb08a1-1c69-4180-a5aa-75ceb3fd6f41-kube-api-access-mg5lw\") pod \"glance-104a-account-create-zfqcp\" (UID: \"48eb08a1-1c69-4180-a5aa-75ceb3fd6f41\") " pod="openstack/glance-104a-account-create-zfqcp" Oct 09 14:05:43 crc kubenswrapper[4902]: I1009 14:05:43.478315 4902 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/462a635a-0dd6-4203-b2b9-9279aa13d40d-scripts\") on node \"crc\" DevicePath \"\"" Oct 09 14:05:43 crc kubenswrapper[4902]: I1009 14:05:43.478328 4902 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/462a635a-0dd6-4203-b2b9-9279aa13d40d-var-log-ovn\") on node \"crc\" DevicePath \"\"" Oct 09 14:05:43 crc kubenswrapper[4902]: I1009 14:05:43.478337 4902 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/462a635a-0dd6-4203-b2b9-9279aa13d40d-var-run-ovn\") on node \"crc\" DevicePath \"\"" Oct 09 14:05:43 crc kubenswrapper[4902]: I1009 14:05:43.478348 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4lv6m\" (UniqueName: \"kubernetes.io/projected/462a635a-0dd6-4203-b2b9-9279aa13d40d-kube-api-access-4lv6m\") on node \"crc\" DevicePath \"\"" Oct 09 14:05:43 crc kubenswrapper[4902]: I1009 14:05:43.478357 4902 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/462a635a-0dd6-4203-b2b9-9279aa13d40d-additional-scripts\") on node \"crc\" DevicePath \"\"" Oct 09 14:05:43 crc kubenswrapper[4902]: I1009 14:05:43.511784 4902 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-79djm-config-ts5sx" Oct 09 14:05:43 crc kubenswrapper[4902]: I1009 14:05:43.542872 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-79djm-config-ts5sx" event={"ID":"462a635a-0dd6-4203-b2b9-9279aa13d40d","Type":"ContainerDied","Data":"c8a8ac929f70ec3dc194bd236646c39b6b74dc23e8dacee7c628527c11412847"} Oct 09 14:05:43 crc kubenswrapper[4902]: I1009 14:05:43.542907 4902 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c8a8ac929f70ec3dc194bd236646c39b6b74dc23e8dacee7c628527c11412847" Oct 09 14:05:43 crc kubenswrapper[4902]: I1009 14:05:43.599809 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mg5lw\" (UniqueName: \"kubernetes.io/projected/48eb08a1-1c69-4180-a5aa-75ceb3fd6f41-kube-api-access-mg5lw\") pod \"glance-104a-account-create-zfqcp\" (UID: \"48eb08a1-1c69-4180-a5aa-75ceb3fd6f41\") " pod="openstack/glance-104a-account-create-zfqcp" Oct 09 14:05:43 crc kubenswrapper[4902]: I1009 14:05:43.616202 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mg5lw\" (UniqueName: \"kubernetes.io/projected/48eb08a1-1c69-4180-a5aa-75ceb3fd6f41-kube-api-access-mg5lw\") pod \"glance-104a-account-create-zfqcp\" (UID: \"48eb08a1-1c69-4180-a5aa-75ceb3fd6f41\") " pod="openstack/glance-104a-account-create-zfqcp" Oct 09 14:05:43 crc kubenswrapper[4902]: I1009 14:05:43.730188 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-104a-account-create-zfqcp" Oct 09 14:05:43 crc kubenswrapper[4902]: I1009 14:05:43.820815 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-79djm" Oct 09 14:05:44 crc kubenswrapper[4902]: I1009 14:05:44.094149 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-104a-account-create-zfqcp"] Oct 09 14:05:44 crc kubenswrapper[4902]: W1009 14:05:44.276377 4902 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod48eb08a1_1c69_4180_a5aa_75ceb3fd6f41.slice/crio-4e2fd54c9d26c4cbeab538a6f1efba393f11fc866c1ee87d054edcd1b8ee3034 WatchSource:0}: Error finding container 4e2fd54c9d26c4cbeab538a6f1efba393f11fc866c1ee87d054edcd1b8ee3034: Status 404 returned error can't find the container with id 4e2fd54c9d26c4cbeab538a6f1efba393f11fc866c1ee87d054edcd1b8ee3034 Oct 09 14:05:44 crc kubenswrapper[4902]: I1009 14:05:44.424336 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-79djm-config-ts5sx"] Oct 09 14:05:44 crc kubenswrapper[4902]: I1009 14:05:44.429526 4902 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-79djm-config-ts5sx"] Oct 09 14:05:44 crc kubenswrapper[4902]: I1009 14:05:44.521404 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-104a-account-create-zfqcp" event={"ID":"48eb08a1-1c69-4180-a5aa-75ceb3fd6f41","Type":"ContainerStarted","Data":"4e2fd54c9d26c4cbeab538a6f1efba393f11fc866c1ee87d054edcd1b8ee3034"} Oct 09 14:05:45 crc kubenswrapper[4902]: I1009 14:05:45.526323 4902 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="462a635a-0dd6-4203-b2b9-9279aa13d40d" path="/var/lib/kubelet/pods/462a635a-0dd6-4203-b2b9-9279aa13d40d/volumes" Oct 09 14:05:45 crc kubenswrapper[4902]: I1009 14:05:45.535168 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"fa80b7ed-e420-455b-a918-d474c0453547","Type":"ContainerStarted","Data":"40f0a737de12b26394bbb88a53777223c9ca6392be27d0273b010491944d17b0"} Oct 09 14:05:45 crc kubenswrapper[4902]: I1009 14:05:45.535238 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"fa80b7ed-e420-455b-a918-d474c0453547","Type":"ContainerStarted","Data":"37f5988f944521498052bac449a25b7af48a04ff04cc269696f3638dbe45f28a"} Oct 09 14:05:45 crc kubenswrapper[4902]: I1009 14:05:45.535254 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"fa80b7ed-e420-455b-a918-d474c0453547","Type":"ContainerStarted","Data":"754e649fcdbeae1de7dd52ed1e21506d1e1f8a1b16052004ce3b043daae3a493"} Oct 09 14:05:45 crc kubenswrapper[4902]: I1009 14:05:45.535265 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"fa80b7ed-e420-455b-a918-d474c0453547","Type":"ContainerStarted","Data":"63c55b7393fda7e572dca2ed637261566583b65de61d8e53f641a47031582be8"} Oct 09 14:05:45 crc kubenswrapper[4902]: I1009 14:05:45.536876 4902 generic.go:334] "Generic (PLEG): container finished" podID="48eb08a1-1c69-4180-a5aa-75ceb3fd6f41" containerID="03bbc2169c8034cc6260c16e440564a0e39c9784dc7d6f6e9a8ba63d7f106c78" exitCode=0 Oct 09 14:05:45 crc kubenswrapper[4902]: I1009 14:05:45.536918 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-104a-account-create-zfqcp" event={"ID":"48eb08a1-1c69-4180-a5aa-75ceb3fd6f41","Type":"ContainerDied","Data":"03bbc2169c8034cc6260c16e440564a0e39c9784dc7d6f6e9a8ba63d7f106c78"} Oct 09 14:05:46 crc kubenswrapper[4902]: I1009 14:05:46.962390 4902 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-104a-account-create-zfqcp" Oct 09 14:05:47 crc kubenswrapper[4902]: I1009 14:05:47.065305 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5lw\" (UniqueName: \"kubernetes.io/projected/48eb08a1-1c69-4180-a5aa-75ceb3fd6f41-kube-api-access-mg5lw\") pod \"48eb08a1-1c69-4180-a5aa-75ceb3fd6f41\" (UID: \"48eb08a1-1c69-4180-a5aa-75ceb3fd6f41\") " Oct 09 14:05:47 crc kubenswrapper[4902]: I1009 14:05:47.075670 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48eb08a1-1c69-4180-a5aa-75ceb3fd6f41-kube-api-access-mg5lw" (OuterVolumeSpecName: "kube-api-access-mg5lw") pod "48eb08a1-1c69-4180-a5aa-75ceb3fd6f41" (UID: "48eb08a1-1c69-4180-a5aa-75ceb3fd6f41"). InnerVolumeSpecName "kube-api-access-mg5lw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 14:05:47 crc kubenswrapper[4902]: I1009 14:05:47.168017 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5lw\" (UniqueName: \"kubernetes.io/projected/48eb08a1-1c69-4180-a5aa-75ceb3fd6f41-kube-api-access-mg5lw\") on node \"crc\" DevicePath \"\"" Oct 09 14:05:47 crc kubenswrapper[4902]: I1009 14:05:47.646626 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-104a-account-create-zfqcp" event={"ID":"48eb08a1-1c69-4180-a5aa-75ceb3fd6f41","Type":"ContainerDied","Data":"4e2fd54c9d26c4cbeab538a6f1efba393f11fc866c1ee87d054edcd1b8ee3034"} Oct 09 14:05:47 crc kubenswrapper[4902]: I1009 14:05:47.646669 4902 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4e2fd54c9d26c4cbeab538a6f1efba393f11fc866c1ee87d054edcd1b8ee3034" Oct 09 14:05:47 crc kubenswrapper[4902]: I1009 14:05:47.646730 4902 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-104a-account-create-zfqcp" Oct 09 14:05:47 crc kubenswrapper[4902]: I1009 14:05:47.651366 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"fa80b7ed-e420-455b-a918-d474c0453547","Type":"ContainerStarted","Data":"025de8a4637dd678bb7c11e0e2fa591d14be6040db50d10b6e8d9ab5cf173edc"} Oct 09 14:05:47 crc kubenswrapper[4902]: I1009 14:05:47.651404 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"fa80b7ed-e420-455b-a918-d474c0453547","Type":"ContainerStarted","Data":"4306362635d5268de18423d0c3af236b10f0028e2fb4cf5040ff226a1535fb51"} Oct 09 14:05:47 crc kubenswrapper[4902]: I1009 14:05:47.651435 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"fa80b7ed-e420-455b-a918-d474c0453547","Type":"ContainerStarted","Data":"8a9aa029a88bb974049cf0fbbc09a29f70c37b07a4264cccd9d1acec127b04f2"} Oct 09 14:05:47 crc kubenswrapper[4902]: I1009 14:05:47.651446 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"fa80b7ed-e420-455b-a918-d474c0453547","Type":"ContainerStarted","Data":"41d591d623312c2feaf55136a62c0b8bc03506beafb400490329fa3eff0860b6"} Oct 09 14:05:48 crc kubenswrapper[4902]: I1009 14:05:48.555341 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-9x8lm"] Oct 09 14:05:48 crc kubenswrapper[4902]: E1009 14:05:48.555957 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48eb08a1-1c69-4180-a5aa-75ceb3fd6f41" containerName="mariadb-account-create" Oct 09 14:05:48 crc kubenswrapper[4902]: I1009 14:05:48.555974 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="48eb08a1-1c69-4180-a5aa-75ceb3fd6f41" containerName="mariadb-account-create" Oct 09 14:05:48 crc kubenswrapper[4902]: I1009 14:05:48.556223 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="48eb08a1-1c69-4180-a5aa-75ceb3fd6f41" containerName="mariadb-account-create" Oct 09 14:05:48 crc kubenswrapper[4902]: I1009 14:05:48.556927 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-9x8lm" Oct 09 14:05:48 crc kubenswrapper[4902]: I1009 14:05:48.565313 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Oct 09 14:05:48 crc kubenswrapper[4902]: I1009 14:05:48.565469 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-8szfj" Oct 09 14:05:48 crc kubenswrapper[4902]: I1009 14:05:48.581150 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-9x8lm"] Oct 09 14:05:48 crc kubenswrapper[4902]: I1009 14:05:48.664587 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"fa80b7ed-e420-455b-a918-d474c0453547","Type":"ContainerStarted","Data":"9a2c36686a795b1b62b5a8a7d46e601be4eb1a7960994126b631132b4ceb35fe"} Oct 09 14:05:48 crc kubenswrapper[4902]: I1009 14:05:48.664632 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"fa80b7ed-e420-455b-a918-d474c0453547","Type":"ContainerStarted","Data":"c3229adf6a840ab64e1a901f7968bfbac9ff7c0d7feafba8cbfa4083153ba5cc"} Oct 09 14:05:48 crc kubenswrapper[4902]: I1009 14:05:48.715977 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15cef403-03b5-432b-a7b4-c12a613e4d47-config-data\") pod \"glance-db-sync-9x8lm\" (UID: \"15cef403-03b5-432b-a7b4-c12a613e4d47\") " pod="openstack/glance-db-sync-9x8lm" Oct 09 14:05:48 crc kubenswrapper[4902]: I1009 14:05:48.716107 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rbtwm\" (UniqueName: \"kubernetes.io/projected/15cef403-03b5-432b-a7b4-c12a613e4d47-kube-api-access-rbtwm\") pod \"glance-db-sync-9x8lm\" (UID: \"15cef403-03b5-432b-a7b4-c12a613e4d47\") " pod="openstack/glance-db-sync-9x8lm" Oct 09 14:05:48 crc kubenswrapper[4902]: I1009 14:05:48.716166 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/15cef403-03b5-432b-a7b4-c12a613e4d47-db-sync-config-data\") pod \"glance-db-sync-9x8lm\" (UID: \"15cef403-03b5-432b-a7b4-c12a613e4d47\") " pod="openstack/glance-db-sync-9x8lm" Oct 09 14:05:48 crc kubenswrapper[4902]: I1009 14:05:48.716232 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15cef403-03b5-432b-a7b4-c12a613e4d47-combined-ca-bundle\") pod \"glance-db-sync-9x8lm\" (UID: \"15cef403-03b5-432b-a7b4-c12a613e4d47\") " pod="openstack/glance-db-sync-9x8lm" Oct 09 14:05:48 crc kubenswrapper[4902]: I1009 14:05:48.818092 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15cef403-03b5-432b-a7b4-c12a613e4d47-config-data\") pod \"glance-db-sync-9x8lm\" (UID: \"15cef403-03b5-432b-a7b4-c12a613e4d47\") " pod="openstack/glance-db-sync-9x8lm" Oct 09 14:05:48 crc kubenswrapper[4902]: I1009 14:05:48.818430 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rbtwm\" (UniqueName: \"kubernetes.io/projected/15cef403-03b5-432b-a7b4-c12a613e4d47-kube-api-access-rbtwm\") pod \"glance-db-sync-9x8lm\" (UID: \"15cef403-03b5-432b-a7b4-c12a613e4d47\") " pod="openstack/glance-db-sync-9x8lm" Oct 09 14:05:48 crc kubenswrapper[4902]: I1009 
14:05:48.818467 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/15cef403-03b5-432b-a7b4-c12a613e4d47-db-sync-config-data\") pod \"glance-db-sync-9x8lm\" (UID: \"15cef403-03b5-432b-a7b4-c12a613e4d47\") " pod="openstack/glance-db-sync-9x8lm" Oct 09 14:05:48 crc kubenswrapper[4902]: I1009 14:05:48.818515 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15cef403-03b5-432b-a7b4-c12a613e4d47-combined-ca-bundle\") pod \"glance-db-sync-9x8lm\" (UID: \"15cef403-03b5-432b-a7b4-c12a613e4d47\") " pod="openstack/glance-db-sync-9x8lm" Oct 09 14:05:48 crc kubenswrapper[4902]: I1009 14:05:48.824826 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15cef403-03b5-432b-a7b4-c12a613e4d47-config-data\") pod \"glance-db-sync-9x8lm\" (UID: \"15cef403-03b5-432b-a7b4-c12a613e4d47\") " pod="openstack/glance-db-sync-9x8lm" Oct 09 14:05:48 crc kubenswrapper[4902]: I1009 14:05:48.829196 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/15cef403-03b5-432b-a7b4-c12a613e4d47-db-sync-config-data\") pod \"glance-db-sync-9x8lm\" (UID: \"15cef403-03b5-432b-a7b4-c12a613e4d47\") " pod="openstack/glance-db-sync-9x8lm" Oct 09 14:05:48 crc kubenswrapper[4902]: I1009 14:05:48.837781 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15cef403-03b5-432b-a7b4-c12a613e4d47-combined-ca-bundle\") pod \"glance-db-sync-9x8lm\" (UID: \"15cef403-03b5-432b-a7b4-c12a613e4d47\") " pod="openstack/glance-db-sync-9x8lm" Oct 09 14:05:48 crc kubenswrapper[4902]: I1009 14:05:48.841930 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rbtwm\" (UniqueName: \"kubernetes.io/projected/15cef403-03b5-432b-a7b4-c12a613e4d47-kube-api-access-rbtwm\") pod \"glance-db-sync-9x8lm\" (UID: \"15cef403-03b5-432b-a7b4-c12a613e4d47\") " pod="openstack/glance-db-sync-9x8lm" Oct 09 14:05:48 crc kubenswrapper[4902]: I1009 14:05:48.875220 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-9x8lm" Oct 09 14:05:49 crc kubenswrapper[4902]: I1009 14:05:49.445924 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-9x8lm"] Oct 09 14:05:49 crc kubenswrapper[4902]: I1009 14:05:49.673522 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-9x8lm" event={"ID":"15cef403-03b5-432b-a7b4-c12a613e4d47","Type":"ContainerStarted","Data":"5e5a12e91dcd5529d2d24af7fbbdfd92fe9728762de8ec8c08265930d301a94e"} Oct 09 14:05:49 crc kubenswrapper[4902]: I1009 14:05:49.679526 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"fa80b7ed-e420-455b-a918-d474c0453547","Type":"ContainerStarted","Data":"aada12ae87e037a363890280fd0e78220d4d6004af2a63f95ea7368bd239b25e"} Oct 09 14:05:49 crc kubenswrapper[4902]: I1009 14:05:49.726054 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=20.697895135 podStartE2EDuration="29.726039096s" podCreationTimestamp="2025-10-09 14:05:20 +0000 UTC" firstStartedPulling="2025-10-09 14:05:37.748534154 +0000 UTC m=+884.946393218" lastFinishedPulling="2025-10-09 14:05:46.776678115 +0000 UTC m=+893.974537179" observedRunningTime="2025-10-09 14:05:49.7235806 +0000 UTC m=+896.921439664" watchObservedRunningTime="2025-10-09 14:05:49.726039096 +0000 UTC m=+896.923898160" Oct 09 14:05:49 crc kubenswrapper[4902]: I1009 14:05:49.977281 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-wtqhb"] Oct 09 14:05:49 crc kubenswrapper[4902]: I1009 14:05:49.978970 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77585f5f8c-wtqhb" Oct 09 14:05:49 crc kubenswrapper[4902]: I1009 14:05:49.981932 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Oct 09 14:05:49 crc kubenswrapper[4902]: I1009 14:05:49.987045 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-wtqhb"] Oct 09 14:05:50 crc kubenswrapper[4902]: I1009 14:05:50.078655 4902 patch_prober.go:28] interesting pod/machine-config-daemon-gbt7s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 14:05:50 crc kubenswrapper[4902]: I1009 14:05:50.079039 4902 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" podUID="6cfbac91-e798-4e5e-9f3c-f454ea6f457e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 14:05:50 crc kubenswrapper[4902]: I1009 14:05:50.138084 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/120030af-2560-4346-b479-fbecbad7c23e-config\") pod \"dnsmasq-dns-77585f5f8c-wtqhb\" (UID: \"120030af-2560-4346-b479-fbecbad7c23e\") " pod="openstack/dnsmasq-dns-77585f5f8c-wtqhb" Oct 09 14:05:50 crc kubenswrapper[4902]: I1009 14:05:50.138145 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/120030af-2560-4346-b479-fbecbad7c23e-dns-swift-storage-0\") pod 
\"dnsmasq-dns-77585f5f8c-wtqhb\" (UID: \"120030af-2560-4346-b479-fbecbad7c23e\") " pod="openstack/dnsmasq-dns-77585f5f8c-wtqhb" Oct 09 14:05:50 crc kubenswrapper[4902]: I1009 14:05:50.138184 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/120030af-2560-4346-b479-fbecbad7c23e-ovsdbserver-nb\") pod \"dnsmasq-dns-77585f5f8c-wtqhb\" (UID: \"120030af-2560-4346-b479-fbecbad7c23e\") " pod="openstack/dnsmasq-dns-77585f5f8c-wtqhb" Oct 09 14:05:50 crc kubenswrapper[4902]: I1009 14:05:50.138622 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/120030af-2560-4346-b479-fbecbad7c23e-ovsdbserver-sb\") pod \"dnsmasq-dns-77585f5f8c-wtqhb\" (UID: \"120030af-2560-4346-b479-fbecbad7c23e\") " pod="openstack/dnsmasq-dns-77585f5f8c-wtqhb" Oct 09 14:05:50 crc kubenswrapper[4902]: I1009 14:05:50.138815 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/120030af-2560-4346-b479-fbecbad7c23e-dns-svc\") pod \"dnsmasq-dns-77585f5f8c-wtqhb\" (UID: \"120030af-2560-4346-b479-fbecbad7c23e\") " pod="openstack/dnsmasq-dns-77585f5f8c-wtqhb" Oct 09 14:05:50 crc kubenswrapper[4902]: I1009 14:05:50.138854 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8rhxv\" (UniqueName: \"kubernetes.io/projected/120030af-2560-4346-b479-fbecbad7c23e-kube-api-access-8rhxv\") pod \"dnsmasq-dns-77585f5f8c-wtqhb\" (UID: \"120030af-2560-4346-b479-fbecbad7c23e\") " pod="openstack/dnsmasq-dns-77585f5f8c-wtqhb" Oct 09 14:05:50 crc kubenswrapper[4902]: I1009 14:05:50.240172 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/120030af-2560-4346-b479-fbecbad7c23e-dns-svc\") pod \"dnsmasq-dns-77585f5f8c-wtqhb\" (UID: \"120030af-2560-4346-b479-fbecbad7c23e\") " pod="openstack/dnsmasq-dns-77585f5f8c-wtqhb" Oct 09 14:05:50 crc kubenswrapper[4902]: I1009 14:05:50.240228 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8rhxv\" (UniqueName: \"kubernetes.io/projected/120030af-2560-4346-b479-fbecbad7c23e-kube-api-access-8rhxv\") pod \"dnsmasq-dns-77585f5f8c-wtqhb\" (UID: \"120030af-2560-4346-b479-fbecbad7c23e\") " pod="openstack/dnsmasq-dns-77585f5f8c-wtqhb" Oct 09 14:05:50 crc kubenswrapper[4902]: I1009 14:05:50.240285 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/120030af-2560-4346-b479-fbecbad7c23e-config\") pod \"dnsmasq-dns-77585f5f8c-wtqhb\" (UID: \"120030af-2560-4346-b479-fbecbad7c23e\") " pod="openstack/dnsmasq-dns-77585f5f8c-wtqhb" Oct 09 14:05:50 crc kubenswrapper[4902]: I1009 14:05:50.240311 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/120030af-2560-4346-b479-fbecbad7c23e-dns-swift-storage-0\") pod \"dnsmasq-dns-77585f5f8c-wtqhb\" (UID: \"120030af-2560-4346-b479-fbecbad7c23e\") " pod="openstack/dnsmasq-dns-77585f5f8c-wtqhb" Oct 09 14:05:50 crc kubenswrapper[4902]: I1009 14:05:50.240340 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/120030af-2560-4346-b479-fbecbad7c23e-ovsdbserver-nb\") pod \"dnsmasq-dns-77585f5f8c-wtqhb\" (UID: \"120030af-2560-4346-b479-fbecbad7c23e\") " pod="openstack/dnsmasq-dns-77585f5f8c-wtqhb" Oct 09 14:05:50 crc kubenswrapper[4902]: I1009 14:05:50.240387 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/120030af-2560-4346-b479-fbecbad7c23e-ovsdbserver-sb\") pod \"dnsmasq-dns-77585f5f8c-wtqhb\" (UID: \"120030af-2560-4346-b479-fbecbad7c23e\") " pod="openstack/dnsmasq-dns-77585f5f8c-wtqhb" Oct 09 14:05:50 crc kubenswrapper[4902]: I1009 14:05:50.241264 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/120030af-2560-4346-b479-fbecbad7c23e-ovsdbserver-sb\") pod \"dnsmasq-dns-77585f5f8c-wtqhb\" (UID: \"120030af-2560-4346-b479-fbecbad7c23e\") " pod="openstack/dnsmasq-dns-77585f5f8c-wtqhb" Oct 09 14:05:50 crc kubenswrapper[4902]: I1009 14:05:50.241854 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/120030af-2560-4346-b479-fbecbad7c23e-dns-svc\") pod \"dnsmasq-dns-77585f5f8c-wtqhb\" (UID: \"120030af-2560-4346-b479-fbecbad7c23e\") " pod="openstack/dnsmasq-dns-77585f5f8c-wtqhb" Oct 09 14:05:50 crc kubenswrapper[4902]: I1009 14:05:50.242287 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/120030af-2560-4346-b479-fbecbad7c23e-dns-swift-storage-0\") pod \"dnsmasq-dns-77585f5f8c-wtqhb\" (UID: \"120030af-2560-4346-b479-fbecbad7c23e\") " pod="openstack/dnsmasq-dns-77585f5f8c-wtqhb" Oct 09 14:05:50 crc kubenswrapper[4902]: I1009 14:05:50.242429 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/120030af-2560-4346-b479-fbecbad7c23e-config\") pod \"dnsmasq-dns-77585f5f8c-wtqhb\" (UID: \"120030af-2560-4346-b479-fbecbad7c23e\") " pod="openstack/dnsmasq-dns-77585f5f8c-wtqhb" Oct 09 14:05:50 crc kubenswrapper[4902]: I1009 14:05:50.245527 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/120030af-2560-4346-b479-fbecbad7c23e-ovsdbserver-nb\") pod \"dnsmasq-dns-77585f5f8c-wtqhb\" (UID: \"120030af-2560-4346-b479-fbecbad7c23e\") " pod="openstack/dnsmasq-dns-77585f5f8c-wtqhb" Oct 09 14:05:50 crc kubenswrapper[4902]: I1009 14:05:50.263055 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8rhxv\" (UniqueName: \"kubernetes.io/projected/120030af-2560-4346-b479-fbecbad7c23e-kube-api-access-8rhxv\") pod \"dnsmasq-dns-77585f5f8c-wtqhb\" (UID: \"120030af-2560-4346-b479-fbecbad7c23e\") " pod="openstack/dnsmasq-dns-77585f5f8c-wtqhb" Oct 09 14:05:50 crc kubenswrapper[4902]: I1009 14:05:50.296119 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-77585f5f8c-wtqhb" Oct 09 14:05:50 crc kubenswrapper[4902]: I1009 14:05:50.739118 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-wtqhb"] Oct 09 14:05:50 crc kubenswrapper[4902]: W1009 14:05:50.759498 4902 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod120030af_2560_4346_b479_fbecbad7c23e.slice/crio-24ed8c3e6528ff836cc0dc6bc94c078c71a21ce577e67027846621788d19a2c6 WatchSource:0}: Error finding container 24ed8c3e6528ff836cc0dc6bc94c078c71a21ce577e67027846621788d19a2c6: Status 404 returned error can't find the container with id 24ed8c3e6528ff836cc0dc6bc94c078c71a21ce577e67027846621788d19a2c6 Oct 09 14:05:51 crc kubenswrapper[4902]: I1009 14:05:51.698118 4902 generic.go:334] "Generic (PLEG): container finished" podID="120030af-2560-4346-b479-fbecbad7c23e" containerID="cc14b467e64783e2792c4a46e15f04af63c08c2a482f554958ad79d0cc4f5bfb" exitCode=0 Oct 09 14:05:51 crc kubenswrapper[4902]: I1009 14:05:51.698650 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77585f5f8c-wtqhb" event={"ID":"120030af-2560-4346-b479-fbecbad7c23e","Type":"ContainerDied","Data":"cc14b467e64783e2792c4a46e15f04af63c08c2a482f554958ad79d0cc4f5bfb"} Oct 09 14:05:51 crc kubenswrapper[4902]: I1009 14:05:51.699440 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77585f5f8c-wtqhb" event={"ID":"120030af-2560-4346-b479-fbecbad7c23e","Type":"ContainerStarted","Data":"24ed8c3e6528ff836cc0dc6bc94c078c71a21ce577e67027846621788d19a2c6"} Oct 09 14:05:52 crc kubenswrapper[4902]: I1009 14:05:52.710499 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77585f5f8c-wtqhb" event={"ID":"120030af-2560-4346-b479-fbecbad7c23e","Type":"ContainerStarted","Data":"c754d8e263589a40ba5c58c0a76717ee00ec54a76618684145984a8f063cbfa1"} Oct 09 14:05:52 crc kubenswrapper[4902]: I1009 14:05:52.710739 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-77585f5f8c-wtqhb" Oct 09 14:05:52 crc kubenswrapper[4902]: I1009 14:05:52.733130 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-77585f5f8c-wtqhb" podStartSLOduration=3.733113045 podStartE2EDuration="3.733113045s" podCreationTimestamp="2025-10-09 14:05:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 14:05:52.730518045 +0000 UTC m=+899.928377119" watchObservedRunningTime="2025-10-09 14:05:52.733113045 +0000 UTC m=+899.930972099" Oct 09 14:05:54 crc kubenswrapper[4902]: I1009 14:05:54.210838 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Oct 09 14:05:54 crc kubenswrapper[4902]: I1009 14:05:54.613602 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Oct 09 14:05:54 crc kubenswrapper[4902]: I1009 14:05:54.944572 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Oct 09 14:05:56 crc kubenswrapper[4902]: I1009 14:05:56.522338 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-qk2p7"] Oct 09 14:05:56 crc kubenswrapper[4902]: I1009 14:05:56.524967 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-qk2p7" Oct 09 14:05:56 crc kubenswrapper[4902]: I1009 14:05:56.543111 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-qk2p7"] Oct 09 14:05:56 crc kubenswrapper[4902]: I1009 14:05:56.636202 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-6rs8p"] Oct 09 14:05:56 crc kubenswrapper[4902]: I1009 14:05:56.637248 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-6rs8p" Oct 09 14:05:56 crc kubenswrapper[4902]: I1009 14:05:56.652091 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-6rs8p"] Oct 09 14:05:56 crc kubenswrapper[4902]: I1009 14:05:56.667155 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p5524\" (UniqueName: \"kubernetes.io/projected/ca2b155e-e715-4cbd-8d0e-41944380c158-kube-api-access-p5524\") pod \"cinder-db-create-qk2p7\" (UID: \"ca2b155e-e715-4cbd-8d0e-41944380c158\") " pod="openstack/cinder-db-create-qk2p7" Oct 09 14:05:56 crc kubenswrapper[4902]: I1009 14:05:56.768228 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p5524\" (UniqueName: \"kubernetes.io/projected/ca2b155e-e715-4cbd-8d0e-41944380c158-kube-api-access-p5524\") pod \"cinder-db-create-qk2p7\" (UID: \"ca2b155e-e715-4cbd-8d0e-41944380c158\") " pod="openstack/cinder-db-create-qk2p7" Oct 09 14:05:56 crc kubenswrapper[4902]: I1009 14:05:56.768372 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8qs6w\" (UniqueName: \"kubernetes.io/projected/acda5664-02c5-48ec-92a8-b7f2274d34a7-kube-api-access-8qs6w\") pod \"barbican-db-create-6rs8p\" (UID: \"acda5664-02c5-48ec-92a8-b7f2274d34a7\") " pod="openstack/barbican-db-create-6rs8p" Oct 09 14:05:56 crc kubenswrapper[4902]: I1009 14:05:56.799852 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p5524\" (UniqueName: \"kubernetes.io/projected/ca2b155e-e715-4cbd-8d0e-41944380c158-kube-api-access-p5524\") pod \"cinder-db-create-qk2p7\" (UID: \"ca2b155e-e715-4cbd-8d0e-41944380c158\") " pod="openstack/cinder-db-create-qk2p7" Oct 09 14:05:56 crc kubenswrapper[4902]: I1009 14:05:56.833541 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-9q89d"] Oct 09 14:05:56 crc kubenswrapper[4902]: I1009 14:05:56.836700 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-9q89d" Oct 09 14:05:56 crc kubenswrapper[4902]: I1009 14:05:56.847667 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-qk2p7" Oct 09 14:05:56 crc kubenswrapper[4902]: I1009 14:05:56.851374 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-9q89d"] Oct 09 14:05:56 crc kubenswrapper[4902]: I1009 14:05:56.869555 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8qs6w\" (UniqueName: \"kubernetes.io/projected/acda5664-02c5-48ec-92a8-b7f2274d34a7-kube-api-access-8qs6w\") pod \"barbican-db-create-6rs8p\" (UID: \"acda5664-02c5-48ec-92a8-b7f2274d34a7\") " pod="openstack/barbican-db-create-6rs8p" Oct 09 14:05:56 crc kubenswrapper[4902]: I1009 14:05:56.893364 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8qs6w\" (UniqueName: \"kubernetes.io/projected/acda5664-02c5-48ec-92a8-b7f2274d34a7-kube-api-access-8qs6w\") pod \"barbican-db-create-6rs8p\" (UID: \"acda5664-02c5-48ec-92a8-b7f2274d34a7\") " pod="openstack/barbican-db-create-6rs8p" Oct 09 14:05:56 crc kubenswrapper[4902]: I1009 14:05:56.955769 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-6rs8p" Oct 09 14:05:56 crc kubenswrapper[4902]: I1009 14:05:56.971735 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l2dvr\" (UniqueName: \"kubernetes.io/projected/947813b1-569a-485d-80de-98259162031c-kube-api-access-l2dvr\") pod \"neutron-db-create-9q89d\" (UID: \"947813b1-569a-485d-80de-98259162031c\") " pod="openstack/neutron-db-create-9q89d" Oct 09 14:05:57 crc kubenswrapper[4902]: I1009 14:05:57.073295 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l2dvr\" (UniqueName: \"kubernetes.io/projected/947813b1-569a-485d-80de-98259162031c-kube-api-access-l2dvr\") pod \"neutron-db-create-9q89d\" (UID: \"947813b1-569a-485d-80de-98259162031c\") " pod="openstack/neutron-db-create-9q89d" Oct 09 14:05:57 crc kubenswrapper[4902]: I1009 14:05:57.078212 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-xtkdj"] Oct 09 14:05:57 crc kubenswrapper[4902]: I1009 14:05:57.080190 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-xtkdj" Oct 09 14:05:57 crc kubenswrapper[4902]: I1009 14:05:57.087069 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 09 14:05:57 crc kubenswrapper[4902]: I1009 14:05:57.087185 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 09 14:05:57 crc kubenswrapper[4902]: I1009 14:05:57.087227 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-ws99n" Oct 09 14:05:57 crc kubenswrapper[4902]: I1009 14:05:57.087614 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 09 14:05:57 crc kubenswrapper[4902]: I1009 14:05:57.093567 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-xtkdj"] Oct 09 14:05:57 crc kubenswrapper[4902]: I1009 14:05:57.100873 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l2dvr\" (UniqueName: \"kubernetes.io/projected/947813b1-569a-485d-80de-98259162031c-kube-api-access-l2dvr\") pod \"neutron-db-create-9q89d\" (UID: \"947813b1-569a-485d-80de-98259162031c\") " pod="openstack/neutron-db-create-9q89d" Oct 09 14:05:57 crc kubenswrapper[4902]: I1009 14:05:57.166652 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-9q89d" Oct 09 14:05:57 crc kubenswrapper[4902]: I1009 14:05:57.175016 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t2btn\" (UniqueName: \"kubernetes.io/projected/74606141-205f-4e63-9be4-1c7c57fecf3b-kube-api-access-t2btn\") pod \"keystone-db-sync-xtkdj\" (UID: \"74606141-205f-4e63-9be4-1c7c57fecf3b\") " pod="openstack/keystone-db-sync-xtkdj" Oct 09 14:05:57 crc kubenswrapper[4902]: I1009 14:05:57.175066 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74606141-205f-4e63-9be4-1c7c57fecf3b-config-data\") pod \"keystone-db-sync-xtkdj\" (UID: \"74606141-205f-4e63-9be4-1c7c57fecf3b\") " pod="openstack/keystone-db-sync-xtkdj" Oct 09 14:05:57 crc kubenswrapper[4902]: I1009 14:05:57.175085 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74606141-205f-4e63-9be4-1c7c57fecf3b-combined-ca-bundle\") pod \"keystone-db-sync-xtkdj\" (UID: \"74606141-205f-4e63-9be4-1c7c57fecf3b\") " pod="openstack/keystone-db-sync-xtkdj" Oct 09 14:05:57 crc kubenswrapper[4902]: I1009 14:05:57.276987 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t2btn\" (UniqueName: \"kubernetes.io/projected/74606141-205f-4e63-9be4-1c7c57fecf3b-kube-api-access-t2btn\") pod \"keystone-db-sync-xtkdj\" (UID: \"74606141-205f-4e63-9be4-1c7c57fecf3b\") " pod="openstack/keystone-db-sync-xtkdj" Oct 09 14:05:57 crc kubenswrapper[4902]: I1009 14:05:57.277055 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74606141-205f-4e63-9be4-1c7c57fecf3b-config-data\") pod \"keystone-db-sync-xtkdj\" (UID: \"74606141-205f-4e63-9be4-1c7c57fecf3b\") " pod="openstack/keystone-db-sync-xtkdj" Oct 09 14:05:57 crc kubenswrapper[4902]: I1009 14:05:57.277085 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74606141-205f-4e63-9be4-1c7c57fecf3b-combined-ca-bundle\") pod \"keystone-db-sync-xtkdj\" (UID: \"74606141-205f-4e63-9be4-1c7c57fecf3b\") " pod="openstack/keystone-db-sync-xtkdj" Oct 09 14:05:57 crc kubenswrapper[4902]: I1009 14:05:57.285015 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74606141-205f-4e63-9be4-1c7c57fecf3b-config-data\") pod \"keystone-db-sync-xtkdj\" (UID: \"74606141-205f-4e63-9be4-1c7c57fecf3b\") " pod="openstack/keystone-db-sync-xtkdj" Oct 09 14:05:57 crc kubenswrapper[4902]: I1009 14:05:57.301652 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t2btn\" (UniqueName: \"kubernetes.io/projected/74606141-205f-4e63-9be4-1c7c57fecf3b-kube-api-access-t2btn\") pod \"keystone-db-sync-xtkdj\" (UID: \"74606141-205f-4e63-9be4-1c7c57fecf3b\") " pod="openstack/keystone-db-sync-xtkdj" Oct 09 14:05:57 crc kubenswrapper[4902]: I1009 14:05:57.302756 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74606141-205f-4e63-9be4-1c7c57fecf3b-combined-ca-bundle\") pod \"keystone-db-sync-xtkdj\" (UID: \"74606141-205f-4e63-9be4-1c7c57fecf3b\") " pod="openstack/keystone-db-sync-xtkdj" Oct 09 14:05:57 crc kubenswrapper[4902]: I1009 14:05:57.444331 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-xtkdj" Oct 09 14:06:00 crc kubenswrapper[4902]: I1009 14:06:00.297597 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-77585f5f8c-wtqhb" Oct 09 14:06:00 crc kubenswrapper[4902]: I1009 14:06:00.355978 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-wmxc2"] Oct 09 14:06:00 crc kubenswrapper[4902]: I1009 14:06:00.356342 4902 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-698758b865-wmxc2" podUID="3ede4964-25d2-4a68-bfb7-84a9cbf633c6" containerName="dnsmasq-dns" containerID="cri-o://daf0bc33851e3c2bf26b5abf9ac72b5d98000f91eb1b31bd7e64127a8682505f" gracePeriod=10 Oct 09 14:06:00 crc kubenswrapper[4902]: I1009 14:06:00.798169 4902 generic.go:334] "Generic (PLEG): container finished" podID="3ede4964-25d2-4a68-bfb7-84a9cbf633c6" containerID="daf0bc33851e3c2bf26b5abf9ac72b5d98000f91eb1b31bd7e64127a8682505f" exitCode=0 Oct 09 14:06:00 crc kubenswrapper[4902]: I1009 14:06:00.798265 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-wmxc2" event={"ID":"3ede4964-25d2-4a68-bfb7-84a9cbf633c6","Type":"ContainerDied","Data":"daf0bc33851e3c2bf26b5abf9ac72b5d98000f91eb1b31bd7e64127a8682505f"} Oct 09 14:06:03 crc kubenswrapper[4902]: I1009 14:06:03.091018 4902 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-wmxc2" Oct 09 14:06:03 crc kubenswrapper[4902]: I1009 14:06:03.175741 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3ede4964-25d2-4a68-bfb7-84a9cbf633c6-ovsdbserver-nb\") pod \"3ede4964-25d2-4a68-bfb7-84a9cbf633c6\" (UID: \"3ede4964-25d2-4a68-bfb7-84a9cbf633c6\") " Oct 09 14:06:03 crc kubenswrapper[4902]: I1009 14:06:03.176119 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3ede4964-25d2-4a68-bfb7-84a9cbf633c6-dns-svc\") pod \"3ede4964-25d2-4a68-bfb7-84a9cbf633c6\" (UID: \"3ede4964-25d2-4a68-bfb7-84a9cbf633c6\") " Oct 09 14:06:03 crc kubenswrapper[4902]: I1009 14:06:03.176166 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3ede4964-25d2-4a68-bfb7-84a9cbf633c6-config\") pod \"3ede4964-25d2-4a68-bfb7-84a9cbf633c6\" (UID: \"3ede4964-25d2-4a68-bfb7-84a9cbf633c6\") " Oct 09 14:06:03 crc kubenswrapper[4902]: I1009 14:06:03.176245 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3ede4964-25d2-4a68-bfb7-84a9cbf633c6-ovsdbserver-sb\") pod \"3ede4964-25d2-4a68-bfb7-84a9cbf633c6\" (UID: \"3ede4964-25d2-4a68-bfb7-84a9cbf633c6\") " Oct 09 14:06:03 crc kubenswrapper[4902]: I1009 14:06:03.176291 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cxmk7\" (UniqueName: \"kubernetes.io/projected/3ede4964-25d2-4a68-bfb7-84a9cbf633c6-kube-api-access-cxmk7\") pod \"3ede4964-25d2-4a68-bfb7-84a9cbf633c6\" (UID: \"3ede4964-25d2-4a68-bfb7-84a9cbf633c6\") " Oct 09 14:06:03 crc kubenswrapper[4902]: I1009 14:06:03.186758 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ede4964-25d2-4a68-bfb7-84a9cbf633c6-kube-api-access-cxmk7" (OuterVolumeSpecName: "kube-api-access-cxmk7") pod "3ede4964-25d2-4a68-bfb7-84a9cbf633c6" (UID: "3ede4964-25d2-4a68-bfb7-84a9cbf633c6"). InnerVolumeSpecName "kube-api-access-cxmk7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 14:06:03 crc kubenswrapper[4902]: I1009 14:06:03.232318 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3ede4964-25d2-4a68-bfb7-84a9cbf633c6-config" (OuterVolumeSpecName: "config") pod "3ede4964-25d2-4a68-bfb7-84a9cbf633c6" (UID: "3ede4964-25d2-4a68-bfb7-84a9cbf633c6"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 14:06:03 crc kubenswrapper[4902]: I1009 14:06:03.247801 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3ede4964-25d2-4a68-bfb7-84a9cbf633c6-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "3ede4964-25d2-4a68-bfb7-84a9cbf633c6" (UID: "3ede4964-25d2-4a68-bfb7-84a9cbf633c6"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 14:06:03 crc kubenswrapper[4902]: I1009 14:06:03.251615 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3ede4964-25d2-4a68-bfb7-84a9cbf633c6-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "3ede4964-25d2-4a68-bfb7-84a9cbf633c6" (UID: "3ede4964-25d2-4a68-bfb7-84a9cbf633c6"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 14:06:03 crc kubenswrapper[4902]: I1009 14:06:03.259896 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3ede4964-25d2-4a68-bfb7-84a9cbf633c6-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "3ede4964-25d2-4a68-bfb7-84a9cbf633c6" (UID: "3ede4964-25d2-4a68-bfb7-84a9cbf633c6"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 14:06:03 crc kubenswrapper[4902]: I1009 14:06:03.278643 4902 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3ede4964-25d2-4a68-bfb7-84a9cbf633c6-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 09 14:06:03 crc kubenswrapper[4902]: I1009 14:06:03.278689 4902 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3ede4964-25d2-4a68-bfb7-84a9cbf633c6-config\") on node \"crc\" DevicePath \"\"" Oct 09 14:06:03 crc kubenswrapper[4902]: I1009 14:06:03.278706 4902 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3ede4964-25d2-4a68-bfb7-84a9cbf633c6-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 09 14:06:03 crc kubenswrapper[4902]: I1009 14:06:03.278730 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cxmk7\" (UniqueName: \"kubernetes.io/projected/3ede4964-25d2-4a68-bfb7-84a9cbf633c6-kube-api-access-cxmk7\") on node \"crc\" DevicePath \"\"" Oct 09 14:06:03 crc kubenswrapper[4902]: I1009 14:06:03.278742 4902 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3ede4964-25d2-4a68-bfb7-84a9cbf633c6-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 09 14:06:03 crc kubenswrapper[4902]: I1009 14:06:03.457524 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-xtkdj"] Oct 09 14:06:03 crc kubenswrapper[4902]: W1009 14:06:03.465155 4902 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podacda5664_02c5_48ec_92a8_b7f2274d34a7.slice/crio-c14db8dbb11806f9f3bcec8d323507d619880c9aed70dda9e714126decb10f91 WatchSource:0}: Error finding container c14db8dbb11806f9f3bcec8d323507d619880c9aed70dda9e714126decb10f91: Status 404 returned error can't find the container with id c14db8dbb11806f9f3bcec8d323507d619880c9aed70dda9e714126decb10f91 Oct 09 14:06:03 crc kubenswrapper[4902]: I1009 14:06:03.467097 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-6rs8p"] Oct 09 14:06:03 crc kubenswrapper[4902]: W1009 14:06:03.471864 4902 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod74606141_205f_4e63_9be4_1c7c57fecf3b.slice/crio-700a32a77f2a202c71364c7aaeefd8cac9cdc91ba778baa67abc0f9a6530c3cc WatchSource:0}: Error finding container 700a32a77f2a202c71364c7aaeefd8cac9cdc91ba778baa67abc0f9a6530c3cc: Status 404 returned error can't find the container with id 700a32a77f2a202c71364c7aaeefd8cac9cdc91ba778baa67abc0f9a6530c3cc Oct 09 14:06:03 crc kubenswrapper[4902]: I1009 14:06:03.507336 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-qk2p7"] Oct 09 14:06:03 crc kubenswrapper[4902]: W1009 14:06:03.510871 4902 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podca2b155e_e715_4cbd_8d0e_41944380c158.slice/crio-0cff05a2b96f37844081821c90e7a76175a75889b45769f439d9e50f90ec4879 WatchSource:0}: Error finding container 0cff05a2b96f37844081821c90e7a76175a75889b45769f439d9e50f90ec4879: Status 404 returned error can't find the container with id 0cff05a2b96f37844081821c90e7a76175a75889b45769f439d9e50f90ec4879 Oct 09 14:06:03 crc kubenswrapper[4902]: I1009 14:06:03.580812 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-9q89d"] Oct 09 14:06:03 crc kubenswrapper[4902]: I1009 14:06:03.837589 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-qk2p7" event={"ID":"ca2b155e-e715-4cbd-8d0e-41944380c158","Type":"ContainerStarted","Data":"7b0306fdcaa0ccafb2093670cdd941dc2050fd83511ceab9217072e1024c84e5"} Oct 09 14:06:03 crc kubenswrapper[4902]: I1009 14:06:03.837646 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-qk2p7" event={"ID":"ca2b155e-e715-4cbd-8d0e-41944380c158","Type":"ContainerStarted","Data":"0cff05a2b96f37844081821c90e7a76175a75889b45769f439d9e50f90ec4879"} Oct 09 14:06:03 crc kubenswrapper[4902]: I1009 14:06:03.839750 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-9x8lm" event={"ID":"15cef403-03b5-432b-a7b4-c12a613e4d47","Type":"ContainerStarted","Data":"967817fe37cdb809a8b18c6629dfe923d474e3695ea7664a196320318d0a9207"} Oct 09 14:06:03 crc kubenswrapper[4902]: I1009 14:06:03.842017 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-wmxc2" event={"ID":"3ede4964-25d2-4a68-bfb7-84a9cbf633c6","Type":"ContainerDied","Data":"a516b8c9fee0f0eb5dcb0da9c294b3296a407efe061e72972121f5c0f8a74957"} Oct 09 14:06:03 crc kubenswrapper[4902]: I1009 14:06:03.842067 4902 scope.go:117] "RemoveContainer" containerID="daf0bc33851e3c2bf26b5abf9ac72b5d98000f91eb1b31bd7e64127a8682505f" Oct 09 14:06:03 crc kubenswrapper[4902]: I1009 14:06:03.842285 4902 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-wmxc2" Oct 09 14:06:03 crc kubenswrapper[4902]: I1009 14:06:03.844462 4902 generic.go:334] "Generic (PLEG): container finished" podID="acda5664-02c5-48ec-92a8-b7f2274d34a7" containerID="7c6f35af69c72387851257c9f1dcd774f87345a47f1009133d736adfd6ae360d" exitCode=0 Oct 09 14:06:03 crc kubenswrapper[4902]: I1009 14:06:03.844554 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-6rs8p" event={"ID":"acda5664-02c5-48ec-92a8-b7f2274d34a7","Type":"ContainerDied","Data":"7c6f35af69c72387851257c9f1dcd774f87345a47f1009133d736adfd6ae360d"} Oct 09 14:06:03 crc kubenswrapper[4902]: I1009 14:06:03.844808 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-6rs8p" event={"ID":"acda5664-02c5-48ec-92a8-b7f2274d34a7","Type":"ContainerStarted","Data":"c14db8dbb11806f9f3bcec8d323507d619880c9aed70dda9e714126decb10f91"} Oct 09 14:06:03 crc kubenswrapper[4902]: I1009 14:06:03.846103 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-9q89d" event={"ID":"947813b1-569a-485d-80de-98259162031c","Type":"ContainerStarted","Data":"6e4af23529a1f854346f6beaf5c6da3edbcffa61f984196643c2956871072757"} Oct 09 14:06:03 crc kubenswrapper[4902]: I1009 14:06:03.846150 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-9q89d" event={"ID":"947813b1-569a-485d-80de-98259162031c","Type":"ContainerStarted","Data":"14a3b6c6a3804d8da9e0f2dc043687846ea744b490092f2f96026e377270e0dc"} Oct 09 14:06:03 crc kubenswrapper[4902]: I1009 14:06:03.848722 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-xtkdj" event={"ID":"74606141-205f-4e63-9be4-1c7c57fecf3b","Type":"ContainerStarted","Data":"700a32a77f2a202c71364c7aaeefd8cac9cdc91ba778baa67abc0f9a6530c3cc"} Oct 09 14:06:03 crc kubenswrapper[4902]: I1009 14:06:03.858447 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-create-qk2p7" podStartSLOduration=7.858426806 podStartE2EDuration="7.858426806s" podCreationTimestamp="2025-10-09 14:05:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 14:06:03.857764746 +0000 UTC m=+911.055623830" watchObservedRunningTime="2025-10-09 14:06:03.858426806 +0000 UTC m=+911.056285880" Oct 09 14:06:03 crc kubenswrapper[4902]: I1009 14:06:03.879874 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-create-9q89d" podStartSLOduration=7.879851576 podStartE2EDuration="7.879851576s" podCreationTimestamp="2025-10-09 14:05:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 14:06:03.875404861 +0000 UTC m=+911.073263935" watchObservedRunningTime="2025-10-09 14:06:03.879851576 +0000 UTC m=+911.077710650" Oct 09 14:06:03 crc kubenswrapper[4902]: I1009 14:06:03.907119 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-9x8lm" podStartSLOduration=2.396682052 podStartE2EDuration="15.907102673s" podCreationTimestamp="2025-10-09 14:05:48 +0000 UTC" firstStartedPulling="2025-10-09 14:05:49.453670182 +0000 UTC m=+896.651529246" lastFinishedPulling="2025-10-09 14:06:02.964090763 +0000 UTC m=+910.161949867" observedRunningTime="2025-10-09 14:06:03.906535915 +0000 UTC m=+911.104394999" 
watchObservedRunningTime="2025-10-09 14:06:03.907102673 +0000 UTC m=+911.104961737" Oct 09 14:06:04 crc kubenswrapper[4902]: I1009 14:06:04.008898 4902 scope.go:117] "RemoveContainer" containerID="0c755799b9c3517eeee06229684e9846911aa4df66cae67bfef29f4501546772" Oct 09 14:06:04 crc kubenswrapper[4902]: I1009 14:06:04.011558 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-wmxc2"] Oct 09 14:06:04 crc kubenswrapper[4902]: I1009 14:06:04.017137 4902 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-698758b865-wmxc2"] Oct 09 14:06:04 crc kubenswrapper[4902]: I1009 14:06:04.860624 4902 generic.go:334] "Generic (PLEG): container finished" podID="947813b1-569a-485d-80de-98259162031c" containerID="6e4af23529a1f854346f6beaf5c6da3edbcffa61f984196643c2956871072757" exitCode=0 Oct 09 14:06:04 crc kubenswrapper[4902]: I1009 14:06:04.860753 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-9q89d" event={"ID":"947813b1-569a-485d-80de-98259162031c","Type":"ContainerDied","Data":"6e4af23529a1f854346f6beaf5c6da3edbcffa61f984196643c2956871072757"} Oct 09 14:06:04 crc kubenswrapper[4902]: I1009 14:06:04.863544 4902 generic.go:334] "Generic (PLEG): container finished" podID="ca2b155e-e715-4cbd-8d0e-41944380c158" containerID="7b0306fdcaa0ccafb2093670cdd941dc2050fd83511ceab9217072e1024c84e5" exitCode=0 Oct 09 14:06:04 crc kubenswrapper[4902]: I1009 14:06:04.863648 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-qk2p7" event={"ID":"ca2b155e-e715-4cbd-8d0e-41944380c158","Type":"ContainerDied","Data":"7b0306fdcaa0ccafb2093670cdd941dc2050fd83511ceab9217072e1024c84e5"} Oct 09 14:06:05 crc kubenswrapper[4902]: I1009 14:06:05.170516 4902 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-6rs8p" Oct 09 14:06:05 crc kubenswrapper[4902]: I1009 14:06:05.314318 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8qs6w\" (UniqueName: \"kubernetes.io/projected/acda5664-02c5-48ec-92a8-b7f2274d34a7-kube-api-access-8qs6w\") pod \"acda5664-02c5-48ec-92a8-b7f2274d34a7\" (UID: \"acda5664-02c5-48ec-92a8-b7f2274d34a7\") " Oct 09 14:06:05 crc kubenswrapper[4902]: I1009 14:06:05.321168 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/acda5664-02c5-48ec-92a8-b7f2274d34a7-kube-api-access-8qs6w" (OuterVolumeSpecName: "kube-api-access-8qs6w") pod "acda5664-02c5-48ec-92a8-b7f2274d34a7" (UID: "acda5664-02c5-48ec-92a8-b7f2274d34a7"). InnerVolumeSpecName "kube-api-access-8qs6w". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 14:06:05 crc kubenswrapper[4902]: I1009 14:06:05.417474 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8qs6w\" (UniqueName: \"kubernetes.io/projected/acda5664-02c5-48ec-92a8-b7f2274d34a7-kube-api-access-8qs6w\") on node \"crc\" DevicePath \"\"" Oct 09 14:06:05 crc kubenswrapper[4902]: I1009 14:06:05.526736 4902 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ede4964-25d2-4a68-bfb7-84a9cbf633c6" path="/var/lib/kubelet/pods/3ede4964-25d2-4a68-bfb7-84a9cbf633c6/volumes" Oct 09 14:06:05 crc kubenswrapper[4902]: I1009 14:06:05.879842 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-6rs8p" event={"ID":"acda5664-02c5-48ec-92a8-b7f2274d34a7","Type":"ContainerDied","Data":"c14db8dbb11806f9f3bcec8d323507d619880c9aed70dda9e714126decb10f91"} Oct 09 14:06:05 crc kubenswrapper[4902]: I1009 14:06:05.879889 4902 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c14db8dbb11806f9f3bcec8d323507d619880c9aed70dda9e714126decb10f91" Oct 09 14:06:05 crc kubenswrapper[4902]: I1009 14:06:05.879991 4902 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-6rs8p" Oct 09 14:06:08 crc kubenswrapper[4902]: I1009 14:06:08.310427 4902 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-qk2p7" Oct 09 14:06:08 crc kubenswrapper[4902]: I1009 14:06:08.311314 4902 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-9q89d" Oct 09 14:06:08 crc kubenswrapper[4902]: I1009 14:06:08.373047 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p5524\" (UniqueName: \"kubernetes.io/projected/ca2b155e-e715-4cbd-8d0e-41944380c158-kube-api-access-p5524\") pod \"ca2b155e-e715-4cbd-8d0e-41944380c158\" (UID: \"ca2b155e-e715-4cbd-8d0e-41944380c158\") " Oct 09 14:06:08 crc kubenswrapper[4902]: I1009 14:06:08.373162 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l2dvr\" (UniqueName: \"kubernetes.io/projected/947813b1-569a-485d-80de-98259162031c-kube-api-access-l2dvr\") pod \"947813b1-569a-485d-80de-98259162031c\" (UID: \"947813b1-569a-485d-80de-98259162031c\") " Oct 09 14:06:08 crc kubenswrapper[4902]: I1009 14:06:08.377874 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/947813b1-569a-485d-80de-98259162031c-kube-api-access-l2dvr" (OuterVolumeSpecName: "kube-api-access-l2dvr") pod "947813b1-569a-485d-80de-98259162031c" (UID: "947813b1-569a-485d-80de-98259162031c"). InnerVolumeSpecName "kube-api-access-l2dvr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 14:06:08 crc kubenswrapper[4902]: I1009 14:06:08.380979 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca2b155e-e715-4cbd-8d0e-41944380c158-kube-api-access-p5524" (OuterVolumeSpecName: "kube-api-access-p5524") pod "ca2b155e-e715-4cbd-8d0e-41944380c158" (UID: "ca2b155e-e715-4cbd-8d0e-41944380c158"). InnerVolumeSpecName "kube-api-access-p5524". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 14:06:08 crc kubenswrapper[4902]: I1009 14:06:08.475294 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p5524\" (UniqueName: \"kubernetes.io/projected/ca2b155e-e715-4cbd-8d0e-41944380c158-kube-api-access-p5524\") on node \"crc\" DevicePath \"\"" Oct 09 14:06:08 crc kubenswrapper[4902]: I1009 14:06:08.475330 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l2dvr\" (UniqueName: \"kubernetes.io/projected/947813b1-569a-485d-80de-98259162031c-kube-api-access-l2dvr\") on node \"crc\" DevicePath \"\"" Oct 09 14:06:08 crc kubenswrapper[4902]: I1009 14:06:08.902831 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-qk2p7" event={"ID":"ca2b155e-e715-4cbd-8d0e-41944380c158","Type":"ContainerDied","Data":"0cff05a2b96f37844081821c90e7a76175a75889b45769f439d9e50f90ec4879"} Oct 09 14:06:08 crc kubenswrapper[4902]: I1009 14:06:08.903160 4902 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0cff05a2b96f37844081821c90e7a76175a75889b45769f439d9e50f90ec4879" Oct 09 14:06:08 crc kubenswrapper[4902]: I1009 14:06:08.903275 4902 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-qk2p7" Oct 09 14:06:08 crc kubenswrapper[4902]: I1009 14:06:08.904765 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-9q89d" event={"ID":"947813b1-569a-485d-80de-98259162031c","Type":"ContainerDied","Data":"14a3b6c6a3804d8da9e0f2dc043687846ea744b490092f2f96026e377270e0dc"} Oct 09 14:06:08 crc kubenswrapper[4902]: I1009 14:06:08.904844 4902 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="14a3b6c6a3804d8da9e0f2dc043687846ea744b490092f2f96026e377270e0dc" Oct 09 14:06:08 crc kubenswrapper[4902]: I1009 14:06:08.904932 4902 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-9q89d" Oct 09 14:06:08 crc kubenswrapper[4902]: I1009 14:06:08.906456 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-xtkdj" event={"ID":"74606141-205f-4e63-9be4-1c7c57fecf3b","Type":"ContainerStarted","Data":"a42c93937cbc00fea898e805942c7cf81cb448aac4ebe11387f3d94204535416"} Oct 09 14:06:08 crc kubenswrapper[4902]: I1009 14:06:08.930687 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-xtkdj" podStartSLOduration=7.117386981 podStartE2EDuration="11.930442376s" podCreationTimestamp="2025-10-09 14:05:57 +0000 UTC" firstStartedPulling="2025-10-09 14:06:03.481561337 +0000 UTC m=+910.679420401" lastFinishedPulling="2025-10-09 14:06:08.294616732 +0000 UTC m=+915.492475796" observedRunningTime="2025-10-09 14:06:08.923360792 +0000 UTC m=+916.121219866" watchObservedRunningTime="2025-10-09 14:06:08.930442376 +0000 UTC m=+916.128301450" Oct 09 14:06:11 crc kubenswrapper[4902]: I1009 14:06:11.931724 4902 generic.go:334] "Generic (PLEG): container finished" podID="74606141-205f-4e63-9be4-1c7c57fecf3b" containerID="a42c93937cbc00fea898e805942c7cf81cb448aac4ebe11387f3d94204535416" exitCode=0 Oct 09 14:06:11 crc kubenswrapper[4902]: I1009 14:06:11.931817 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-xtkdj" event={"ID":"74606141-205f-4e63-9be4-1c7c57fecf3b","Type":"ContainerDied","Data":"a42c93937cbc00fea898e805942c7cf81cb448aac4ebe11387f3d94204535416"} Oct 09 14:06:11 crc kubenswrapper[4902]: I1009 14:06:11.934370 4902 generic.go:334] "Generic (PLEG): container finished" podID="15cef403-03b5-432b-a7b4-c12a613e4d47" containerID="967817fe37cdb809a8b18c6629dfe923d474e3695ea7664a196320318d0a9207" exitCode=0 Oct 09 14:06:11 crc kubenswrapper[4902]: I1009 14:06:11.934405 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-9x8lm" event={"ID":"15cef403-03b5-432b-a7b4-c12a613e4d47","Type":"ContainerDied","Data":"967817fe37cdb809a8b18c6629dfe923d474e3695ea7664a196320318d0a9207"} Oct 09 14:06:13 crc kubenswrapper[4902]: I1009 14:06:13.260558 4902 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-xtkdj" Oct 09 14:06:13 crc kubenswrapper[4902]: I1009 14:06:13.351676 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74606141-205f-4e63-9be4-1c7c57fecf3b-combined-ca-bundle\") pod \"74606141-205f-4e63-9be4-1c7c57fecf3b\" (UID: \"74606141-205f-4e63-9be4-1c7c57fecf3b\") " Oct 09 14:06:13 crc kubenswrapper[4902]: I1009 14:06:13.351782 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t2btn\" (UniqueName: \"kubernetes.io/projected/74606141-205f-4e63-9be4-1c7c57fecf3b-kube-api-access-t2btn\") pod \"74606141-205f-4e63-9be4-1c7c57fecf3b\" (UID: \"74606141-205f-4e63-9be4-1c7c57fecf3b\") " Oct 09 14:06:13 crc kubenswrapper[4902]: I1009 14:06:13.352476 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74606141-205f-4e63-9be4-1c7c57fecf3b-config-data\") pod \"74606141-205f-4e63-9be4-1c7c57fecf3b\" (UID: \"74606141-205f-4e63-9be4-1c7c57fecf3b\") " Oct 09 14:06:13 crc kubenswrapper[4902]: I1009 14:06:13.357156 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74606141-205f-4e63-9be4-1c7c57fecf3b-kube-api-access-t2btn" (OuterVolumeSpecName: "kube-api-access-t2btn") pod "74606141-205f-4e63-9be4-1c7c57fecf3b" (UID: "74606141-205f-4e63-9be4-1c7c57fecf3b"). InnerVolumeSpecName "kube-api-access-t2btn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 14:06:13 crc kubenswrapper[4902]: I1009 14:06:13.399654 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74606141-205f-4e63-9be4-1c7c57fecf3b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "74606141-205f-4e63-9be4-1c7c57fecf3b" (UID: "74606141-205f-4e63-9be4-1c7c57fecf3b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:06:13 crc kubenswrapper[4902]: I1009 14:06:13.408860 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74606141-205f-4e63-9be4-1c7c57fecf3b-config-data" (OuterVolumeSpecName: "config-data") pod "74606141-205f-4e63-9be4-1c7c57fecf3b" (UID: "74606141-205f-4e63-9be4-1c7c57fecf3b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:06:13 crc kubenswrapper[4902]: I1009 14:06:13.454129 4902 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74606141-205f-4e63-9be4-1c7c57fecf3b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 14:06:13 crc kubenswrapper[4902]: I1009 14:06:13.454159 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t2btn\" (UniqueName: \"kubernetes.io/projected/74606141-205f-4e63-9be4-1c7c57fecf3b-kube-api-access-t2btn\") on node \"crc\" DevicePath \"\"" Oct 09 14:06:13 crc kubenswrapper[4902]: I1009 14:06:13.454174 4902 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74606141-205f-4e63-9be4-1c7c57fecf3b-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 14:06:13 crc kubenswrapper[4902]: I1009 14:06:13.472177 4902 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-9x8lm" Oct 09 14:06:13 crc kubenswrapper[4902]: I1009 14:06:13.555700 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/15cef403-03b5-432b-a7b4-c12a613e4d47-db-sync-config-data\") pod \"15cef403-03b5-432b-a7b4-c12a613e4d47\" (UID: \"15cef403-03b5-432b-a7b4-c12a613e4d47\") " Oct 09 14:06:13 crc kubenswrapper[4902]: I1009 14:06:13.555760 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rbtwm\" (UniqueName: \"kubernetes.io/projected/15cef403-03b5-432b-a7b4-c12a613e4d47-kube-api-access-rbtwm\") pod \"15cef403-03b5-432b-a7b4-c12a613e4d47\" (UID: \"15cef403-03b5-432b-a7b4-c12a613e4d47\") " Oct 09 14:06:13 crc kubenswrapper[4902]: I1009 14:06:13.555825 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15cef403-03b5-432b-a7b4-c12a613e4d47-config-data\") pod \"15cef403-03b5-432b-a7b4-c12a613e4d47\" (UID: \"15cef403-03b5-432b-a7b4-c12a613e4d47\") " Oct 09 14:06:13 crc kubenswrapper[4902]: I1009 14:06:13.555866 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15cef403-03b5-432b-a7b4-c12a613e4d47-combined-ca-bundle\") pod \"15cef403-03b5-432b-a7b4-c12a613e4d47\" (UID: \"15cef403-03b5-432b-a7b4-c12a613e4d47\") " Oct 09 14:06:13 crc kubenswrapper[4902]: I1009 14:06:13.572149 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15cef403-03b5-432b-a7b4-c12a613e4d47-kube-api-access-rbtwm" (OuterVolumeSpecName: "kube-api-access-rbtwm") pod "15cef403-03b5-432b-a7b4-c12a613e4d47" (UID: "15cef403-03b5-432b-a7b4-c12a613e4d47"). InnerVolumeSpecName "kube-api-access-rbtwm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 14:06:13 crc kubenswrapper[4902]: I1009 14:06:13.578866 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15cef403-03b5-432b-a7b4-c12a613e4d47-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "15cef403-03b5-432b-a7b4-c12a613e4d47" (UID: "15cef403-03b5-432b-a7b4-c12a613e4d47"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:06:13 crc kubenswrapper[4902]: I1009 14:06:13.589971 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15cef403-03b5-432b-a7b4-c12a613e4d47-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "15cef403-03b5-432b-a7b4-c12a613e4d47" (UID: "15cef403-03b5-432b-a7b4-c12a613e4d47"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:06:13 crc kubenswrapper[4902]: I1009 14:06:13.593523 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15cef403-03b5-432b-a7b4-c12a613e4d47-config-data" (OuterVolumeSpecName: "config-data") pod "15cef403-03b5-432b-a7b4-c12a613e4d47" (UID: "15cef403-03b5-432b-a7b4-c12a613e4d47"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:06:13 crc kubenswrapper[4902]: I1009 14:06:13.658145 4902 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/15cef403-03b5-432b-a7b4-c12a613e4d47-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 14:06:13 crc kubenswrapper[4902]: I1009 14:06:13.658178 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rbtwm\" (UniqueName: \"kubernetes.io/projected/15cef403-03b5-432b-a7b4-c12a613e4d47-kube-api-access-rbtwm\") on node \"crc\" DevicePath \"\"" Oct 09 14:06:13 crc kubenswrapper[4902]: I1009 14:06:13.658189 4902 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15cef403-03b5-432b-a7b4-c12a613e4d47-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 14:06:13 crc kubenswrapper[4902]: I1009 14:06:13.658239 4902 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15cef403-03b5-432b-a7b4-c12a613e4d47-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 14:06:13 crc kubenswrapper[4902]: I1009 14:06:13.953495 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-xtkdj" event={"ID":"74606141-205f-4e63-9be4-1c7c57fecf3b","Type":"ContainerDied","Data":"700a32a77f2a202c71364c7aaeefd8cac9cdc91ba778baa67abc0f9a6530c3cc"} Oct 09 14:06:13 crc kubenswrapper[4902]: I1009 14:06:13.953535 4902 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-xtkdj" Oct 09 14:06:13 crc kubenswrapper[4902]: I1009 14:06:13.953549 4902 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="700a32a77f2a202c71364c7aaeefd8cac9cdc91ba778baa67abc0f9a6530c3cc" Oct 09 14:06:13 crc kubenswrapper[4902]: I1009 14:06:13.954789 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-9x8lm" event={"ID":"15cef403-03b5-432b-a7b4-c12a613e4d47","Type":"ContainerDied","Data":"5e5a12e91dcd5529d2d24af7fbbdfd92fe9728762de8ec8c08265930d301a94e"} Oct 09 14:06:13 crc kubenswrapper[4902]: I1009 14:06:13.954809 4902 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5e5a12e91dcd5529d2d24af7fbbdfd92fe9728762de8ec8c08265930d301a94e" Oct 09 14:06:13 crc kubenswrapper[4902]: I1009 14:06:13.954865 4902 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-9x8lm" Oct 09 14:06:14 crc kubenswrapper[4902]: I1009 14:06:14.237552 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-55fff446b9-xhcmw"] Oct 09 14:06:14 crc kubenswrapper[4902]: E1009 14:06:14.237987 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="947813b1-569a-485d-80de-98259162031c" containerName="mariadb-database-create" Oct 09 14:06:14 crc kubenswrapper[4902]: I1009 14:06:14.238002 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="947813b1-569a-485d-80de-98259162031c" containerName="mariadb-database-create" Oct 09 14:06:14 crc kubenswrapper[4902]: E1009 14:06:14.238021 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15cef403-03b5-432b-a7b4-c12a613e4d47" containerName="glance-db-sync" Oct 09 14:06:14 crc kubenswrapper[4902]: I1009 14:06:14.238029 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="15cef403-03b5-432b-a7b4-c12a613e4d47" containerName="glance-db-sync" Oct 09 14:06:14 crc kubenswrapper[4902]: E1009 14:06:14.238040 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74606141-205f-4e63-9be4-1c7c57fecf3b" containerName="keystone-db-sync" Oct 09 14:06:14 crc kubenswrapper[4902]: I1009 14:06:14.238047 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="74606141-205f-4e63-9be4-1c7c57fecf3b" containerName="keystone-db-sync" Oct 09 14:06:14 crc kubenswrapper[4902]: E1009 14:06:14.238060 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ede4964-25d2-4a68-bfb7-84a9cbf633c6" containerName="init" Oct 09 14:06:14 crc kubenswrapper[4902]: I1009 14:06:14.238067 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ede4964-25d2-4a68-bfb7-84a9cbf633c6" containerName="init" Oct 09 14:06:14 crc kubenswrapper[4902]: E1009 14:06:14.238091 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca2b155e-e715-4cbd-8d0e-41944380c158" containerName="mariadb-database-create" Oct 09 14:06:14 crc kubenswrapper[4902]: I1009 14:06:14.238098 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca2b155e-e715-4cbd-8d0e-41944380c158" containerName="mariadb-database-create" Oct 09 14:06:14 crc kubenswrapper[4902]: E1009 14:06:14.238108 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ede4964-25d2-4a68-bfb7-84a9cbf633c6" containerName="dnsmasq-dns" Oct 09 14:06:14 crc kubenswrapper[4902]: I1009 14:06:14.238117 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ede4964-25d2-4a68-bfb7-84a9cbf633c6" containerName="dnsmasq-dns" Oct 09 14:06:14 crc kubenswrapper[4902]: E1009 14:06:14.238141 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="acda5664-02c5-48ec-92a8-b7f2274d34a7" containerName="mariadb-database-create" Oct 09 14:06:14 crc kubenswrapper[4902]: I1009 14:06:14.238150 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="acda5664-02c5-48ec-92a8-b7f2274d34a7" containerName="mariadb-database-create" Oct 09 14:06:14 crc kubenswrapper[4902]: I1009 14:06:14.238573 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="15cef403-03b5-432b-a7b4-c12a613e4d47" containerName="glance-db-sync" Oct 09 14:06:14 crc kubenswrapper[4902]: I1009 14:06:14.238599 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ede4964-25d2-4a68-bfb7-84a9cbf633c6" containerName="dnsmasq-dns" Oct 09 14:06:14 crc kubenswrapper[4902]: I1009 14:06:14.238617 4902 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="acda5664-02c5-48ec-92a8-b7f2274d34a7" containerName="mariadb-database-create" Oct 09 14:06:14 crc kubenswrapper[4902]: I1009 14:06:14.238630 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="947813b1-569a-485d-80de-98259162031c" containerName="mariadb-database-create" Oct 09 14:06:14 crc kubenswrapper[4902]: I1009 14:06:14.238652 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca2b155e-e715-4cbd-8d0e-41944380c158" containerName="mariadb-database-create" Oct 09 14:06:14 crc kubenswrapper[4902]: I1009 14:06:14.238665 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="74606141-205f-4e63-9be4-1c7c57fecf3b" containerName="keystone-db-sync" Oct 09 14:06:14 crc kubenswrapper[4902]: I1009 14:06:14.239825 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55fff446b9-xhcmw" Oct 09 14:06:14 crc kubenswrapper[4902]: I1009 14:06:14.250813 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55fff446b9-xhcmw"] Oct 09 14:06:14 crc kubenswrapper[4902]: I1009 14:06:14.266182 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-64rsx"] Oct 09 14:06:14 crc kubenswrapper[4902]: I1009 14:06:14.267261 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-64rsx" Oct 09 14:06:14 crc kubenswrapper[4902]: I1009 14:06:14.273751 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 09 14:06:14 crc kubenswrapper[4902]: I1009 14:06:14.274103 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-ws99n" Oct 09 14:06:14 crc kubenswrapper[4902]: I1009 14:06:14.274309 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 09 14:06:14 crc kubenswrapper[4902]: I1009 14:06:14.274533 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 09 14:06:14 crc kubenswrapper[4902]: I1009 14:06:14.335263 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-64rsx"] Oct 09 14:06:14 crc kubenswrapper[4902]: I1009 14:06:14.379527 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d872d3cb-00ca-449c-98e2-69eb864ee9cb-fernet-keys\") pod \"keystone-bootstrap-64rsx\" (UID: \"d872d3cb-00ca-449c-98e2-69eb864ee9cb\") " pod="openstack/keystone-bootstrap-64rsx" Oct 09 14:06:14 crc kubenswrapper[4902]: I1009 14:06:14.379597 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fb082784-fdb2-4a03-b069-0ab81e675535-dns-swift-storage-0\") pod \"dnsmasq-dns-55fff446b9-xhcmw\" (UID: \"fb082784-fdb2-4a03-b069-0ab81e675535\") " pod="openstack/dnsmasq-dns-55fff446b9-xhcmw" Oct 09 14:06:14 crc kubenswrapper[4902]: I1009 14:06:14.379637 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tt26r\" (UniqueName: \"kubernetes.io/projected/fb082784-fdb2-4a03-b069-0ab81e675535-kube-api-access-tt26r\") pod \"dnsmasq-dns-55fff446b9-xhcmw\" (UID: \"fb082784-fdb2-4a03-b069-0ab81e675535\") " pod="openstack/dnsmasq-dns-55fff446b9-xhcmw" Oct 09 14:06:14 crc kubenswrapper[4902]: I1009 14:06:14.379656 4902 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d872d3cb-00ca-449c-98e2-69eb864ee9cb-scripts\") pod \"keystone-bootstrap-64rsx\" (UID: \"d872d3cb-00ca-449c-98e2-69eb864ee9cb\") " pod="openstack/keystone-bootstrap-64rsx" Oct 09 14:06:14 crc kubenswrapper[4902]: I1009 14:06:14.379675 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fb082784-fdb2-4a03-b069-0ab81e675535-ovsdbserver-sb\") pod \"dnsmasq-dns-55fff446b9-xhcmw\" (UID: \"fb082784-fdb2-4a03-b069-0ab81e675535\") " pod="openstack/dnsmasq-dns-55fff446b9-xhcmw" Oct 09 14:06:14 crc kubenswrapper[4902]: I1009 14:06:14.379700 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fb082784-fdb2-4a03-b069-0ab81e675535-ovsdbserver-nb\") pod \"dnsmasq-dns-55fff446b9-xhcmw\" (UID: \"fb082784-fdb2-4a03-b069-0ab81e675535\") " pod="openstack/dnsmasq-dns-55fff446b9-xhcmw" Oct 09 14:06:14 crc kubenswrapper[4902]: I1009 14:06:14.379717 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d872d3cb-00ca-449c-98e2-69eb864ee9cb-config-data\") pod \"keystone-bootstrap-64rsx\" (UID: \"d872d3cb-00ca-449c-98e2-69eb864ee9cb\") " pod="openstack/keystone-bootstrap-64rsx" Oct 09 14:06:14 crc kubenswrapper[4902]: I1009 14:06:14.379776 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d872d3cb-00ca-449c-98e2-69eb864ee9cb-credential-keys\") pod \"keystone-bootstrap-64rsx\" (UID: \"d872d3cb-00ca-449c-98e2-69eb864ee9cb\") " pod="openstack/keystone-bootstrap-64rsx" Oct 09 14:06:14 crc kubenswrapper[4902]: I1009 14:06:14.379835 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p5648\" (UniqueName: \"kubernetes.io/projected/d872d3cb-00ca-449c-98e2-69eb864ee9cb-kube-api-access-p5648\") pod \"keystone-bootstrap-64rsx\" (UID: \"d872d3cb-00ca-449c-98e2-69eb864ee9cb\") " pod="openstack/keystone-bootstrap-64rsx" Oct 09 14:06:14 crc kubenswrapper[4902]: I1009 14:06:14.379856 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d872d3cb-00ca-449c-98e2-69eb864ee9cb-combined-ca-bundle\") pod \"keystone-bootstrap-64rsx\" (UID: \"d872d3cb-00ca-449c-98e2-69eb864ee9cb\") " pod="openstack/keystone-bootstrap-64rsx" Oct 09 14:06:14 crc kubenswrapper[4902]: I1009 14:06:14.379875 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb082784-fdb2-4a03-b069-0ab81e675535-config\") pod \"dnsmasq-dns-55fff446b9-xhcmw\" (UID: \"fb082784-fdb2-4a03-b069-0ab81e675535\") " pod="openstack/dnsmasq-dns-55fff446b9-xhcmw" Oct 09 14:06:14 crc kubenswrapper[4902]: I1009 14:06:14.379922 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fb082784-fdb2-4a03-b069-0ab81e675535-dns-svc\") pod \"dnsmasq-dns-55fff446b9-xhcmw\" (UID: \"fb082784-fdb2-4a03-b069-0ab81e675535\") " pod="openstack/dnsmasq-dns-55fff446b9-xhcmw" Oct 09 14:06:14 crc kubenswrapper[4902]: I1009 
14:06:14.436492 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-6f8f55c9df-4zdc6"] Oct 09 14:06:14 crc kubenswrapper[4902]: I1009 14:06:14.441578 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6f8f55c9df-4zdc6" Oct 09 14:06:14 crc kubenswrapper[4902]: I1009 14:06:14.446900 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-tn5h4" Oct 09 14:06:14 crc kubenswrapper[4902]: I1009 14:06:14.447129 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Oct 09 14:06:14 crc kubenswrapper[4902]: I1009 14:06:14.447289 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Oct 09 14:06:14 crc kubenswrapper[4902]: I1009 14:06:14.456186 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Oct 09 14:06:14 crc kubenswrapper[4902]: I1009 14:06:14.459738 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6f8f55c9df-4zdc6"] Oct 09 14:06:14 crc kubenswrapper[4902]: I1009 14:06:14.485094 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fb082784-fdb2-4a03-b069-0ab81e675535-dns-swift-storage-0\") pod \"dnsmasq-dns-55fff446b9-xhcmw\" (UID: \"fb082784-fdb2-4a03-b069-0ab81e675535\") " pod="openstack/dnsmasq-dns-55fff446b9-xhcmw" Oct 09 14:06:14 crc kubenswrapper[4902]: I1009 14:06:14.485143 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tt26r\" (UniqueName: \"kubernetes.io/projected/fb082784-fdb2-4a03-b069-0ab81e675535-kube-api-access-tt26r\") pod \"dnsmasq-dns-55fff446b9-xhcmw\" (UID: \"fb082784-fdb2-4a03-b069-0ab81e675535\") " pod="openstack/dnsmasq-dns-55fff446b9-xhcmw" Oct 09 14:06:14 crc kubenswrapper[4902]: I1009 14:06:14.485163 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fb082784-fdb2-4a03-b069-0ab81e675535-ovsdbserver-sb\") pod \"dnsmasq-dns-55fff446b9-xhcmw\" (UID: \"fb082784-fdb2-4a03-b069-0ab81e675535\") " pod="openstack/dnsmasq-dns-55fff446b9-xhcmw" Oct 09 14:06:14 crc kubenswrapper[4902]: I1009 14:06:14.485189 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d872d3cb-00ca-449c-98e2-69eb864ee9cb-scripts\") pod \"keystone-bootstrap-64rsx\" (UID: \"d872d3cb-00ca-449c-98e2-69eb864ee9cb\") " pod="openstack/keystone-bootstrap-64rsx" Oct 09 14:06:14 crc kubenswrapper[4902]: I1009 14:06:14.485207 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fb082784-fdb2-4a03-b069-0ab81e675535-ovsdbserver-nb\") pod \"dnsmasq-dns-55fff446b9-xhcmw\" (UID: \"fb082784-fdb2-4a03-b069-0ab81e675535\") " pod="openstack/dnsmasq-dns-55fff446b9-xhcmw" Oct 09 14:06:14 crc kubenswrapper[4902]: I1009 14:06:14.485226 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d872d3cb-00ca-449c-98e2-69eb864ee9cb-config-data\") pod \"keystone-bootstrap-64rsx\" (UID: \"d872d3cb-00ca-449c-98e2-69eb864ee9cb\") " pod="openstack/keystone-bootstrap-64rsx" Oct 09 14:06:14 crc kubenswrapper[4902]: I1009 14:06:14.485247 4902 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d872d3cb-00ca-449c-98e2-69eb864ee9cb-credential-keys\") pod \"keystone-bootstrap-64rsx\" (UID: \"d872d3cb-00ca-449c-98e2-69eb864ee9cb\") " pod="openstack/keystone-bootstrap-64rsx" Oct 09 14:06:14 crc kubenswrapper[4902]: I1009 14:06:14.485303 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p5648\" (UniqueName: \"kubernetes.io/projected/d872d3cb-00ca-449c-98e2-69eb864ee9cb-kube-api-access-p5648\") pod \"keystone-bootstrap-64rsx\" (UID: \"d872d3cb-00ca-449c-98e2-69eb864ee9cb\") " pod="openstack/keystone-bootstrap-64rsx" Oct 09 14:06:14 crc kubenswrapper[4902]: I1009 14:06:14.485324 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d872d3cb-00ca-449c-98e2-69eb864ee9cb-combined-ca-bundle\") pod \"keystone-bootstrap-64rsx\" (UID: \"d872d3cb-00ca-449c-98e2-69eb864ee9cb\") " pod="openstack/keystone-bootstrap-64rsx" Oct 09 14:06:14 crc kubenswrapper[4902]: I1009 14:06:14.485344 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb082784-fdb2-4a03-b069-0ab81e675535-config\") pod \"dnsmasq-dns-55fff446b9-xhcmw\" (UID: \"fb082784-fdb2-4a03-b069-0ab81e675535\") " pod="openstack/dnsmasq-dns-55fff446b9-xhcmw" Oct 09 14:06:14 crc kubenswrapper[4902]: I1009 14:06:14.485384 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fb082784-fdb2-4a03-b069-0ab81e675535-dns-svc\") pod \"dnsmasq-dns-55fff446b9-xhcmw\" (UID: \"fb082784-fdb2-4a03-b069-0ab81e675535\") " pod="openstack/dnsmasq-dns-55fff446b9-xhcmw" Oct 09 14:06:14 crc kubenswrapper[4902]: I1009 14:06:14.485437 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d872d3cb-00ca-449c-98e2-69eb864ee9cb-fernet-keys\") pod \"keystone-bootstrap-64rsx\" (UID: \"d872d3cb-00ca-449c-98e2-69eb864ee9cb\") " pod="openstack/keystone-bootstrap-64rsx" Oct 09 14:06:14 crc kubenswrapper[4902]: I1009 14:06:14.487325 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fb082784-fdb2-4a03-b069-0ab81e675535-ovsdbserver-sb\") pod \"dnsmasq-dns-55fff446b9-xhcmw\" (UID: \"fb082784-fdb2-4a03-b069-0ab81e675535\") " pod="openstack/dnsmasq-dns-55fff446b9-xhcmw" Oct 09 14:06:14 crc kubenswrapper[4902]: I1009 14:06:14.488002 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb082784-fdb2-4a03-b069-0ab81e675535-config\") pod \"dnsmasq-dns-55fff446b9-xhcmw\" (UID: \"fb082784-fdb2-4a03-b069-0ab81e675535\") " pod="openstack/dnsmasq-dns-55fff446b9-xhcmw" Oct 09 14:06:14 crc kubenswrapper[4902]: I1009 14:06:14.488507 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fb082784-fdb2-4a03-b069-0ab81e675535-ovsdbserver-nb\") pod \"dnsmasq-dns-55fff446b9-xhcmw\" (UID: \"fb082784-fdb2-4a03-b069-0ab81e675535\") " pod="openstack/dnsmasq-dns-55fff446b9-xhcmw" Oct 09 14:06:14 crc kubenswrapper[4902]: I1009 14:06:14.488516 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fb082784-fdb2-4a03-b069-0ab81e675535-dns-svc\") pod \"dnsmasq-dns-55fff446b9-xhcmw\" 
(UID: \"fb082784-fdb2-4a03-b069-0ab81e675535\") " pod="openstack/dnsmasq-dns-55fff446b9-xhcmw" Oct 09 14:06:14 crc kubenswrapper[4902]: I1009 14:06:14.488888 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fb082784-fdb2-4a03-b069-0ab81e675535-dns-swift-storage-0\") pod \"dnsmasq-dns-55fff446b9-xhcmw\" (UID: \"fb082784-fdb2-4a03-b069-0ab81e675535\") " pod="openstack/dnsmasq-dns-55fff446b9-xhcmw" Oct 09 14:06:14 crc kubenswrapper[4902]: I1009 14:06:14.504148 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d872d3cb-00ca-449c-98e2-69eb864ee9cb-credential-keys\") pod \"keystone-bootstrap-64rsx\" (UID: \"d872d3cb-00ca-449c-98e2-69eb864ee9cb\") " pod="openstack/keystone-bootstrap-64rsx" Oct 09 14:06:14 crc kubenswrapper[4902]: I1009 14:06:14.512456 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d872d3cb-00ca-449c-98e2-69eb864ee9cb-config-data\") pod \"keystone-bootstrap-64rsx\" (UID: \"d872d3cb-00ca-449c-98e2-69eb864ee9cb\") " pod="openstack/keystone-bootstrap-64rsx" Oct 09 14:06:14 crc kubenswrapper[4902]: I1009 14:06:14.514635 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d872d3cb-00ca-449c-98e2-69eb864ee9cb-combined-ca-bundle\") pod \"keystone-bootstrap-64rsx\" (UID: \"d872d3cb-00ca-449c-98e2-69eb864ee9cb\") " pod="openstack/keystone-bootstrap-64rsx" Oct 09 14:06:14 crc kubenswrapper[4902]: I1009 14:06:14.515094 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d872d3cb-00ca-449c-98e2-69eb864ee9cb-fernet-keys\") pod \"keystone-bootstrap-64rsx\" (UID: \"d872d3cb-00ca-449c-98e2-69eb864ee9cb\") " pod="openstack/keystone-bootstrap-64rsx" Oct 09 14:06:14 crc kubenswrapper[4902]: I1009 14:06:14.521634 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d872d3cb-00ca-449c-98e2-69eb864ee9cb-scripts\") pod \"keystone-bootstrap-64rsx\" (UID: \"d872d3cb-00ca-449c-98e2-69eb864ee9cb\") " pod="openstack/keystone-bootstrap-64rsx" Oct 09 14:06:14 crc kubenswrapper[4902]: I1009 14:06:14.539977 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p5648\" (UniqueName: \"kubernetes.io/projected/d872d3cb-00ca-449c-98e2-69eb864ee9cb-kube-api-access-p5648\") pod \"keystone-bootstrap-64rsx\" (UID: \"d872d3cb-00ca-449c-98e2-69eb864ee9cb\") " pod="openstack/keystone-bootstrap-64rsx" Oct 09 14:06:14 crc kubenswrapper[4902]: I1009 14:06:14.543030 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tt26r\" (UniqueName: \"kubernetes.io/projected/fb082784-fdb2-4a03-b069-0ab81e675535-kube-api-access-tt26r\") pod \"dnsmasq-dns-55fff446b9-xhcmw\" (UID: \"fb082784-fdb2-4a03-b069-0ab81e675535\") " pod="openstack/dnsmasq-dns-55fff446b9-xhcmw" Oct 09 14:06:14 crc kubenswrapper[4902]: I1009 14:06:14.560244 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55fff446b9-xhcmw" Oct 09 14:06:14 crc kubenswrapper[4902]: I1009 14:06:14.566753 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 09 14:06:14 crc kubenswrapper[4902]: I1009 14:06:14.568660 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 09 14:06:14 crc kubenswrapper[4902]: I1009 14:06:14.575555 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 09 14:06:14 crc kubenswrapper[4902]: I1009 14:06:14.575867 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 09 14:06:14 crc kubenswrapper[4902]: I1009 14:06:14.585501 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 09 14:06:14 crc kubenswrapper[4902]: I1009 14:06:14.588049 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cf5b0e71-d4ef-4d63-b6ee-15ebdcb82812-scripts\") pod \"horizon-6f8f55c9df-4zdc6\" (UID: \"cf5b0e71-d4ef-4d63-b6ee-15ebdcb82812\") " pod="openstack/horizon-6f8f55c9df-4zdc6" Oct 09 14:06:14 crc kubenswrapper[4902]: I1009 14:06:14.588184 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cf5b0e71-d4ef-4d63-b6ee-15ebdcb82812-config-data\") pod \"horizon-6f8f55c9df-4zdc6\" (UID: \"cf5b0e71-d4ef-4d63-b6ee-15ebdcb82812\") " pod="openstack/horizon-6f8f55c9df-4zdc6" Oct 09 14:06:14 crc kubenswrapper[4902]: I1009 14:06:14.588241 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cf5b0e71-d4ef-4d63-b6ee-15ebdcb82812-logs\") pod \"horizon-6f8f55c9df-4zdc6\" (UID: \"cf5b0e71-d4ef-4d63-b6ee-15ebdcb82812\") " pod="openstack/horizon-6f8f55c9df-4zdc6" Oct 09 14:06:14 crc kubenswrapper[4902]: I1009 14:06:14.588290 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-knqgw\" (UniqueName: \"kubernetes.io/projected/cf5b0e71-d4ef-4d63-b6ee-15ebdcb82812-kube-api-access-knqgw\") pod \"horizon-6f8f55c9df-4zdc6\" (UID: \"cf5b0e71-d4ef-4d63-b6ee-15ebdcb82812\") " pod="openstack/horizon-6f8f55c9df-4zdc6" Oct 09 14:06:14 crc kubenswrapper[4902]: I1009 14:06:14.588312 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/cf5b0e71-d4ef-4d63-b6ee-15ebdcb82812-horizon-secret-key\") pod \"horizon-6f8f55c9df-4zdc6\" (UID: \"cf5b0e71-d4ef-4d63-b6ee-15ebdcb82812\") " pod="openstack/horizon-6f8f55c9df-4zdc6" Oct 09 14:06:14 crc kubenswrapper[4902]: I1009 14:06:14.617727 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-64rsx" Oct 09 14:06:14 crc kubenswrapper[4902]: I1009 14:06:14.664635 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55fff446b9-xhcmw"] Oct 09 14:06:14 crc kubenswrapper[4902]: I1009 14:06:14.704558 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c5cc7c5ff-2bwjh"] Oct 09 14:06:14 crc kubenswrapper[4902]: I1009 14:06:14.704893 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8fe438bd-0b47-4495-93a8-590e4019a7c6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8fe438bd-0b47-4495-93a8-590e4019a7c6\") " pod="openstack/ceilometer-0" Oct 09 14:06:14 crc kubenswrapper[4902]: I1009 14:06:14.704951 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-knqgw\" (UniqueName: \"kubernetes.io/projected/cf5b0e71-d4ef-4d63-b6ee-15ebdcb82812-kube-api-access-knqgw\") pod \"horizon-6f8f55c9df-4zdc6\" (UID: \"cf5b0e71-d4ef-4d63-b6ee-15ebdcb82812\") " pod="openstack/horizon-6f8f55c9df-4zdc6" Oct 09 14:06:14 crc kubenswrapper[4902]: I1009 14:06:14.704981 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fe438bd-0b47-4495-93a8-590e4019a7c6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8fe438bd-0b47-4495-93a8-590e4019a7c6\") " pod="openstack/ceilometer-0" Oct 09 14:06:14 crc kubenswrapper[4902]: I1009 14:06:14.705011 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/cf5b0e71-d4ef-4d63-b6ee-15ebdcb82812-horizon-secret-key\") pod \"horizon-6f8f55c9df-4zdc6\" (UID: \"cf5b0e71-d4ef-4d63-b6ee-15ebdcb82812\") " pod="openstack/horizon-6f8f55c9df-4zdc6" Oct 09 14:06:14 crc kubenswrapper[4902]: I1009 14:06:14.705043 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cf5b0e71-d4ef-4d63-b6ee-15ebdcb82812-scripts\") pod \"horizon-6f8f55c9df-4zdc6\" (UID: \"cf5b0e71-d4ef-4d63-b6ee-15ebdcb82812\") " pod="openstack/horizon-6f8f55c9df-4zdc6" Oct 09 14:06:14 crc kubenswrapper[4902]: I1009 14:06:14.705059 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8fe438bd-0b47-4495-93a8-590e4019a7c6-scripts\") pod \"ceilometer-0\" (UID: \"8fe438bd-0b47-4495-93a8-590e4019a7c6\") " pod="openstack/ceilometer-0" Oct 09 14:06:14 crc kubenswrapper[4902]: I1009 14:06:14.705186 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ssx75\" (UniqueName: \"kubernetes.io/projected/8fe438bd-0b47-4495-93a8-590e4019a7c6-kube-api-access-ssx75\") pod \"ceilometer-0\" (UID: \"8fe438bd-0b47-4495-93a8-590e4019a7c6\") " pod="openstack/ceilometer-0" Oct 09 14:06:14 crc kubenswrapper[4902]: I1009 14:06:14.705229 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8fe438bd-0b47-4495-93a8-590e4019a7c6-run-httpd\") pod \"ceilometer-0\" (UID: \"8fe438bd-0b47-4495-93a8-590e4019a7c6\") " pod="openstack/ceilometer-0" Oct 09 14:06:14 crc kubenswrapper[4902]: I1009 14:06:14.705254 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cf5b0e71-d4ef-4d63-b6ee-15ebdcb82812-config-data\") pod \"horizon-6f8f55c9df-4zdc6\" (UID: \"cf5b0e71-d4ef-4d63-b6ee-15ebdcb82812\") " pod="openstack/horizon-6f8f55c9df-4zdc6" Oct 09 14:06:14 crc kubenswrapper[4902]: I1009 14:06:14.705282 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8fe438bd-0b47-4495-93a8-590e4019a7c6-config-data\") pod \"ceilometer-0\" (UID: \"8fe438bd-0b47-4495-93a8-590e4019a7c6\") " pod="openstack/ceilometer-0" Oct 09 14:06:14 crc kubenswrapper[4902]: I1009 14:06:14.705358 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8fe438bd-0b47-4495-93a8-590e4019a7c6-log-httpd\") pod \"ceilometer-0\" (UID: \"8fe438bd-0b47-4495-93a8-590e4019a7c6\") " pod="openstack/ceilometer-0" Oct 09 14:06:14 crc kubenswrapper[4902]: I1009 14:06:14.705394 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cf5b0e71-d4ef-4d63-b6ee-15ebdcb82812-logs\") pod \"horizon-6f8f55c9df-4zdc6\" (UID: \"cf5b0e71-d4ef-4d63-b6ee-15ebdcb82812\") " pod="openstack/horizon-6f8f55c9df-4zdc6" Oct 09 14:06:14 crc kubenswrapper[4902]: I1009 14:06:14.705830 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cf5b0e71-d4ef-4d63-b6ee-15ebdcb82812-logs\") pod \"horizon-6f8f55c9df-4zdc6\" (UID: \"cf5b0e71-d4ef-4d63-b6ee-15ebdcb82812\") " pod="openstack/horizon-6f8f55c9df-4zdc6" Oct 09 14:06:14 crc kubenswrapper[4902]: I1009 14:06:14.715285 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cf5b0e71-d4ef-4d63-b6ee-15ebdcb82812-scripts\") pod \"horizon-6f8f55c9df-4zdc6\" (UID: \"cf5b0e71-d4ef-4d63-b6ee-15ebdcb82812\") " pod="openstack/horizon-6f8f55c9df-4zdc6" Oct 09 14:06:14 crc kubenswrapper[4902]: I1009 14:06:14.716181 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cf5b0e71-d4ef-4d63-b6ee-15ebdcb82812-config-data\") pod \"horizon-6f8f55c9df-4zdc6\" (UID: \"cf5b0e71-d4ef-4d63-b6ee-15ebdcb82812\") " pod="openstack/horizon-6f8f55c9df-4zdc6" Oct 09 14:06:14 crc kubenswrapper[4902]: I1009 14:06:14.716449 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/cf5b0e71-d4ef-4d63-b6ee-15ebdcb82812-horizon-secret-key\") pod \"horizon-6f8f55c9df-4zdc6\" (UID: \"cf5b0e71-d4ef-4d63-b6ee-15ebdcb82812\") " pod="openstack/horizon-6f8f55c9df-4zdc6" Oct 09 14:06:14 crc kubenswrapper[4902]: I1009 14:06:14.731561 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c5cc7c5ff-2bwjh" Oct 09 14:06:14 crc kubenswrapper[4902]: I1009 14:06:14.733513 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 09 14:06:14 crc kubenswrapper[4902]: I1009 14:06:14.735138 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 09 14:06:14 crc kubenswrapper[4902]: I1009 14:06:14.810309 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-knqgw\" (UniqueName: \"kubernetes.io/projected/cf5b0e71-d4ef-4d63-b6ee-15ebdcb82812-kube-api-access-knqgw\") pod \"horizon-6f8f55c9df-4zdc6\" (UID: \"cf5b0e71-d4ef-4d63-b6ee-15ebdcb82812\") " pod="openstack/horizon-6f8f55c9df-4zdc6" Oct 09 14:06:14 crc kubenswrapper[4902]: I1009 14:06:14.811615 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Oct 09 14:06:14 crc kubenswrapper[4902]: I1009 14:06:14.812228 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-8szfj" Oct 09 14:06:14 crc kubenswrapper[4902]: I1009 14:06:14.812863 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 09 14:06:14 crc kubenswrapper[4902]: I1009 14:06:14.828669 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-jqrks"] Oct 09 14:06:14 crc kubenswrapper[4902]: I1009 14:06:14.847140 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6f8f55c9df-4zdc6" Oct 09 14:06:14 crc kubenswrapper[4902]: I1009 14:06:14.882166 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-jqrks" Oct 09 14:06:14 crc kubenswrapper[4902]: I1009 14:06:14.887573 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8fe438bd-0b47-4495-93a8-590e4019a7c6-run-httpd\") pod \"ceilometer-0\" (UID: \"8fe438bd-0b47-4495-93a8-590e4019a7c6\") " pod="openstack/ceilometer-0" Oct 09 14:06:14 crc kubenswrapper[4902]: I1009 14:06:14.887810 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8fe438bd-0b47-4495-93a8-590e4019a7c6-config-data\") pod \"ceilometer-0\" (UID: \"8fe438bd-0b47-4495-93a8-590e4019a7c6\") " pod="openstack/ceilometer-0" Oct 09 14:06:14 crc kubenswrapper[4902]: I1009 14:06:14.887866 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8b3f279a-1c66-4a4b-a4a9-8ebbe329a658-ovsdbserver-nb\") pod \"dnsmasq-dns-5c5cc7c5ff-2bwjh\" (UID: \"8b3f279a-1c66-4a4b-a4a9-8ebbe329a658\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-2bwjh" Oct 09 14:06:14 crc kubenswrapper[4902]: I1009 14:06:14.887938 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8fe438bd-0b47-4495-93a8-590e4019a7c6-log-httpd\") pod \"ceilometer-0\" (UID: \"8fe438bd-0b47-4495-93a8-590e4019a7c6\") " pod="openstack/ceilometer-0" Oct 09 14:06:14 crc kubenswrapper[4902]: I1009 14:06:14.888068 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8b3f279a-1c66-4a4b-a4a9-8ebbe329a658-ovsdbserver-sb\") pod \"dnsmasq-dns-5c5cc7c5ff-2bwjh\" (UID: \"8b3f279a-1c66-4a4b-a4a9-8ebbe329a658\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-2bwjh" Oct 09 14:06:14 crc kubenswrapper[4902]: I1009 14:06:14.888115 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/8b3f279a-1c66-4a4b-a4a9-8ebbe329a658-dns-swift-storage-0\") pod \"dnsmasq-dns-5c5cc7c5ff-2bwjh\" (UID: \"8b3f279a-1c66-4a4b-a4a9-8ebbe329a658\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-2bwjh" Oct 09 14:06:14 crc kubenswrapper[4902]: I1009 14:06:14.888142 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8fe438bd-0b47-4495-93a8-590e4019a7c6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8fe438bd-0b47-4495-93a8-590e4019a7c6\") " pod="openstack/ceilometer-0" Oct 09 14:06:14 crc kubenswrapper[4902]: I1009 14:06:14.888167 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b3f279a-1c66-4a4b-a4a9-8ebbe329a658-config\") pod \"dnsmasq-dns-5c5cc7c5ff-2bwjh\" (UID: \"8b3f279a-1c66-4a4b-a4a9-8ebbe329a658\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-2bwjh" Oct 09 14:06:14 crc kubenswrapper[4902]: I1009 14:06:14.888198 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fe438bd-0b47-4495-93a8-590e4019a7c6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8fe438bd-0b47-4495-93a8-590e4019a7c6\") " pod="openstack/ceilometer-0" Oct 09 14:06:14 crc kubenswrapper[4902]: I1009 14:06:14.888241 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8fe438bd-0b47-4495-93a8-590e4019a7c6-scripts\") pod \"ceilometer-0\" (UID: \"8fe438bd-0b47-4495-93a8-590e4019a7c6\") " pod="openstack/ceilometer-0" Oct 09 14:06:14 crc kubenswrapper[4902]: I1009 14:06:14.888304 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8b3f279a-1c66-4a4b-a4a9-8ebbe329a658-dns-svc\") pod \"dnsmasq-dns-5c5cc7c5ff-2bwjh\" (UID: \"8b3f279a-1c66-4a4b-a4a9-8ebbe329a658\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-2bwjh" Oct 09 14:06:14 crc kubenswrapper[4902]: I1009 14:06:14.888368 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4v29q\" (UniqueName: \"kubernetes.io/projected/8b3f279a-1c66-4a4b-a4a9-8ebbe329a658-kube-api-access-4v29q\") pod \"dnsmasq-dns-5c5cc7c5ff-2bwjh\" (UID: \"8b3f279a-1c66-4a4b-a4a9-8ebbe329a658\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-2bwjh" Oct 09 14:06:14 crc kubenswrapper[4902]: I1009 14:06:14.908276 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ssx75\" (UniqueName: \"kubernetes.io/projected/8fe438bd-0b47-4495-93a8-590e4019a7c6-kube-api-access-ssx75\") pod \"ceilometer-0\" (UID: \"8fe438bd-0b47-4495-93a8-590e4019a7c6\") " pod="openstack/ceilometer-0" Oct 09 14:06:14 crc kubenswrapper[4902]: I1009 14:06:14.912133 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Oct 09 14:06:14 crc kubenswrapper[4902]: I1009 14:06:14.918564 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Oct 09 14:06:14 crc kubenswrapper[4902]: I1009 14:06:14.938951 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-xphx7" Oct 09 14:06:14 crc kubenswrapper[4902]: I1009 14:06:14.941634 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/8fe438bd-0b47-4495-93a8-590e4019a7c6-run-httpd\") pod \"ceilometer-0\" (UID: \"8fe438bd-0b47-4495-93a8-590e4019a7c6\") " pod="openstack/ceilometer-0" Oct 09 14:06:14 crc kubenswrapper[4902]: I1009 14:06:14.944458 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8fe438bd-0b47-4495-93a8-590e4019a7c6-log-httpd\") pod \"ceilometer-0\" (UID: \"8fe438bd-0b47-4495-93a8-590e4019a7c6\") " pod="openstack/ceilometer-0" Oct 09 14:06:14 crc kubenswrapper[4902]: I1009 14:06:14.957753 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fe438bd-0b47-4495-93a8-590e4019a7c6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8fe438bd-0b47-4495-93a8-590e4019a7c6\") " pod="openstack/ceilometer-0" Oct 09 14:06:14 crc kubenswrapper[4902]: I1009 14:06:14.957915 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8fe438bd-0b47-4495-93a8-590e4019a7c6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8fe438bd-0b47-4495-93a8-590e4019a7c6\") " pod="openstack/ceilometer-0" Oct 09 14:06:14 crc kubenswrapper[4902]: I1009 14:06:14.961635 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 09 14:06:15 crc kubenswrapper[4902]: I1009 14:06:15.011930 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8fe438bd-0b47-4495-93a8-590e4019a7c6-config-data\") pod \"ceilometer-0\" (UID: \"8fe438bd-0b47-4495-93a8-590e4019a7c6\") " pod="openstack/ceilometer-0" Oct 09 14:06:15 crc kubenswrapper[4902]: I1009 14:06:15.014496 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8fe438bd-0b47-4495-93a8-590e4019a7c6-scripts\") pod \"ceilometer-0\" (UID: \"8fe438bd-0b47-4495-93a8-590e4019a7c6\") " pod="openstack/ceilometer-0" Oct 09 14:06:15 crc kubenswrapper[4902]: I1009 14:06:15.063294 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ssx75\" (UniqueName: \"kubernetes.io/projected/8fe438bd-0b47-4495-93a8-590e4019a7c6-kube-api-access-ssx75\") pod \"ceilometer-0\" (UID: \"8fe438bd-0b47-4495-93a8-590e4019a7c6\") " pod="openstack/ceilometer-0" Oct 09 14:06:15 crc kubenswrapper[4902]: I1009 14:06:15.125290 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8b3f279a-1c66-4a4b-a4a9-8ebbe329a658-ovsdbserver-sb\") pod \"dnsmasq-dns-5c5cc7c5ff-2bwjh\" (UID: \"8b3f279a-1c66-4a4b-a4a9-8ebbe329a658\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-2bwjh" Oct 09 14:06:15 crc kubenswrapper[4902]: I1009 14:06:15.125351 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9l6pp\" (UniqueName: \"kubernetes.io/projected/ee7c6723-ebca-46a0-b74f-ad7322603a8c-kube-api-access-9l6pp\") pod \"glance-default-external-api-0\" (UID: \"ee7c6723-ebca-46a0-b74f-ad7322603a8c\") " pod="openstack/glance-default-external-api-0" Oct 09 14:06:15 crc kubenswrapper[4902]: I1009 14:06:15.125388 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8b3f279a-1c66-4a4b-a4a9-8ebbe329a658-dns-swift-storage-0\") pod \"dnsmasq-dns-5c5cc7c5ff-2bwjh\" (UID: 
\"8b3f279a-1c66-4a4b-a4a9-8ebbe329a658\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-2bwjh" Oct 09 14:06:15 crc kubenswrapper[4902]: I1009 14:06:15.125432 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b3f279a-1c66-4a4b-a4a9-8ebbe329a658-config\") pod \"dnsmasq-dns-5c5cc7c5ff-2bwjh\" (UID: \"8b3f279a-1c66-4a4b-a4a9-8ebbe329a658\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-2bwjh" Oct 09 14:06:15 crc kubenswrapper[4902]: I1009 14:06:15.125466 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee7c6723-ebca-46a0-b74f-ad7322603a8c-config-data\") pod \"glance-default-external-api-0\" (UID: \"ee7c6723-ebca-46a0-b74f-ad7322603a8c\") " pod="openstack/glance-default-external-api-0" Oct 09 14:06:15 crc kubenswrapper[4902]: I1009 14:06:15.125514 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ee7c6723-ebca-46a0-b74f-ad7322603a8c-scripts\") pod \"glance-default-external-api-0\" (UID: \"ee7c6723-ebca-46a0-b74f-ad7322603a8c\") " pod="openstack/glance-default-external-api-0" Oct 09 14:06:15 crc kubenswrapper[4902]: I1009 14:06:15.125546 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8b3f279a-1c66-4a4b-a4a9-8ebbe329a658-dns-svc\") pod \"dnsmasq-dns-5c5cc7c5ff-2bwjh\" (UID: \"8b3f279a-1c66-4a4b-a4a9-8ebbe329a658\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-2bwjh" Oct 09 14:06:15 crc kubenswrapper[4902]: I1009 14:06:15.125590 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ee7c6723-ebca-46a0-b74f-ad7322603a8c-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"ee7c6723-ebca-46a0-b74f-ad7322603a8c\") " pod="openstack/glance-default-external-api-0" Oct 09 14:06:15 crc kubenswrapper[4902]: I1009 14:06:15.125613 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4v29q\" (UniqueName: \"kubernetes.io/projected/8b3f279a-1c66-4a4b-a4a9-8ebbe329a658-kube-api-access-4v29q\") pod \"dnsmasq-dns-5c5cc7c5ff-2bwjh\" (UID: \"8b3f279a-1c66-4a4b-a4a9-8ebbe329a658\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-2bwjh" Oct 09 14:06:15 crc kubenswrapper[4902]: I1009 14:06:15.125679 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8b3f279a-1c66-4a4b-a4a9-8ebbe329a658-ovsdbserver-nb\") pod \"dnsmasq-dns-5c5cc7c5ff-2bwjh\" (UID: \"8b3f279a-1c66-4a4b-a4a9-8ebbe329a658\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-2bwjh" Oct 09 14:06:15 crc kubenswrapper[4902]: I1009 14:06:15.125705 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ee7c6723-ebca-46a0-b74f-ad7322603a8c-logs\") pod \"glance-default-external-api-0\" (UID: \"ee7c6723-ebca-46a0-b74f-ad7322603a8c\") " pod="openstack/glance-default-external-api-0" Oct 09 14:06:15 crc kubenswrapper[4902]: I1009 14:06:15.125739 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"ee7c6723-ebca-46a0-b74f-ad7322603a8c\") " 
pod="openstack/glance-default-external-api-0" Oct 09 14:06:15 crc kubenswrapper[4902]: I1009 14:06:15.125785 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee7c6723-ebca-46a0-b74f-ad7322603a8c-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"ee7c6723-ebca-46a0-b74f-ad7322603a8c\") " pod="openstack/glance-default-external-api-0" Oct 09 14:06:15 crc kubenswrapper[4902]: I1009 14:06:15.141466 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8b3f279a-1c66-4a4b-a4a9-8ebbe329a658-dns-swift-storage-0\") pod \"dnsmasq-dns-5c5cc7c5ff-2bwjh\" (UID: \"8b3f279a-1c66-4a4b-a4a9-8ebbe329a658\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-2bwjh" Oct 09 14:06:15 crc kubenswrapper[4902]: I1009 14:06:15.142161 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b3f279a-1c66-4a4b-a4a9-8ebbe329a658-config\") pod \"dnsmasq-dns-5c5cc7c5ff-2bwjh\" (UID: \"8b3f279a-1c66-4a4b-a4a9-8ebbe329a658\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-2bwjh" Oct 09 14:06:15 crc kubenswrapper[4902]: I1009 14:06:15.142891 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8b3f279a-1c66-4a4b-a4a9-8ebbe329a658-dns-svc\") pod \"dnsmasq-dns-5c5cc7c5ff-2bwjh\" (UID: \"8b3f279a-1c66-4a4b-a4a9-8ebbe329a658\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-2bwjh" Oct 09 14:06:15 crc kubenswrapper[4902]: I1009 14:06:15.143853 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8b3f279a-1c66-4a4b-a4a9-8ebbe329a658-ovsdbserver-nb\") pod \"dnsmasq-dns-5c5cc7c5ff-2bwjh\" (UID: \"8b3f279a-1c66-4a4b-a4a9-8ebbe329a658\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-2bwjh" Oct 09 14:06:15 crc kubenswrapper[4902]: I1009 14:06:15.147500 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8b3f279a-1c66-4a4b-a4a9-8ebbe329a658-ovsdbserver-sb\") pod \"dnsmasq-dns-5c5cc7c5ff-2bwjh\" (UID: \"8b3f279a-1c66-4a4b-a4a9-8ebbe329a658\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-2bwjh" Oct 09 14:06:15 crc kubenswrapper[4902]: I1009 14:06:15.165053 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c5cc7c5ff-2bwjh"] Oct 09 14:06:15 crc kubenswrapper[4902]: I1009 14:06:15.201389 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4v29q\" (UniqueName: \"kubernetes.io/projected/8b3f279a-1c66-4a4b-a4a9-8ebbe329a658-kube-api-access-4v29q\") pod \"dnsmasq-dns-5c5cc7c5ff-2bwjh\" (UID: \"8b3f279a-1c66-4a4b-a4a9-8ebbe329a658\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-2bwjh" Oct 09 14:06:15 crc kubenswrapper[4902]: I1009 14:06:15.205915 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-659b68cf89-l9cwg"] Oct 09 14:06:15 crc kubenswrapper[4902]: I1009 14:06:15.208278 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-659b68cf89-l9cwg" Oct 09 14:06:15 crc kubenswrapper[4902]: I1009 14:06:15.232660 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d3545e73-7c0f-4e1a-b012-da9e7f35a0b5-scripts\") pod \"placement-db-sync-jqrks\" (UID: \"d3545e73-7c0f-4e1a-b012-da9e7f35a0b5\") " pod="openstack/placement-db-sync-jqrks" Oct 09 14:06:15 crc kubenswrapper[4902]: I1009 14:06:15.232751 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ee7c6723-ebca-46a0-b74f-ad7322603a8c-logs\") pod \"glance-default-external-api-0\" (UID: \"ee7c6723-ebca-46a0-b74f-ad7322603a8c\") " pod="openstack/glance-default-external-api-0" Oct 09 14:06:15 crc kubenswrapper[4902]: I1009 14:06:15.232790 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"ee7c6723-ebca-46a0-b74f-ad7322603a8c\") " pod="openstack/glance-default-external-api-0" Oct 09 14:06:15 crc kubenswrapper[4902]: I1009 14:06:15.232827 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee7c6723-ebca-46a0-b74f-ad7322603a8c-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"ee7c6723-ebca-46a0-b74f-ad7322603a8c\") " pod="openstack/glance-default-external-api-0" Oct 09 14:06:15 crc kubenswrapper[4902]: I1009 14:06:15.232855 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d3545e73-7c0f-4e1a-b012-da9e7f35a0b5-logs\") pod \"placement-db-sync-jqrks\" (UID: \"d3545e73-7c0f-4e1a-b012-da9e7f35a0b5\") " pod="openstack/placement-db-sync-jqrks" Oct 09 14:06:15 crc kubenswrapper[4902]: I1009 14:06:15.232904 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9l6pp\" (UniqueName: \"kubernetes.io/projected/ee7c6723-ebca-46a0-b74f-ad7322603a8c-kube-api-access-9l6pp\") pod \"glance-default-external-api-0\" (UID: \"ee7c6723-ebca-46a0-b74f-ad7322603a8c\") " pod="openstack/glance-default-external-api-0" Oct 09 14:06:15 crc kubenswrapper[4902]: I1009 14:06:15.232968 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee7c6723-ebca-46a0-b74f-ad7322603a8c-config-data\") pod \"glance-default-external-api-0\" (UID: \"ee7c6723-ebca-46a0-b74f-ad7322603a8c\") " pod="openstack/glance-default-external-api-0" Oct 09 14:06:15 crc kubenswrapper[4902]: I1009 14:06:15.233009 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3545e73-7c0f-4e1a-b012-da9e7f35a0b5-config-data\") pod \"placement-db-sync-jqrks\" (UID: \"d3545e73-7c0f-4e1a-b012-da9e7f35a0b5\") " pod="openstack/placement-db-sync-jqrks" Oct 09 14:06:15 crc kubenswrapper[4902]: I1009 14:06:15.233036 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ee7c6723-ebca-46a0-b74f-ad7322603a8c-scripts\") pod \"glance-default-external-api-0\" (UID: \"ee7c6723-ebca-46a0-b74f-ad7322603a8c\") " pod="openstack/glance-default-external-api-0" Oct 09 14:06:15 crc kubenswrapper[4902]: 
I1009 14:06:15.233083 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3545e73-7c0f-4e1a-b012-da9e7f35a0b5-combined-ca-bundle\") pod \"placement-db-sync-jqrks\" (UID: \"d3545e73-7c0f-4e1a-b012-da9e7f35a0b5\") " pod="openstack/placement-db-sync-jqrks" Oct 09 14:06:15 crc kubenswrapper[4902]: I1009 14:06:15.233110 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-87l49\" (UniqueName: \"kubernetes.io/projected/d3545e73-7c0f-4e1a-b012-da9e7f35a0b5-kube-api-access-87l49\") pod \"placement-db-sync-jqrks\" (UID: \"d3545e73-7c0f-4e1a-b012-da9e7f35a0b5\") " pod="openstack/placement-db-sync-jqrks" Oct 09 14:06:15 crc kubenswrapper[4902]: I1009 14:06:15.233134 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ee7c6723-ebca-46a0-b74f-ad7322603a8c-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"ee7c6723-ebca-46a0-b74f-ad7322603a8c\") " pod="openstack/glance-default-external-api-0" Oct 09 14:06:15 crc kubenswrapper[4902]: I1009 14:06:15.233862 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ee7c6723-ebca-46a0-b74f-ad7322603a8c-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"ee7c6723-ebca-46a0-b74f-ad7322603a8c\") " pod="openstack/glance-default-external-api-0" Oct 09 14:06:15 crc kubenswrapper[4902]: I1009 14:06:15.234377 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ee7c6723-ebca-46a0-b74f-ad7322603a8c-logs\") pod \"glance-default-external-api-0\" (UID: \"ee7c6723-ebca-46a0-b74f-ad7322603a8c\") " pod="openstack/glance-default-external-api-0" Oct 09 14:06:15 crc kubenswrapper[4902]: I1009 14:06:15.241046 4902 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"ee7c6723-ebca-46a0-b74f-ad7322603a8c\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/glance-default-external-api-0" Oct 09 14:06:15 crc kubenswrapper[4902]: I1009 14:06:15.260596 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ee7c6723-ebca-46a0-b74f-ad7322603a8c-scripts\") pod \"glance-default-external-api-0\" (UID: \"ee7c6723-ebca-46a0-b74f-ad7322603a8c\") " pod="openstack/glance-default-external-api-0" Oct 09 14:06:15 crc kubenswrapper[4902]: I1009 14:06:15.270498 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-jqrks"] Oct 09 14:06:15 crc kubenswrapper[4902]: I1009 14:06:15.291576 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee7c6723-ebca-46a0-b74f-ad7322603a8c-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"ee7c6723-ebca-46a0-b74f-ad7322603a8c\") " pod="openstack/glance-default-external-api-0" Oct 09 14:06:15 crc kubenswrapper[4902]: I1009 14:06:15.291667 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-659b68cf89-l9cwg"] Oct 09 14:06:15 crc kubenswrapper[4902]: I1009 14:06:15.297807 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c5cc7c5ff-2bwjh"] Oct 09 14:06:15 crc 
kubenswrapper[4902]: I1009 14:06:15.298590 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c5cc7c5ff-2bwjh" Oct 09 14:06:15 crc kubenswrapper[4902]: I1009 14:06:15.326805 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee7c6723-ebca-46a0-b74f-ad7322603a8c-config-data\") pod \"glance-default-external-api-0\" (UID: \"ee7c6723-ebca-46a0-b74f-ad7322603a8c\") " pod="openstack/glance-default-external-api-0" Oct 09 14:06:15 crc kubenswrapper[4902]: I1009 14:06:15.327623 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9l6pp\" (UniqueName: \"kubernetes.io/projected/ee7c6723-ebca-46a0-b74f-ad7322603a8c-kube-api-access-9l6pp\") pod \"glance-default-external-api-0\" (UID: \"ee7c6723-ebca-46a0-b74f-ad7322603a8c\") " pod="openstack/glance-default-external-api-0" Oct 09 14:06:15 crc kubenswrapper[4902]: I1009 14:06:15.336660 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-87l49\" (UniqueName: \"kubernetes.io/projected/d3545e73-7c0f-4e1a-b012-da9e7f35a0b5-kube-api-access-87l49\") pod \"placement-db-sync-jqrks\" (UID: \"d3545e73-7c0f-4e1a-b012-da9e7f35a0b5\") " pod="openstack/placement-db-sync-jqrks" Oct 09 14:06:15 crc kubenswrapper[4902]: I1009 14:06:15.337841 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3545e73-7c0f-4e1a-b012-da9e7f35a0b5-combined-ca-bundle\") pod \"placement-db-sync-jqrks\" (UID: \"d3545e73-7c0f-4e1a-b012-da9e7f35a0b5\") " pod="openstack/placement-db-sync-jqrks" Oct 09 14:06:15 crc kubenswrapper[4902]: I1009 14:06:15.338052 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2bfb704d-cf57-4181-b6d1-c5884492984a-logs\") pod \"horizon-659b68cf89-l9cwg\" (UID: \"2bfb704d-cf57-4181-b6d1-c5884492984a\") " pod="openstack/horizon-659b68cf89-l9cwg" Oct 09 14:06:15 crc kubenswrapper[4902]: I1009 14:06:15.338194 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qqlvt\" (UniqueName: \"kubernetes.io/projected/2bfb704d-cf57-4181-b6d1-c5884492984a-kube-api-access-qqlvt\") pod \"horizon-659b68cf89-l9cwg\" (UID: \"2bfb704d-cf57-4181-b6d1-c5884492984a\") " pod="openstack/horizon-659b68cf89-l9cwg" Oct 09 14:06:15 crc kubenswrapper[4902]: I1009 14:06:15.338293 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d3545e73-7c0f-4e1a-b012-da9e7f35a0b5-scripts\") pod \"placement-db-sync-jqrks\" (UID: \"d3545e73-7c0f-4e1a-b012-da9e7f35a0b5\") " pod="openstack/placement-db-sync-jqrks" Oct 09 14:06:15 crc kubenswrapper[4902]: I1009 14:06:15.338533 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d3545e73-7c0f-4e1a-b012-da9e7f35a0b5-logs\") pod \"placement-db-sync-jqrks\" (UID: \"d3545e73-7c0f-4e1a-b012-da9e7f35a0b5\") " pod="openstack/placement-db-sync-jqrks" Oct 09 14:06:15 crc kubenswrapper[4902]: I1009 14:06:15.339132 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/2bfb704d-cf57-4181-b6d1-c5884492984a-horizon-secret-key\") pod \"horizon-659b68cf89-l9cwg\" (UID: 
\"2bfb704d-cf57-4181-b6d1-c5884492984a\") " pod="openstack/horizon-659b68cf89-l9cwg" Oct 09 14:06:15 crc kubenswrapper[4902]: I1009 14:06:15.340109 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2bfb704d-cf57-4181-b6d1-c5884492984a-config-data\") pod \"horizon-659b68cf89-l9cwg\" (UID: \"2bfb704d-cf57-4181-b6d1-c5884492984a\") " pod="openstack/horizon-659b68cf89-l9cwg" Oct 09 14:06:15 crc kubenswrapper[4902]: I1009 14:06:15.340206 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2bfb704d-cf57-4181-b6d1-c5884492984a-scripts\") pod \"horizon-659b68cf89-l9cwg\" (UID: \"2bfb704d-cf57-4181-b6d1-c5884492984a\") " pod="openstack/horizon-659b68cf89-l9cwg" Oct 09 14:06:15 crc kubenswrapper[4902]: I1009 14:06:15.340376 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3545e73-7c0f-4e1a-b012-da9e7f35a0b5-config-data\") pod \"placement-db-sync-jqrks\" (UID: \"d3545e73-7c0f-4e1a-b012-da9e7f35a0b5\") " pod="openstack/placement-db-sync-jqrks" Oct 09 14:06:15 crc kubenswrapper[4902]: I1009 14:06:15.342378 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d3545e73-7c0f-4e1a-b012-da9e7f35a0b5-logs\") pod \"placement-db-sync-jqrks\" (UID: \"d3545e73-7c0f-4e1a-b012-da9e7f35a0b5\") " pod="openstack/placement-db-sync-jqrks" Oct 09 14:06:15 crc kubenswrapper[4902]: I1009 14:06:15.358603 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8b5c85b87-pfr9v"] Oct 09 14:06:15 crc kubenswrapper[4902]: I1009 14:06:15.360184 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 09 14:06:15 crc kubenswrapper[4902]: I1009 14:06:15.360820 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d3545e73-7c0f-4e1a-b012-da9e7f35a0b5-scripts\") pod \"placement-db-sync-jqrks\" (UID: \"d3545e73-7c0f-4e1a-b012-da9e7f35a0b5\") " pod="openstack/placement-db-sync-jqrks" Oct 09 14:06:15 crc kubenswrapper[4902]: I1009 14:06:15.361096 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3545e73-7c0f-4e1a-b012-da9e7f35a0b5-combined-ca-bundle\") pod \"placement-db-sync-jqrks\" (UID: \"d3545e73-7c0f-4e1a-b012-da9e7f35a0b5\") " pod="openstack/placement-db-sync-jqrks" Oct 09 14:06:15 crc kubenswrapper[4902]: I1009 14:06:15.361140 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8b5c85b87-pfr9v" Oct 09 14:06:15 crc kubenswrapper[4902]: I1009 14:06:15.361326 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"ee7c6723-ebca-46a0-b74f-ad7322603a8c\") " pod="openstack/glance-default-external-api-0" Oct 09 14:06:15 crc kubenswrapper[4902]: I1009 14:06:15.370687 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3545e73-7c0f-4e1a-b012-da9e7f35a0b5-config-data\") pod \"placement-db-sync-jqrks\" (UID: \"d3545e73-7c0f-4e1a-b012-da9e7f35a0b5\") " pod="openstack/placement-db-sync-jqrks" Oct 09 14:06:15 crc kubenswrapper[4902]: I1009 14:06:15.385762 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8b5c85b87-pfr9v"] Oct 09 14:06:15 crc kubenswrapper[4902]: I1009 14:06:15.391853 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-87l49\" (UniqueName: \"kubernetes.io/projected/d3545e73-7c0f-4e1a-b012-da9e7f35a0b5-kube-api-access-87l49\") pod \"placement-db-sync-jqrks\" (UID: \"d3545e73-7c0f-4e1a-b012-da9e7f35a0b5\") " pod="openstack/placement-db-sync-jqrks" Oct 09 14:06:15 crc kubenswrapper[4902]: I1009 14:06:15.442505 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1f63d2cd-4740-45c6-a94b-26899b7ffa86-dns-svc\") pod \"dnsmasq-dns-8b5c85b87-pfr9v\" (UID: \"1f63d2cd-4740-45c6-a94b-26899b7ffa86\") " pod="openstack/dnsmasq-dns-8b5c85b87-pfr9v" Oct 09 14:06:15 crc kubenswrapper[4902]: I1009 14:06:15.442595 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1f63d2cd-4740-45c6-a94b-26899b7ffa86-dns-swift-storage-0\") pod \"dnsmasq-dns-8b5c85b87-pfr9v\" (UID: \"1f63d2cd-4740-45c6-a94b-26899b7ffa86\") " pod="openstack/dnsmasq-dns-8b5c85b87-pfr9v" Oct 09 14:06:15 crc kubenswrapper[4902]: I1009 14:06:15.442629 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/2bfb704d-cf57-4181-b6d1-c5884492984a-horizon-secret-key\") pod \"horizon-659b68cf89-l9cwg\" (UID: \"2bfb704d-cf57-4181-b6d1-c5884492984a\") " pod="openstack/horizon-659b68cf89-l9cwg" Oct 09 14:06:15 crc kubenswrapper[4902]: I1009 14:06:15.442670 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2bfb704d-cf57-4181-b6d1-c5884492984a-config-data\") pod \"horizon-659b68cf89-l9cwg\" (UID: \"2bfb704d-cf57-4181-b6d1-c5884492984a\") " pod="openstack/horizon-659b68cf89-l9cwg" Oct 09 14:06:15 crc kubenswrapper[4902]: I1009 14:06:15.442693 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2bfb704d-cf57-4181-b6d1-c5884492984a-scripts\") pod \"horizon-659b68cf89-l9cwg\" (UID: \"2bfb704d-cf57-4181-b6d1-c5884492984a\") " pod="openstack/horizon-659b68cf89-l9cwg" Oct 09 14:06:15 crc kubenswrapper[4902]: I1009 14:06:15.442734 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1f63d2cd-4740-45c6-a94b-26899b7ffa86-ovsdbserver-nb\") pod 
\"dnsmasq-dns-8b5c85b87-pfr9v\" (UID: \"1f63d2cd-4740-45c6-a94b-26899b7ffa86\") " pod="openstack/dnsmasq-dns-8b5c85b87-pfr9v" Oct 09 14:06:15 crc kubenswrapper[4902]: I1009 14:06:15.442798 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2bfb704d-cf57-4181-b6d1-c5884492984a-logs\") pod \"horizon-659b68cf89-l9cwg\" (UID: \"2bfb704d-cf57-4181-b6d1-c5884492984a\") " pod="openstack/horizon-659b68cf89-l9cwg" Oct 09 14:06:15 crc kubenswrapper[4902]: I1009 14:06:15.442822 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f63d2cd-4740-45c6-a94b-26899b7ffa86-config\") pod \"dnsmasq-dns-8b5c85b87-pfr9v\" (UID: \"1f63d2cd-4740-45c6-a94b-26899b7ffa86\") " pod="openstack/dnsmasq-dns-8b5c85b87-pfr9v" Oct 09 14:06:15 crc kubenswrapper[4902]: I1009 14:06:15.442859 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qqlvt\" (UniqueName: \"kubernetes.io/projected/2bfb704d-cf57-4181-b6d1-c5884492984a-kube-api-access-qqlvt\") pod \"horizon-659b68cf89-l9cwg\" (UID: \"2bfb704d-cf57-4181-b6d1-c5884492984a\") " pod="openstack/horizon-659b68cf89-l9cwg" Oct 09 14:06:15 crc kubenswrapper[4902]: I1009 14:06:15.442890 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1f63d2cd-4740-45c6-a94b-26899b7ffa86-ovsdbserver-sb\") pod \"dnsmasq-dns-8b5c85b87-pfr9v\" (UID: \"1f63d2cd-4740-45c6-a94b-26899b7ffa86\") " pod="openstack/dnsmasq-dns-8b5c85b87-pfr9v" Oct 09 14:06:15 crc kubenswrapper[4902]: I1009 14:06:15.442924 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-64qp6\" (UniqueName: \"kubernetes.io/projected/1f63d2cd-4740-45c6-a94b-26899b7ffa86-kube-api-access-64qp6\") pod \"dnsmasq-dns-8b5c85b87-pfr9v\" (UID: \"1f63d2cd-4740-45c6-a94b-26899b7ffa86\") " pod="openstack/dnsmasq-dns-8b5c85b87-pfr9v" Oct 09 14:06:15 crc kubenswrapper[4902]: I1009 14:06:15.444662 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2bfb704d-cf57-4181-b6d1-c5884492984a-logs\") pod \"horizon-659b68cf89-l9cwg\" (UID: \"2bfb704d-cf57-4181-b6d1-c5884492984a\") " pod="openstack/horizon-659b68cf89-l9cwg" Oct 09 14:06:15 crc kubenswrapper[4902]: I1009 14:06:15.445793 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2bfb704d-cf57-4181-b6d1-c5884492984a-scripts\") pod \"horizon-659b68cf89-l9cwg\" (UID: \"2bfb704d-cf57-4181-b6d1-c5884492984a\") " pod="openstack/horizon-659b68cf89-l9cwg" Oct 09 14:06:15 crc kubenswrapper[4902]: I1009 14:06:15.445966 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2bfb704d-cf57-4181-b6d1-c5884492984a-config-data\") pod \"horizon-659b68cf89-l9cwg\" (UID: \"2bfb704d-cf57-4181-b6d1-c5884492984a\") " pod="openstack/horizon-659b68cf89-l9cwg" Oct 09 14:06:15 crc kubenswrapper[4902]: I1009 14:06:15.456155 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/2bfb704d-cf57-4181-b6d1-c5884492984a-horizon-secret-key\") pod \"horizon-659b68cf89-l9cwg\" (UID: \"2bfb704d-cf57-4181-b6d1-c5884492984a\") " pod="openstack/horizon-659b68cf89-l9cwg" 
Oct 09 14:06:15 crc kubenswrapper[4902]: I1009 14:06:15.466325 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qqlvt\" (UniqueName: \"kubernetes.io/projected/2bfb704d-cf57-4181-b6d1-c5884492984a-kube-api-access-qqlvt\") pod \"horizon-659b68cf89-l9cwg\" (UID: \"2bfb704d-cf57-4181-b6d1-c5884492984a\") " pod="openstack/horizon-659b68cf89-l9cwg" Oct 09 14:06:15 crc kubenswrapper[4902]: I1009 14:06:15.544978 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1f63d2cd-4740-45c6-a94b-26899b7ffa86-dns-svc\") pod \"dnsmasq-dns-8b5c85b87-pfr9v\" (UID: \"1f63d2cd-4740-45c6-a94b-26899b7ffa86\") " pod="openstack/dnsmasq-dns-8b5c85b87-pfr9v" Oct 09 14:06:15 crc kubenswrapper[4902]: I1009 14:06:15.545041 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1f63d2cd-4740-45c6-a94b-26899b7ffa86-dns-swift-storage-0\") pod \"dnsmasq-dns-8b5c85b87-pfr9v\" (UID: \"1f63d2cd-4740-45c6-a94b-26899b7ffa86\") " pod="openstack/dnsmasq-dns-8b5c85b87-pfr9v" Oct 09 14:06:15 crc kubenswrapper[4902]: I1009 14:06:15.545141 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1f63d2cd-4740-45c6-a94b-26899b7ffa86-ovsdbserver-nb\") pod \"dnsmasq-dns-8b5c85b87-pfr9v\" (UID: \"1f63d2cd-4740-45c6-a94b-26899b7ffa86\") " pod="openstack/dnsmasq-dns-8b5c85b87-pfr9v" Oct 09 14:06:15 crc kubenswrapper[4902]: I1009 14:06:15.545208 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f63d2cd-4740-45c6-a94b-26899b7ffa86-config\") pod \"dnsmasq-dns-8b5c85b87-pfr9v\" (UID: \"1f63d2cd-4740-45c6-a94b-26899b7ffa86\") " pod="openstack/dnsmasq-dns-8b5c85b87-pfr9v" Oct 09 14:06:15 crc kubenswrapper[4902]: I1009 14:06:15.545259 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1f63d2cd-4740-45c6-a94b-26899b7ffa86-ovsdbserver-sb\") pod \"dnsmasq-dns-8b5c85b87-pfr9v\" (UID: \"1f63d2cd-4740-45c6-a94b-26899b7ffa86\") " pod="openstack/dnsmasq-dns-8b5c85b87-pfr9v" Oct 09 14:06:15 crc kubenswrapper[4902]: I1009 14:06:15.545290 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-64qp6\" (UniqueName: \"kubernetes.io/projected/1f63d2cd-4740-45c6-a94b-26899b7ffa86-kube-api-access-64qp6\") pod \"dnsmasq-dns-8b5c85b87-pfr9v\" (UID: \"1f63d2cd-4740-45c6-a94b-26899b7ffa86\") " pod="openstack/dnsmasq-dns-8b5c85b87-pfr9v" Oct 09 14:06:15 crc kubenswrapper[4902]: I1009 14:06:15.546657 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1f63d2cd-4740-45c6-a94b-26899b7ffa86-dns-svc\") pod \"dnsmasq-dns-8b5c85b87-pfr9v\" (UID: \"1f63d2cd-4740-45c6-a94b-26899b7ffa86\") " pod="openstack/dnsmasq-dns-8b5c85b87-pfr9v" Oct 09 14:06:15 crc kubenswrapper[4902]: I1009 14:06:15.547335 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1f63d2cd-4740-45c6-a94b-26899b7ffa86-ovsdbserver-nb\") pod \"dnsmasq-dns-8b5c85b87-pfr9v\" (UID: \"1f63d2cd-4740-45c6-a94b-26899b7ffa86\") " pod="openstack/dnsmasq-dns-8b5c85b87-pfr9v" Oct 09 14:06:15 crc kubenswrapper[4902]: I1009 14:06:15.547837 4902 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1f63d2cd-4740-45c6-a94b-26899b7ffa86-dns-swift-storage-0\") pod \"dnsmasq-dns-8b5c85b87-pfr9v\" (UID: \"1f63d2cd-4740-45c6-a94b-26899b7ffa86\") " pod="openstack/dnsmasq-dns-8b5c85b87-pfr9v" Oct 09 14:06:15 crc kubenswrapper[4902]: I1009 14:06:15.549667 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f63d2cd-4740-45c6-a94b-26899b7ffa86-config\") pod \"dnsmasq-dns-8b5c85b87-pfr9v\" (UID: \"1f63d2cd-4740-45c6-a94b-26899b7ffa86\") " pod="openstack/dnsmasq-dns-8b5c85b87-pfr9v" Oct 09 14:06:15 crc kubenswrapper[4902]: I1009 14:06:15.550154 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1f63d2cd-4740-45c6-a94b-26899b7ffa86-ovsdbserver-sb\") pod \"dnsmasq-dns-8b5c85b87-pfr9v\" (UID: \"1f63d2cd-4740-45c6-a94b-26899b7ffa86\") " pod="openstack/dnsmasq-dns-8b5c85b87-pfr9v" Oct 09 14:06:15 crc kubenswrapper[4902]: I1009 14:06:15.557843 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 09 14:06:15 crc kubenswrapper[4902]: I1009 14:06:15.570582 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-64qp6\" (UniqueName: \"kubernetes.io/projected/1f63d2cd-4740-45c6-a94b-26899b7ffa86-kube-api-access-64qp6\") pod \"dnsmasq-dns-8b5c85b87-pfr9v\" (UID: \"1f63d2cd-4740-45c6-a94b-26899b7ffa86\") " pod="openstack/dnsmasq-dns-8b5c85b87-pfr9v" Oct 09 14:06:15 crc kubenswrapper[4902]: I1009 14:06:15.639112 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55fff446b9-xhcmw"] Oct 09 14:06:15 crc kubenswrapper[4902]: W1009 14:06:15.640187 4902 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfb082784_fdb2_4a03_b069_0ab81e675535.slice/crio-e3574fd1384e459b05292682a1c78d51cdd27795693aae70d23bacf13f3ebd3d WatchSource:0}: Error finding container e3574fd1384e459b05292682a1c78d51cdd27795693aae70d23bacf13f3ebd3d: Status 404 returned error can't find the container with id e3574fd1384e459b05292682a1c78d51cdd27795693aae70d23bacf13f3ebd3d Oct 09 14:06:15 crc kubenswrapper[4902]: I1009 14:06:15.667466 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-jqrks" Oct 09 14:06:15 crc kubenswrapper[4902]: I1009 14:06:15.738066 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-659b68cf89-l9cwg" Oct 09 14:06:15 crc kubenswrapper[4902]: I1009 14:06:15.749234 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8b5c85b87-pfr9v" Oct 09 14:06:15 crc kubenswrapper[4902]: I1009 14:06:15.769320 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-64rsx"] Oct 09 14:06:15 crc kubenswrapper[4902]: W1009 14:06:15.804284 4902 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd872d3cb_00ca_449c_98e2_69eb864ee9cb.slice/crio-f3593f7e59a1b72e244971490c3f0bc9b8c2fbc1dcafc0ef6069966b92578b68 WatchSource:0}: Error finding container f3593f7e59a1b72e244971490c3f0bc9b8c2fbc1dcafc0ef6069966b92578b68: Status 404 returned error can't find the container with id f3593f7e59a1b72e244971490c3f0bc9b8c2fbc1dcafc0ef6069966b92578b68 Oct 09 14:06:15 crc kubenswrapper[4902]: I1009 14:06:15.907471 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6f8f55c9df-4zdc6"] Oct 09 14:06:15 crc kubenswrapper[4902]: I1009 14:06:15.987118 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 09 14:06:16 crc kubenswrapper[4902]: I1009 14:06:16.044492 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 09 14:06:16 crc kubenswrapper[4902]: I1009 14:06:16.047602 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 09 14:06:16 crc kubenswrapper[4902]: I1009 14:06:16.049261 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6f8f55c9df-4zdc6" event={"ID":"cf5b0e71-d4ef-4d63-b6ee-15ebdcb82812","Type":"ContainerStarted","Data":"71c782eac08995ce249f78bbd12627a2151f96311460f3c0f3cabb572478715b"} Oct 09 14:06:16 crc kubenswrapper[4902]: I1009 14:06:16.050932 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Oct 09 14:06:16 crc kubenswrapper[4902]: I1009 14:06:16.054492 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55fff446b9-xhcmw" event={"ID":"fb082784-fdb2-4a03-b069-0ab81e675535","Type":"ContainerStarted","Data":"e3574fd1384e459b05292682a1c78d51cdd27795693aae70d23bacf13f3ebd3d"} Oct 09 14:06:16 crc kubenswrapper[4902]: I1009 14:06:16.057511 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-64rsx" event={"ID":"d872d3cb-00ca-449c-98e2-69eb864ee9cb","Type":"ContainerStarted","Data":"f3593f7e59a1b72e244971490c3f0bc9b8c2fbc1dcafc0ef6069966b92578b68"} Oct 09 14:06:16 crc kubenswrapper[4902]: I1009 14:06:16.068918 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 09 14:06:16 crc kubenswrapper[4902]: I1009 14:06:16.082045 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c5cc7c5ff-2bwjh"] Oct 09 14:06:16 crc kubenswrapper[4902]: I1009 14:06:16.157343 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/605240c8-9c62-4813-9954-57e1e5b2c742-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"605240c8-9c62-4813-9954-57e1e5b2c742\") " pod="openstack/glance-default-internal-api-0" Oct 09 14:06:16 crc kubenswrapper[4902]: I1009 14:06:16.157402 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7qdvn\" (UniqueName: 
\"kubernetes.io/projected/605240c8-9c62-4813-9954-57e1e5b2c742-kube-api-access-7qdvn\") pod \"glance-default-internal-api-0\" (UID: \"605240c8-9c62-4813-9954-57e1e5b2c742\") " pod="openstack/glance-default-internal-api-0" Oct 09 14:06:16 crc kubenswrapper[4902]: I1009 14:06:16.157486 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/605240c8-9c62-4813-9954-57e1e5b2c742-scripts\") pod \"glance-default-internal-api-0\" (UID: \"605240c8-9c62-4813-9954-57e1e5b2c742\") " pod="openstack/glance-default-internal-api-0" Oct 09 14:06:16 crc kubenswrapper[4902]: I1009 14:06:16.157609 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/605240c8-9c62-4813-9954-57e1e5b2c742-logs\") pod \"glance-default-internal-api-0\" (UID: \"605240c8-9c62-4813-9954-57e1e5b2c742\") " pod="openstack/glance-default-internal-api-0" Oct 09 14:06:16 crc kubenswrapper[4902]: I1009 14:06:16.157671 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/605240c8-9c62-4813-9954-57e1e5b2c742-config-data\") pod \"glance-default-internal-api-0\" (UID: \"605240c8-9c62-4813-9954-57e1e5b2c742\") " pod="openstack/glance-default-internal-api-0" Oct 09 14:06:16 crc kubenswrapper[4902]: I1009 14:06:16.157741 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"605240c8-9c62-4813-9954-57e1e5b2c742\") " pod="openstack/glance-default-internal-api-0" Oct 09 14:06:16 crc kubenswrapper[4902]: I1009 14:06:16.157786 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/605240c8-9c62-4813-9954-57e1e5b2c742-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"605240c8-9c62-4813-9954-57e1e5b2c742\") " pod="openstack/glance-default-internal-api-0" Oct 09 14:06:16 crc kubenswrapper[4902]: I1009 14:06:16.259530 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/605240c8-9c62-4813-9954-57e1e5b2c742-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"605240c8-9c62-4813-9954-57e1e5b2c742\") " pod="openstack/glance-default-internal-api-0" Oct 09 14:06:16 crc kubenswrapper[4902]: I1009 14:06:16.259883 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7qdvn\" (UniqueName: \"kubernetes.io/projected/605240c8-9c62-4813-9954-57e1e5b2c742-kube-api-access-7qdvn\") pod \"glance-default-internal-api-0\" (UID: \"605240c8-9c62-4813-9954-57e1e5b2c742\") " pod="openstack/glance-default-internal-api-0" Oct 09 14:06:16 crc kubenswrapper[4902]: I1009 14:06:16.259945 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/605240c8-9c62-4813-9954-57e1e5b2c742-scripts\") pod \"glance-default-internal-api-0\" (UID: \"605240c8-9c62-4813-9954-57e1e5b2c742\") " pod="openstack/glance-default-internal-api-0" Oct 09 14:06:16 crc kubenswrapper[4902]: I1009 14:06:16.260039 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/605240c8-9c62-4813-9954-57e1e5b2c742-logs\") pod \"glance-default-internal-api-0\" (UID: \"605240c8-9c62-4813-9954-57e1e5b2c742\") " pod="openstack/glance-default-internal-api-0" Oct 09 14:06:16 crc kubenswrapper[4902]: I1009 14:06:16.260069 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/605240c8-9c62-4813-9954-57e1e5b2c742-config-data\") pod \"glance-default-internal-api-0\" (UID: \"605240c8-9c62-4813-9954-57e1e5b2c742\") " pod="openstack/glance-default-internal-api-0" Oct 09 14:06:16 crc kubenswrapper[4902]: I1009 14:06:16.260133 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"605240c8-9c62-4813-9954-57e1e5b2c742\") " pod="openstack/glance-default-internal-api-0" Oct 09 14:06:16 crc kubenswrapper[4902]: I1009 14:06:16.260178 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/605240c8-9c62-4813-9954-57e1e5b2c742-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"605240c8-9c62-4813-9954-57e1e5b2c742\") " pod="openstack/glance-default-internal-api-0" Oct 09 14:06:16 crc kubenswrapper[4902]: I1009 14:06:16.262354 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/605240c8-9c62-4813-9954-57e1e5b2c742-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"605240c8-9c62-4813-9954-57e1e5b2c742\") " pod="openstack/glance-default-internal-api-0" Oct 09 14:06:16 crc kubenswrapper[4902]: I1009 14:06:16.262466 4902 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"605240c8-9c62-4813-9954-57e1e5b2c742\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/glance-default-internal-api-0" Oct 09 14:06:16 crc kubenswrapper[4902]: I1009 14:06:16.262788 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/605240c8-9c62-4813-9954-57e1e5b2c742-logs\") pod \"glance-default-internal-api-0\" (UID: \"605240c8-9c62-4813-9954-57e1e5b2c742\") " pod="openstack/glance-default-internal-api-0" Oct 09 14:06:16 crc kubenswrapper[4902]: I1009 14:06:16.267571 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/605240c8-9c62-4813-9954-57e1e5b2c742-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"605240c8-9c62-4813-9954-57e1e5b2c742\") " pod="openstack/glance-default-internal-api-0" Oct 09 14:06:16 crc kubenswrapper[4902]: I1009 14:06:16.272676 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/605240c8-9c62-4813-9954-57e1e5b2c742-scripts\") pod \"glance-default-internal-api-0\" (UID: \"605240c8-9c62-4813-9954-57e1e5b2c742\") " pod="openstack/glance-default-internal-api-0" Oct 09 14:06:16 crc kubenswrapper[4902]: I1009 14:06:16.276342 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/605240c8-9c62-4813-9954-57e1e5b2c742-config-data\") pod \"glance-default-internal-api-0\" (UID: 
\"605240c8-9c62-4813-9954-57e1e5b2c742\") " pod="openstack/glance-default-internal-api-0" Oct 09 14:06:16 crc kubenswrapper[4902]: I1009 14:06:16.310505 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7qdvn\" (UniqueName: \"kubernetes.io/projected/605240c8-9c62-4813-9954-57e1e5b2c742-kube-api-access-7qdvn\") pod \"glance-default-internal-api-0\" (UID: \"605240c8-9c62-4813-9954-57e1e5b2c742\") " pod="openstack/glance-default-internal-api-0" Oct 09 14:06:16 crc kubenswrapper[4902]: I1009 14:06:16.324838 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"605240c8-9c62-4813-9954-57e1e5b2c742\") " pod="openstack/glance-default-internal-api-0" Oct 09 14:06:16 crc kubenswrapper[4902]: I1009 14:06:16.378062 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 09 14:06:16 crc kubenswrapper[4902]: W1009 14:06:16.379184 4902 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podee7c6723_ebca_46a0_b74f_ad7322603a8c.slice/crio-c4ef5b1ad614fd1c73eba24087444b87bdc73437b349bcca30597074a2d5f144 WatchSource:0}: Error finding container c4ef5b1ad614fd1c73eba24087444b87bdc73437b349bcca30597074a2d5f144: Status 404 returned error can't find the container with id c4ef5b1ad614fd1c73eba24087444b87bdc73437b349bcca30597074a2d5f144 Oct 09 14:06:16 crc kubenswrapper[4902]: I1009 14:06:16.390815 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 09 14:06:16 crc kubenswrapper[4902]: I1009 14:06:16.402606 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-jqrks"] Oct 09 14:06:16 crc kubenswrapper[4902]: I1009 14:06:16.437197 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-659b68cf89-l9cwg"] Oct 09 14:06:16 crc kubenswrapper[4902]: W1009 14:06:16.479150 4902 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2bfb704d_cf57_4181_b6d1_c5884492984a.slice/crio-5cc832bee442dad9f03a0d0214aa5ea3e38a37a3ee8ffbe87b5358f68ea95e7b WatchSource:0}: Error finding container 5cc832bee442dad9f03a0d0214aa5ea3e38a37a3ee8ffbe87b5358f68ea95e7b: Status 404 returned error can't find the container with id 5cc832bee442dad9f03a0d0214aa5ea3e38a37a3ee8ffbe87b5358f68ea95e7b Oct 09 14:06:16 crc kubenswrapper[4902]: I1009 14:06:16.573551 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8b5c85b87-pfr9v"] Oct 09 14:06:16 crc kubenswrapper[4902]: I1009 14:06:16.721024 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-fce1-account-create-8hwdj"] Oct 09 14:06:16 crc kubenswrapper[4902]: I1009 14:06:16.722490 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-fce1-account-create-8hwdj" Oct 09 14:06:16 crc kubenswrapper[4902]: I1009 14:06:16.726269 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Oct 09 14:06:16 crc kubenswrapper[4902]: I1009 14:06:16.728676 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-fce1-account-create-8hwdj"] Oct 09 14:06:16 crc kubenswrapper[4902]: I1009 14:06:16.777561 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bhzxs\" (UniqueName: \"kubernetes.io/projected/b125855d-ef08-4d86-b4ca-a89b06964590-kube-api-access-bhzxs\") pod \"cinder-fce1-account-create-8hwdj\" (UID: \"b125855d-ef08-4d86-b4ca-a89b06964590\") " pod="openstack/cinder-fce1-account-create-8hwdj" Oct 09 14:06:16 crc kubenswrapper[4902]: I1009 14:06:16.824464 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-97aa-account-create-tzndb"] Oct 09 14:06:16 crc kubenswrapper[4902]: I1009 14:06:16.826198 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-97aa-account-create-tzndb" Oct 09 14:06:16 crc kubenswrapper[4902]: I1009 14:06:16.828584 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Oct 09 14:06:16 crc kubenswrapper[4902]: I1009 14:06:16.831826 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-97aa-account-create-tzndb"] Oct 09 14:06:16 crc kubenswrapper[4902]: I1009 14:06:16.879955 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bhzxs\" (UniqueName: \"kubernetes.io/projected/b125855d-ef08-4d86-b4ca-a89b06964590-kube-api-access-bhzxs\") pod \"cinder-fce1-account-create-8hwdj\" (UID: \"b125855d-ef08-4d86-b4ca-a89b06964590\") " pod="openstack/cinder-fce1-account-create-8hwdj" Oct 09 14:06:16 crc kubenswrapper[4902]: I1009 14:06:16.880038 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2tf2b\" (UniqueName: \"kubernetes.io/projected/cf729a68-4f96-4839-8c65-8be1543d04da-kube-api-access-2tf2b\") pod \"barbican-97aa-account-create-tzndb\" (UID: \"cf729a68-4f96-4839-8c65-8be1543d04da\") " pod="openstack/barbican-97aa-account-create-tzndb" Oct 09 14:06:16 crc kubenswrapper[4902]: I1009 14:06:16.900648 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bhzxs\" (UniqueName: \"kubernetes.io/projected/b125855d-ef08-4d86-b4ca-a89b06964590-kube-api-access-bhzxs\") pod \"cinder-fce1-account-create-8hwdj\" (UID: \"b125855d-ef08-4d86-b4ca-a89b06964590\") " pod="openstack/cinder-fce1-account-create-8hwdj" Oct 09 14:06:16 crc kubenswrapper[4902]: I1009 14:06:16.981729 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2tf2b\" (UniqueName: \"kubernetes.io/projected/cf729a68-4f96-4839-8c65-8be1543d04da-kube-api-access-2tf2b\") pod \"barbican-97aa-account-create-tzndb\" (UID: \"cf729a68-4f96-4839-8c65-8be1543d04da\") " pod="openstack/barbican-97aa-account-create-tzndb" Oct 09 14:06:18 crc kubenswrapper[4902]: I1009 14:06:17.004201 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2tf2b\" (UniqueName: \"kubernetes.io/projected/cf729a68-4f96-4839-8c65-8be1543d04da-kube-api-access-2tf2b\") pod \"barbican-97aa-account-create-tzndb\" (UID: \"cf729a68-4f96-4839-8c65-8be1543d04da\") " 
pod="openstack/barbican-97aa-account-create-tzndb" Oct 09 14:06:18 crc kubenswrapper[4902]: I1009 14:06:17.053957 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-fce1-account-create-8hwdj" Oct 09 14:06:18 crc kubenswrapper[4902]: I1009 14:06:17.099193 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-jqrks" event={"ID":"d3545e73-7c0f-4e1a-b012-da9e7f35a0b5","Type":"ContainerStarted","Data":"601aea65acd2c438cace694800601499864f32344289002a385a8e94950e0189"} Oct 09 14:06:18 crc kubenswrapper[4902]: I1009 14:06:17.103861 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 09 14:06:18 crc kubenswrapper[4902]: I1009 14:06:17.115572 4902 generic.go:334] "Generic (PLEG): container finished" podID="1f63d2cd-4740-45c6-a94b-26899b7ffa86" containerID="6d8db348cc6f0ce0b975e23dd8277864a2c4228157c8902f9bec4ac0016fbc26" exitCode=0 Oct 09 14:06:18 crc kubenswrapper[4902]: I1009 14:06:17.122148 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8b5c85b87-pfr9v" event={"ID":"1f63d2cd-4740-45c6-a94b-26899b7ffa86","Type":"ContainerDied","Data":"6d8db348cc6f0ce0b975e23dd8277864a2c4228157c8902f9bec4ac0016fbc26"} Oct 09 14:06:18 crc kubenswrapper[4902]: I1009 14:06:17.122197 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8b5c85b87-pfr9v" event={"ID":"1f63d2cd-4740-45c6-a94b-26899b7ffa86","Type":"ContainerStarted","Data":"23c15b61828b97429fbc5a0aa062d802bd857a6dd73e7013d34f8cd558b67ca8"} Oct 09 14:06:18 crc kubenswrapper[4902]: I1009 14:06:17.124385 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-64rsx" event={"ID":"d872d3cb-00ca-449c-98e2-69eb864ee9cb","Type":"ContainerStarted","Data":"f2275601f7eb23d0a2ed3839e6ea247d201a99601b631a0bde47632a7411154b"} Oct 09 14:06:18 crc kubenswrapper[4902]: I1009 14:06:17.131441 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8fe438bd-0b47-4495-93a8-590e4019a7c6","Type":"ContainerStarted","Data":"81f9d18967778391a4fc19ac465b4203e280e125d6241fd127507b3c2bcf2775"} Oct 09 14:06:18 crc kubenswrapper[4902]: I1009 14:06:17.139378 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-659b68cf89-l9cwg"] Oct 09 14:06:18 crc kubenswrapper[4902]: I1009 14:06:17.151494 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-4963-account-create-z5jmr"] Oct 09 14:06:18 crc kubenswrapper[4902]: I1009 14:06:17.153051 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-4963-account-create-z5jmr" Oct 09 14:06:18 crc kubenswrapper[4902]: I1009 14:06:17.157952 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 09 14:06:18 crc kubenswrapper[4902]: I1009 14:06:17.164858 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Oct 09 14:06:18 crc kubenswrapper[4902]: I1009 14:06:17.176099 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-97aa-account-create-tzndb" Oct 09 14:06:18 crc kubenswrapper[4902]: I1009 14:06:17.191998 4902 generic.go:334] "Generic (PLEG): container finished" podID="fb082784-fdb2-4a03-b069-0ab81e675535" containerID="fe8e5a64dcc70d3ebc4b19e75ed17f5456616356ce17d8c90d286c4d4234eb26" exitCode=0 Oct 09 14:06:18 crc kubenswrapper[4902]: I1009 14:06:17.192303 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55fff446b9-xhcmw" event={"ID":"fb082784-fdb2-4a03-b069-0ab81e675535","Type":"ContainerDied","Data":"fe8e5a64dcc70d3ebc4b19e75ed17f5456616356ce17d8c90d286c4d4234eb26"} Oct 09 14:06:18 crc kubenswrapper[4902]: I1009 14:06:17.217067 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ee7c6723-ebca-46a0-b74f-ad7322603a8c","Type":"ContainerStarted","Data":"c4ef5b1ad614fd1c73eba24087444b87bdc73437b349bcca30597074a2d5f144"} Oct 09 14:06:18 crc kubenswrapper[4902]: I1009 14:06:17.233083 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-4963-account-create-z5jmr"] Oct 09 14:06:18 crc kubenswrapper[4902]: I1009 14:06:17.240060 4902 generic.go:334] "Generic (PLEG): container finished" podID="8b3f279a-1c66-4a4b-a4a9-8ebbe329a658" containerID="e7ad799668bbb9be0fbfa009ace77410b043b357caad24fdac9ca5afa808ccb3" exitCode=0 Oct 09 14:06:18 crc kubenswrapper[4902]: I1009 14:06:17.240145 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c5cc7c5ff-2bwjh" event={"ID":"8b3f279a-1c66-4a4b-a4a9-8ebbe329a658","Type":"ContainerDied","Data":"e7ad799668bbb9be0fbfa009ace77410b043b357caad24fdac9ca5afa808ccb3"} Oct 09 14:06:18 crc kubenswrapper[4902]: I1009 14:06:17.240170 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c5cc7c5ff-2bwjh" event={"ID":"8b3f279a-1c66-4a4b-a4a9-8ebbe329a658","Type":"ContainerStarted","Data":"d2cb15c8145db3f9c3ffb8263a3e61b94677c9f303ba39bc9515e84e68ee3c20"} Oct 09 14:06:18 crc kubenswrapper[4902]: I1009 14:06:17.258944 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-6c8d64bf69-wprv5"] Oct 09 14:06:18 crc kubenswrapper[4902]: I1009 14:06:17.260211 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-6c8d64bf69-wprv5" Oct 09 14:06:18 crc kubenswrapper[4902]: I1009 14:06:17.271297 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 09 14:06:18 crc kubenswrapper[4902]: I1009 14:06:17.308020 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lckg8\" (UniqueName: \"kubernetes.io/projected/453bc6c7-4789-47fb-a83d-21062c6069dd-kube-api-access-lckg8\") pod \"neutron-4963-account-create-z5jmr\" (UID: \"453bc6c7-4789-47fb-a83d-21062c6069dd\") " pod="openstack/neutron-4963-account-create-z5jmr" Oct 09 14:06:18 crc kubenswrapper[4902]: I1009 14:06:17.327296 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-659b68cf89-l9cwg" event={"ID":"2bfb704d-cf57-4181-b6d1-c5884492984a","Type":"ContainerStarted","Data":"5cc832bee442dad9f03a0d0214aa5ea3e38a37a3ee8ffbe87b5358f68ea95e7b"} Oct 09 14:06:18 crc kubenswrapper[4902]: I1009 14:06:17.375688 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6c8d64bf69-wprv5"] Oct 09 14:06:18 crc kubenswrapper[4902]: I1009 14:06:17.383404 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-64rsx" podStartSLOduration=3.383378997 podStartE2EDuration="3.383378997s" podCreationTimestamp="2025-10-09 14:06:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 14:06:17.236630726 +0000 UTC m=+924.434489790" watchObservedRunningTime="2025-10-09 14:06:17.383378997 +0000 UTC m=+924.581238061" Oct 09 14:06:18 crc kubenswrapper[4902]: I1009 14:06:17.416504 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lckg8\" (UniqueName: \"kubernetes.io/projected/453bc6c7-4789-47fb-a83d-21062c6069dd-kube-api-access-lckg8\") pod \"neutron-4963-account-create-z5jmr\" (UID: \"453bc6c7-4789-47fb-a83d-21062c6069dd\") " pod="openstack/neutron-4963-account-create-z5jmr" Oct 09 14:06:18 crc kubenswrapper[4902]: I1009 14:06:17.416633 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m8j87\" (UniqueName: \"kubernetes.io/projected/de054665-327f-4e4f-b23c-d2f2ebc3bd04-kube-api-access-m8j87\") pod \"horizon-6c8d64bf69-wprv5\" (UID: \"de054665-327f-4e4f-b23c-d2f2ebc3bd04\") " pod="openstack/horizon-6c8d64bf69-wprv5" Oct 09 14:06:18 crc kubenswrapper[4902]: I1009 14:06:17.416688 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/de054665-327f-4e4f-b23c-d2f2ebc3bd04-logs\") pod \"horizon-6c8d64bf69-wprv5\" (UID: \"de054665-327f-4e4f-b23c-d2f2ebc3bd04\") " pod="openstack/horizon-6c8d64bf69-wprv5" Oct 09 14:06:18 crc kubenswrapper[4902]: I1009 14:06:17.416709 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/de054665-327f-4e4f-b23c-d2f2ebc3bd04-scripts\") pod \"horizon-6c8d64bf69-wprv5\" (UID: \"de054665-327f-4e4f-b23c-d2f2ebc3bd04\") " pod="openstack/horizon-6c8d64bf69-wprv5" Oct 09 14:06:18 crc kubenswrapper[4902]: I1009 14:06:17.416724 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/de054665-327f-4e4f-b23c-d2f2ebc3bd04-horizon-secret-key\") 
pod \"horizon-6c8d64bf69-wprv5\" (UID: \"de054665-327f-4e4f-b23c-d2f2ebc3bd04\") " pod="openstack/horizon-6c8d64bf69-wprv5" Oct 09 14:06:18 crc kubenswrapper[4902]: I1009 14:06:17.416757 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/de054665-327f-4e4f-b23c-d2f2ebc3bd04-config-data\") pod \"horizon-6c8d64bf69-wprv5\" (UID: \"de054665-327f-4e4f-b23c-d2f2ebc3bd04\") " pod="openstack/horizon-6c8d64bf69-wprv5" Oct 09 14:06:18 crc kubenswrapper[4902]: I1009 14:06:17.427466 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 09 14:06:18 crc kubenswrapper[4902]: I1009 14:06:17.464458 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lckg8\" (UniqueName: \"kubernetes.io/projected/453bc6c7-4789-47fb-a83d-21062c6069dd-kube-api-access-lckg8\") pod \"neutron-4963-account-create-z5jmr\" (UID: \"453bc6c7-4789-47fb-a83d-21062c6069dd\") " pod="openstack/neutron-4963-account-create-z5jmr" Oct 09 14:06:18 crc kubenswrapper[4902]: I1009 14:06:17.557858 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/de054665-327f-4e4f-b23c-d2f2ebc3bd04-logs\") pod \"horizon-6c8d64bf69-wprv5\" (UID: \"de054665-327f-4e4f-b23c-d2f2ebc3bd04\") " pod="openstack/horizon-6c8d64bf69-wprv5" Oct 09 14:06:18 crc kubenswrapper[4902]: I1009 14:06:17.557951 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/de054665-327f-4e4f-b23c-d2f2ebc3bd04-scripts\") pod \"horizon-6c8d64bf69-wprv5\" (UID: \"de054665-327f-4e4f-b23c-d2f2ebc3bd04\") " pod="openstack/horizon-6c8d64bf69-wprv5" Oct 09 14:06:18 crc kubenswrapper[4902]: I1009 14:06:17.557989 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/de054665-327f-4e4f-b23c-d2f2ebc3bd04-horizon-secret-key\") pod \"horizon-6c8d64bf69-wprv5\" (UID: \"de054665-327f-4e4f-b23c-d2f2ebc3bd04\") " pod="openstack/horizon-6c8d64bf69-wprv5" Oct 09 14:06:18 crc kubenswrapper[4902]: I1009 14:06:17.558069 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/de054665-327f-4e4f-b23c-d2f2ebc3bd04-config-data\") pod \"horizon-6c8d64bf69-wprv5\" (UID: \"de054665-327f-4e4f-b23c-d2f2ebc3bd04\") " pod="openstack/horizon-6c8d64bf69-wprv5" Oct 09 14:06:18 crc kubenswrapper[4902]: I1009 14:06:17.558280 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m8j87\" (UniqueName: \"kubernetes.io/projected/de054665-327f-4e4f-b23c-d2f2ebc3bd04-kube-api-access-m8j87\") pod \"horizon-6c8d64bf69-wprv5\" (UID: \"de054665-327f-4e4f-b23c-d2f2ebc3bd04\") " pod="openstack/horizon-6c8d64bf69-wprv5" Oct 09 14:06:18 crc kubenswrapper[4902]: I1009 14:06:17.570847 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/de054665-327f-4e4f-b23c-d2f2ebc3bd04-logs\") pod \"horizon-6c8d64bf69-wprv5\" (UID: \"de054665-327f-4e4f-b23c-d2f2ebc3bd04\") " pod="openstack/horizon-6c8d64bf69-wprv5" Oct 09 14:06:18 crc kubenswrapper[4902]: I1009 14:06:17.573822 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: 
\"kubernetes.io/secret/de054665-327f-4e4f-b23c-d2f2ebc3bd04-horizon-secret-key\") pod \"horizon-6c8d64bf69-wprv5\" (UID: \"de054665-327f-4e4f-b23c-d2f2ebc3bd04\") " pod="openstack/horizon-6c8d64bf69-wprv5" Oct 09 14:06:18 crc kubenswrapper[4902]: I1009 14:06:17.573933 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/de054665-327f-4e4f-b23c-d2f2ebc3bd04-scripts\") pod \"horizon-6c8d64bf69-wprv5\" (UID: \"de054665-327f-4e4f-b23c-d2f2ebc3bd04\") " pod="openstack/horizon-6c8d64bf69-wprv5" Oct 09 14:06:18 crc kubenswrapper[4902]: I1009 14:06:17.585695 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-4963-account-create-z5jmr" Oct 09 14:06:18 crc kubenswrapper[4902]: I1009 14:06:17.589973 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/de054665-327f-4e4f-b23c-d2f2ebc3bd04-config-data\") pod \"horizon-6c8d64bf69-wprv5\" (UID: \"de054665-327f-4e4f-b23c-d2f2ebc3bd04\") " pod="openstack/horizon-6c8d64bf69-wprv5" Oct 09 14:06:18 crc kubenswrapper[4902]: I1009 14:06:17.636046 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m8j87\" (UniqueName: \"kubernetes.io/projected/de054665-327f-4e4f-b23c-d2f2ebc3bd04-kube-api-access-m8j87\") pod \"horizon-6c8d64bf69-wprv5\" (UID: \"de054665-327f-4e4f-b23c-d2f2ebc3bd04\") " pod="openstack/horizon-6c8d64bf69-wprv5" Oct 09 14:06:18 crc kubenswrapper[4902]: I1009 14:06:17.657189 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6c8d64bf69-wprv5" Oct 09 14:06:18 crc kubenswrapper[4902]: I1009 14:06:18.361110 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"605240c8-9c62-4813-9954-57e1e5b2c742","Type":"ContainerStarted","Data":"7dfd23fe9907835f15fd309d55a0987174739b8c2893a28e10681489365e2dd4"} Oct 09 14:06:18 crc kubenswrapper[4902]: I1009 14:06:18.382106 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ee7c6723-ebca-46a0-b74f-ad7322603a8c","Type":"ContainerStarted","Data":"f2e4229d74660d52f6a02cea7cfcb304ad45b5289e0c34d5b1565b99bd542a60"} Oct 09 14:06:18 crc kubenswrapper[4902]: I1009 14:06:18.387915 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8b5c85b87-pfr9v" event={"ID":"1f63d2cd-4740-45c6-a94b-26899b7ffa86","Type":"ContainerStarted","Data":"22f4e750bfe871e2f676c151db24c9040a256ac1c93d8b9e68e0c7352637ee5c"} Oct 09 14:06:18 crc kubenswrapper[4902]: I1009 14:06:18.387970 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-8b5c85b87-pfr9v" Oct 09 14:06:18 crc kubenswrapper[4902]: I1009 14:06:18.417398 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-8b5c85b87-pfr9v" podStartSLOduration=4.417383217 podStartE2EDuration="4.417383217s" podCreationTimestamp="2025-10-09 14:06:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 14:06:18.415252702 +0000 UTC m=+925.613111776" watchObservedRunningTime="2025-10-09 14:06:18.417383217 +0000 UTC m=+925.615242281" Oct 09 14:06:18 crc kubenswrapper[4902]: I1009 14:06:18.770942 4902 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55fff446b9-xhcmw" Oct 09 14:06:18 crc kubenswrapper[4902]: I1009 14:06:18.806783 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fb082784-fdb2-4a03-b069-0ab81e675535-dns-swift-storage-0\") pod \"fb082784-fdb2-4a03-b069-0ab81e675535\" (UID: \"fb082784-fdb2-4a03-b069-0ab81e675535\") " Oct 09 14:06:18 crc kubenswrapper[4902]: I1009 14:06:18.807136 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fb082784-fdb2-4a03-b069-0ab81e675535-ovsdbserver-sb\") pod \"fb082784-fdb2-4a03-b069-0ab81e675535\" (UID: \"fb082784-fdb2-4a03-b069-0ab81e675535\") " Oct 09 14:06:18 crc kubenswrapper[4902]: I1009 14:06:18.807180 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fb082784-fdb2-4a03-b069-0ab81e675535-ovsdbserver-nb\") pod \"fb082784-fdb2-4a03-b069-0ab81e675535\" (UID: \"fb082784-fdb2-4a03-b069-0ab81e675535\") " Oct 09 14:06:18 crc kubenswrapper[4902]: I1009 14:06:18.807220 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tt26r\" (UniqueName: \"kubernetes.io/projected/fb082784-fdb2-4a03-b069-0ab81e675535-kube-api-access-tt26r\") pod \"fb082784-fdb2-4a03-b069-0ab81e675535\" (UID: \"fb082784-fdb2-4a03-b069-0ab81e675535\") " Oct 09 14:06:18 crc kubenswrapper[4902]: I1009 14:06:18.807616 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb082784-fdb2-4a03-b069-0ab81e675535-config\") pod \"fb082784-fdb2-4a03-b069-0ab81e675535\" (UID: \"fb082784-fdb2-4a03-b069-0ab81e675535\") " Oct 09 14:06:18 crc kubenswrapper[4902]: I1009 14:06:18.807671 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fb082784-fdb2-4a03-b069-0ab81e675535-dns-svc\") pod \"fb082784-fdb2-4a03-b069-0ab81e675535\" (UID: \"fb082784-fdb2-4a03-b069-0ab81e675535\") " Oct 09 14:06:18 crc kubenswrapper[4902]: I1009 14:06:18.833289 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb082784-fdb2-4a03-b069-0ab81e675535-kube-api-access-tt26r" (OuterVolumeSpecName: "kube-api-access-tt26r") pod "fb082784-fdb2-4a03-b069-0ab81e675535" (UID: "fb082784-fdb2-4a03-b069-0ab81e675535"). InnerVolumeSpecName "kube-api-access-tt26r". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 14:06:18 crc kubenswrapper[4902]: I1009 14:06:18.835966 4902 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c5cc7c5ff-2bwjh" Oct 09 14:06:18 crc kubenswrapper[4902]: I1009 14:06:18.838144 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fb082784-fdb2-4a03-b069-0ab81e675535-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "fb082784-fdb2-4a03-b069-0ab81e675535" (UID: "fb082784-fdb2-4a03-b069-0ab81e675535"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 14:06:18 crc kubenswrapper[4902]: I1009 14:06:18.846331 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fb082784-fdb2-4a03-b069-0ab81e675535-config" (OuterVolumeSpecName: "config") pod "fb082784-fdb2-4a03-b069-0ab81e675535" (UID: "fb082784-fdb2-4a03-b069-0ab81e675535"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 14:06:18 crc kubenswrapper[4902]: I1009 14:06:18.861739 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fb082784-fdb2-4a03-b069-0ab81e675535-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "fb082784-fdb2-4a03-b069-0ab81e675535" (UID: "fb082784-fdb2-4a03-b069-0ab81e675535"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 14:06:18 crc kubenswrapper[4902]: I1009 14:06:18.866681 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fb082784-fdb2-4a03-b069-0ab81e675535-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "fb082784-fdb2-4a03-b069-0ab81e675535" (UID: "fb082784-fdb2-4a03-b069-0ab81e675535"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 14:06:18 crc kubenswrapper[4902]: I1009 14:06:18.875128 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fb082784-fdb2-4a03-b069-0ab81e675535-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "fb082784-fdb2-4a03-b069-0ab81e675535" (UID: "fb082784-fdb2-4a03-b069-0ab81e675535"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 14:06:18 crc kubenswrapper[4902]: I1009 14:06:18.908977 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4v29q\" (UniqueName: \"kubernetes.io/projected/8b3f279a-1c66-4a4b-a4a9-8ebbe329a658-kube-api-access-4v29q\") pod \"8b3f279a-1c66-4a4b-a4a9-8ebbe329a658\" (UID: \"8b3f279a-1c66-4a4b-a4a9-8ebbe329a658\") " Oct 09 14:06:18 crc kubenswrapper[4902]: I1009 14:06:18.909038 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8b3f279a-1c66-4a4b-a4a9-8ebbe329a658-dns-swift-storage-0\") pod \"8b3f279a-1c66-4a4b-a4a9-8ebbe329a658\" (UID: \"8b3f279a-1c66-4a4b-a4a9-8ebbe329a658\") " Oct 09 14:06:18 crc kubenswrapper[4902]: I1009 14:06:18.909109 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b3f279a-1c66-4a4b-a4a9-8ebbe329a658-config\") pod \"8b3f279a-1c66-4a4b-a4a9-8ebbe329a658\" (UID: \"8b3f279a-1c66-4a4b-a4a9-8ebbe329a658\") " Oct 09 14:06:18 crc kubenswrapper[4902]: I1009 14:06:18.909235 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8b3f279a-1c66-4a4b-a4a9-8ebbe329a658-ovsdbserver-nb\") pod \"8b3f279a-1c66-4a4b-a4a9-8ebbe329a658\" (UID: \"8b3f279a-1c66-4a4b-a4a9-8ebbe329a658\") " Oct 09 14:06:18 crc kubenswrapper[4902]: I1009 14:06:18.909272 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8b3f279a-1c66-4a4b-a4a9-8ebbe329a658-ovsdbserver-sb\") pod \"8b3f279a-1c66-4a4b-a4a9-8ebbe329a658\" (UID: 
\"8b3f279a-1c66-4a4b-a4a9-8ebbe329a658\") " Oct 09 14:06:18 crc kubenswrapper[4902]: I1009 14:06:18.909319 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8b3f279a-1c66-4a4b-a4a9-8ebbe329a658-dns-svc\") pod \"8b3f279a-1c66-4a4b-a4a9-8ebbe329a658\" (UID: \"8b3f279a-1c66-4a4b-a4a9-8ebbe329a658\") " Oct 09 14:06:18 crc kubenswrapper[4902]: I1009 14:06:18.909803 4902 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fb082784-fdb2-4a03-b069-0ab81e675535-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 09 14:06:18 crc kubenswrapper[4902]: I1009 14:06:18.909828 4902 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fb082784-fdb2-4a03-b069-0ab81e675535-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 09 14:06:18 crc kubenswrapper[4902]: I1009 14:06:18.909840 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tt26r\" (UniqueName: \"kubernetes.io/projected/fb082784-fdb2-4a03-b069-0ab81e675535-kube-api-access-tt26r\") on node \"crc\" DevicePath \"\"" Oct 09 14:06:18 crc kubenswrapper[4902]: I1009 14:06:18.909854 4902 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb082784-fdb2-4a03-b069-0ab81e675535-config\") on node \"crc\" DevicePath \"\"" Oct 09 14:06:18 crc kubenswrapper[4902]: I1009 14:06:18.909868 4902 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fb082784-fdb2-4a03-b069-0ab81e675535-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 09 14:06:18 crc kubenswrapper[4902]: I1009 14:06:18.909879 4902 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fb082784-fdb2-4a03-b069-0ab81e675535-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 09 14:06:18 crc kubenswrapper[4902]: I1009 14:06:18.915187 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b3f279a-1c66-4a4b-a4a9-8ebbe329a658-kube-api-access-4v29q" (OuterVolumeSpecName: "kube-api-access-4v29q") pod "8b3f279a-1c66-4a4b-a4a9-8ebbe329a658" (UID: "8b3f279a-1c66-4a4b-a4a9-8ebbe329a658"). InnerVolumeSpecName "kube-api-access-4v29q". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 14:06:18 crc kubenswrapper[4902]: I1009 14:06:18.931730 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8b3f279a-1c66-4a4b-a4a9-8ebbe329a658-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "8b3f279a-1c66-4a4b-a4a9-8ebbe329a658" (UID: "8b3f279a-1c66-4a4b-a4a9-8ebbe329a658"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 14:06:18 crc kubenswrapper[4902]: I1009 14:06:18.938299 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8b3f279a-1c66-4a4b-a4a9-8ebbe329a658-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "8b3f279a-1c66-4a4b-a4a9-8ebbe329a658" (UID: "8b3f279a-1c66-4a4b-a4a9-8ebbe329a658"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 14:06:18 crc kubenswrapper[4902]: I1009 14:06:18.938521 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8b3f279a-1c66-4a4b-a4a9-8ebbe329a658-config" (OuterVolumeSpecName: "config") pod "8b3f279a-1c66-4a4b-a4a9-8ebbe329a658" (UID: "8b3f279a-1c66-4a4b-a4a9-8ebbe329a658"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 14:06:18 crc kubenswrapper[4902]: I1009 14:06:18.945576 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8b3f279a-1c66-4a4b-a4a9-8ebbe329a658-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "8b3f279a-1c66-4a4b-a4a9-8ebbe329a658" (UID: "8b3f279a-1c66-4a4b-a4a9-8ebbe329a658"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 14:06:18 crc kubenswrapper[4902]: I1009 14:06:18.947125 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8b3f279a-1c66-4a4b-a4a9-8ebbe329a658-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "8b3f279a-1c66-4a4b-a4a9-8ebbe329a658" (UID: "8b3f279a-1c66-4a4b-a4a9-8ebbe329a658"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 14:06:19 crc kubenswrapper[4902]: I1009 14:06:19.014444 4902 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8b3f279a-1c66-4a4b-a4a9-8ebbe329a658-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 09 14:06:19 crc kubenswrapper[4902]: I1009 14:06:19.014501 4902 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8b3f279a-1c66-4a4b-a4a9-8ebbe329a658-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 09 14:06:19 crc kubenswrapper[4902]: I1009 14:06:19.014515 4902 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8b3f279a-1c66-4a4b-a4a9-8ebbe329a658-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 09 14:06:19 crc kubenswrapper[4902]: I1009 14:06:19.014528 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4v29q\" (UniqueName: \"kubernetes.io/projected/8b3f279a-1c66-4a4b-a4a9-8ebbe329a658-kube-api-access-4v29q\") on node \"crc\" DevicePath \"\"" Oct 09 14:06:19 crc kubenswrapper[4902]: I1009 14:06:19.014544 4902 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8b3f279a-1c66-4a4b-a4a9-8ebbe329a658-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 09 14:06:19 crc kubenswrapper[4902]: I1009 14:06:19.014555 4902 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b3f279a-1c66-4a4b-a4a9-8ebbe329a658-config\") on node \"crc\" DevicePath \"\"" Oct 09 14:06:19 crc kubenswrapper[4902]: I1009 14:06:19.021700 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-fce1-account-create-8hwdj"] Oct 09 14:06:19 crc kubenswrapper[4902]: I1009 14:06:19.046186 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6c8d64bf69-wprv5"] Oct 09 14:06:19 crc kubenswrapper[4902]: I1009 14:06:19.056829 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-97aa-account-create-tzndb"] Oct 09 14:06:19 crc kubenswrapper[4902]: W1009 14:06:19.066823 4902 manager.go:1169] Failed to 
process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcf729a68_4f96_4839_8c65_8be1543d04da.slice/crio-c8444a6eb6a349b148d3661001d75858e47b487ff30a8e3bac3333c1c2e8ae94 WatchSource:0}: Error finding container c8444a6eb6a349b148d3661001d75858e47b487ff30a8e3bac3333c1c2e8ae94: Status 404 returned error can't find the container with id c8444a6eb6a349b148d3661001d75858e47b487ff30a8e3bac3333c1c2e8ae94 Oct 09 14:06:19 crc kubenswrapper[4902]: I1009 14:06:19.067206 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-4963-account-create-z5jmr"] Oct 09 14:06:19 crc kubenswrapper[4902]: I1009 14:06:19.421571 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"605240c8-9c62-4813-9954-57e1e5b2c742","Type":"ContainerStarted","Data":"9c405260b4dd192a2567f78ac5d8ba0557f3df5b6284600546dd13bf12f31707"} Oct 09 14:06:19 crc kubenswrapper[4902]: I1009 14:06:19.439955 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ee7c6723-ebca-46a0-b74f-ad7322603a8c","Type":"ContainerStarted","Data":"a4574e049717c1dc72b040d8500aa99fb994a72047e275fd1e91b84afaf903f3"} Oct 09 14:06:19 crc kubenswrapper[4902]: I1009 14:06:19.440049 4902 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="ee7c6723-ebca-46a0-b74f-ad7322603a8c" containerName="glance-log" containerID="cri-o://f2e4229d74660d52f6a02cea7cfcb304ad45b5289e0c34d5b1565b99bd542a60" gracePeriod=30 Oct 09 14:06:19 crc kubenswrapper[4902]: I1009 14:06:19.440172 4902 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="ee7c6723-ebca-46a0-b74f-ad7322603a8c" containerName="glance-httpd" containerID="cri-o://a4574e049717c1dc72b040d8500aa99fb994a72047e275fd1e91b84afaf903f3" gracePeriod=30 Oct 09 14:06:19 crc kubenswrapper[4902]: I1009 14:06:19.444424 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c5cc7c5ff-2bwjh" event={"ID":"8b3f279a-1c66-4a4b-a4a9-8ebbe329a658","Type":"ContainerDied","Data":"d2cb15c8145db3f9c3ffb8263a3e61b94677c9f303ba39bc9515e84e68ee3c20"} Oct 09 14:06:19 crc kubenswrapper[4902]: I1009 14:06:19.444478 4902 scope.go:117] "RemoveContainer" containerID="e7ad799668bbb9be0fbfa009ace77410b043b357caad24fdac9ca5afa808ccb3" Oct 09 14:06:19 crc kubenswrapper[4902]: I1009 14:06:19.444616 4902 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c5cc7c5ff-2bwjh" Oct 09 14:06:19 crc kubenswrapper[4902]: I1009 14:06:19.461758 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=5.461738821 podStartE2EDuration="5.461738821s" podCreationTimestamp="2025-10-09 14:06:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 14:06:19.461316238 +0000 UTC m=+926.659175302" watchObservedRunningTime="2025-10-09 14:06:19.461738821 +0000 UTC m=+926.659597895" Oct 09 14:06:19 crc kubenswrapper[4902]: I1009 14:06:19.467902 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-fce1-account-create-8hwdj" event={"ID":"b125855d-ef08-4d86-b4ca-a89b06964590","Type":"ContainerStarted","Data":"030348d56284994032db5fee8c54ac4375db733e175359e550e9e6d954b94427"} Oct 09 14:06:19 crc kubenswrapper[4902]: I1009 14:06:19.472329 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6c8d64bf69-wprv5" event={"ID":"de054665-327f-4e4f-b23c-d2f2ebc3bd04","Type":"ContainerStarted","Data":"fe722b5fc3df156fdfc35c34c1bc441eb5672590cba5abe482f0898bd6a682de"} Oct 09 14:06:19 crc kubenswrapper[4902]: I1009 14:06:19.480655 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-4963-account-create-z5jmr" event={"ID":"453bc6c7-4789-47fb-a83d-21062c6069dd","Type":"ContainerStarted","Data":"b37de7d03070bb177ec93d05b5fc8e1b79a02e727934be7862ea3b3da2b13ce9"} Oct 09 14:06:19 crc kubenswrapper[4902]: I1009 14:06:19.482421 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-97aa-account-create-tzndb" event={"ID":"cf729a68-4f96-4839-8c65-8be1543d04da","Type":"ContainerStarted","Data":"c8444a6eb6a349b148d3661001d75858e47b487ff30a8e3bac3333c1c2e8ae94"} Oct 09 14:06:19 crc kubenswrapper[4902]: I1009 14:06:19.491849 4902 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55fff446b9-xhcmw" Oct 09 14:06:19 crc kubenswrapper[4902]: I1009 14:06:19.497354 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55fff446b9-xhcmw" event={"ID":"fb082784-fdb2-4a03-b069-0ab81e675535","Type":"ContainerDied","Data":"e3574fd1384e459b05292682a1c78d51cdd27795693aae70d23bacf13f3ebd3d"} Oct 09 14:06:19 crc kubenswrapper[4902]: I1009 14:06:19.526470 4902 scope.go:117] "RemoveContainer" containerID="fe8e5a64dcc70d3ebc4b19e75ed17f5456616356ce17d8c90d286c4d4234eb26" Oct 09 14:06:19 crc kubenswrapper[4902]: I1009 14:06:19.564812 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c5cc7c5ff-2bwjh"] Oct 09 14:06:19 crc kubenswrapper[4902]: I1009 14:06:19.564985 4902 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c5cc7c5ff-2bwjh"] Oct 09 14:06:19 crc kubenswrapper[4902]: I1009 14:06:19.700749 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55fff446b9-xhcmw"] Oct 09 14:06:19 crc kubenswrapper[4902]: I1009 14:06:19.714949 4902 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-55fff446b9-xhcmw"] Oct 09 14:06:20 crc kubenswrapper[4902]: I1009 14:06:20.078221 4902 patch_prober.go:28] interesting pod/machine-config-daemon-gbt7s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 14:06:20 crc kubenswrapper[4902]: I1009 14:06:20.078712 4902 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" podUID="6cfbac91-e798-4e5e-9f3c-f454ea6f457e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 14:06:20 crc kubenswrapper[4902]: I1009 14:06:20.082529 4902 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 09 14:06:20 crc kubenswrapper[4902]: I1009 14:06:20.246775 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee7c6723-ebca-46a0-b74f-ad7322603a8c-config-data\") pod \"ee7c6723-ebca-46a0-b74f-ad7322603a8c\" (UID: \"ee7c6723-ebca-46a0-b74f-ad7322603a8c\") " Oct 09 14:06:20 crc kubenswrapper[4902]: I1009 14:06:20.246858 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ee7c6723-ebca-46a0-b74f-ad7322603a8c-httpd-run\") pod \"ee7c6723-ebca-46a0-b74f-ad7322603a8c\" (UID: \"ee7c6723-ebca-46a0-b74f-ad7322603a8c\") " Oct 09 14:06:20 crc kubenswrapper[4902]: I1009 14:06:20.246922 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ee7c6723-ebca-46a0-b74f-ad7322603a8c-logs\") pod \"ee7c6723-ebca-46a0-b74f-ad7322603a8c\" (UID: \"ee7c6723-ebca-46a0-b74f-ad7322603a8c\") " Oct 09 14:06:20 crc kubenswrapper[4902]: I1009 14:06:20.247034 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9l6pp\" (UniqueName: \"kubernetes.io/projected/ee7c6723-ebca-46a0-b74f-ad7322603a8c-kube-api-access-9l6pp\") pod \"ee7c6723-ebca-46a0-b74f-ad7322603a8c\" (UID: \"ee7c6723-ebca-46a0-b74f-ad7322603a8c\") " Oct 09 14:06:20 crc kubenswrapper[4902]: I1009 14:06:20.247077 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ee7c6723-ebca-46a0-b74f-ad7322603a8c-scripts\") pod \"ee7c6723-ebca-46a0-b74f-ad7322603a8c\" (UID: \"ee7c6723-ebca-46a0-b74f-ad7322603a8c\") " Oct 09 14:06:20 crc kubenswrapper[4902]: I1009 14:06:20.247100 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee7c6723-ebca-46a0-b74f-ad7322603a8c-combined-ca-bundle\") pod \"ee7c6723-ebca-46a0-b74f-ad7322603a8c\" (UID: \"ee7c6723-ebca-46a0-b74f-ad7322603a8c\") " Oct 09 14:06:20 crc kubenswrapper[4902]: I1009 14:06:20.247150 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ee7c6723-ebca-46a0-b74f-ad7322603a8c\" (UID: \"ee7c6723-ebca-46a0-b74f-ad7322603a8c\") " Oct 09 14:06:20 crc kubenswrapper[4902]: I1009 14:06:20.247776 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ee7c6723-ebca-46a0-b74f-ad7322603a8c-logs" (OuterVolumeSpecName: "logs") pod "ee7c6723-ebca-46a0-b74f-ad7322603a8c" (UID: "ee7c6723-ebca-46a0-b74f-ad7322603a8c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 14:06:20 crc kubenswrapper[4902]: I1009 14:06:20.247812 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ee7c6723-ebca-46a0-b74f-ad7322603a8c-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "ee7c6723-ebca-46a0-b74f-ad7322603a8c" (UID: "ee7c6723-ebca-46a0-b74f-ad7322603a8c"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 14:06:20 crc kubenswrapper[4902]: I1009 14:06:20.268677 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "glance") pod "ee7c6723-ebca-46a0-b74f-ad7322603a8c" (UID: "ee7c6723-ebca-46a0-b74f-ad7322603a8c"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 09 14:06:20 crc kubenswrapper[4902]: I1009 14:06:20.268788 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee7c6723-ebca-46a0-b74f-ad7322603a8c-kube-api-access-9l6pp" (OuterVolumeSpecName: "kube-api-access-9l6pp") pod "ee7c6723-ebca-46a0-b74f-ad7322603a8c" (UID: "ee7c6723-ebca-46a0-b74f-ad7322603a8c"). InnerVolumeSpecName "kube-api-access-9l6pp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 14:06:20 crc kubenswrapper[4902]: I1009 14:06:20.271699 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee7c6723-ebca-46a0-b74f-ad7322603a8c-scripts" (OuterVolumeSpecName: "scripts") pod "ee7c6723-ebca-46a0-b74f-ad7322603a8c" (UID: "ee7c6723-ebca-46a0-b74f-ad7322603a8c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:06:20 crc kubenswrapper[4902]: I1009 14:06:20.301565 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee7c6723-ebca-46a0-b74f-ad7322603a8c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ee7c6723-ebca-46a0-b74f-ad7322603a8c" (UID: "ee7c6723-ebca-46a0-b74f-ad7322603a8c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:06:20 crc kubenswrapper[4902]: I1009 14:06:20.326844 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee7c6723-ebca-46a0-b74f-ad7322603a8c-config-data" (OuterVolumeSpecName: "config-data") pod "ee7c6723-ebca-46a0-b74f-ad7322603a8c" (UID: "ee7c6723-ebca-46a0-b74f-ad7322603a8c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:06:20 crc kubenswrapper[4902]: I1009 14:06:20.349558 4902 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ee7c6723-ebca-46a0-b74f-ad7322603a8c-logs\") on node \"crc\" DevicePath \"\"" Oct 09 14:06:20 crc kubenswrapper[4902]: I1009 14:06:20.349603 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9l6pp\" (UniqueName: \"kubernetes.io/projected/ee7c6723-ebca-46a0-b74f-ad7322603a8c-kube-api-access-9l6pp\") on node \"crc\" DevicePath \"\"" Oct 09 14:06:20 crc kubenswrapper[4902]: I1009 14:06:20.349619 4902 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ee7c6723-ebca-46a0-b74f-ad7322603a8c-scripts\") on node \"crc\" DevicePath \"\"" Oct 09 14:06:20 crc kubenswrapper[4902]: I1009 14:06:20.349629 4902 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee7c6723-ebca-46a0-b74f-ad7322603a8c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 14:06:20 crc kubenswrapper[4902]: I1009 14:06:20.349675 4902 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Oct 09 14:06:20 crc kubenswrapper[4902]: I1009 14:06:20.349689 4902 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee7c6723-ebca-46a0-b74f-ad7322603a8c-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 14:06:20 crc kubenswrapper[4902]: I1009 14:06:20.349700 4902 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ee7c6723-ebca-46a0-b74f-ad7322603a8c-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 09 14:06:20 crc kubenswrapper[4902]: I1009 14:06:20.372909 4902 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Oct 09 14:06:20 crc kubenswrapper[4902]: I1009 14:06:20.452240 4902 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Oct 09 14:06:20 crc kubenswrapper[4902]: I1009 14:06:20.518352 4902 generic.go:334] "Generic (PLEG): container finished" podID="d872d3cb-00ca-449c-98e2-69eb864ee9cb" containerID="f2275601f7eb23d0a2ed3839e6ea247d201a99601b631a0bde47632a7411154b" exitCode=0 Oct 09 14:06:20 crc kubenswrapper[4902]: I1009 14:06:20.518475 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-64rsx" event={"ID":"d872d3cb-00ca-449c-98e2-69eb864ee9cb","Type":"ContainerDied","Data":"f2275601f7eb23d0a2ed3839e6ea247d201a99601b631a0bde47632a7411154b"} Oct 09 14:06:20 crc kubenswrapper[4902]: I1009 14:06:20.524915 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"605240c8-9c62-4813-9954-57e1e5b2c742","Type":"ContainerStarted","Data":"a6f8b8fcf57192356ceb344ff307f2ffb9f0284c957cf29a6e1a23f3d5243907"} Oct 09 14:06:20 crc kubenswrapper[4902]: I1009 14:06:20.525049 4902 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="605240c8-9c62-4813-9954-57e1e5b2c742" containerName="glance-log" 
containerID="cri-o://9c405260b4dd192a2567f78ac5d8ba0557f3df5b6284600546dd13bf12f31707" gracePeriod=30 Oct 09 14:06:20 crc kubenswrapper[4902]: I1009 14:06:20.525161 4902 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="605240c8-9c62-4813-9954-57e1e5b2c742" containerName="glance-httpd" containerID="cri-o://a6f8b8fcf57192356ceb344ff307f2ffb9f0284c957cf29a6e1a23f3d5243907" gracePeriod=30 Oct 09 14:06:20 crc kubenswrapper[4902]: I1009 14:06:20.528111 4902 generic.go:334] "Generic (PLEG): container finished" podID="ee7c6723-ebca-46a0-b74f-ad7322603a8c" containerID="a4574e049717c1dc72b040d8500aa99fb994a72047e275fd1e91b84afaf903f3" exitCode=143 Oct 09 14:06:20 crc kubenswrapper[4902]: I1009 14:06:20.528139 4902 generic.go:334] "Generic (PLEG): container finished" podID="ee7c6723-ebca-46a0-b74f-ad7322603a8c" containerID="f2e4229d74660d52f6a02cea7cfcb304ad45b5289e0c34d5b1565b99bd542a60" exitCode=143 Oct 09 14:06:20 crc kubenswrapper[4902]: I1009 14:06:20.528206 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ee7c6723-ebca-46a0-b74f-ad7322603a8c","Type":"ContainerDied","Data":"a4574e049717c1dc72b040d8500aa99fb994a72047e275fd1e91b84afaf903f3"} Oct 09 14:06:20 crc kubenswrapper[4902]: I1009 14:06:20.528257 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ee7c6723-ebca-46a0-b74f-ad7322603a8c","Type":"ContainerDied","Data":"f2e4229d74660d52f6a02cea7cfcb304ad45b5289e0c34d5b1565b99bd542a60"} Oct 09 14:06:20 crc kubenswrapper[4902]: I1009 14:06:20.528266 4902 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 09 14:06:20 crc kubenswrapper[4902]: I1009 14:06:20.528293 4902 scope.go:117] "RemoveContainer" containerID="a4574e049717c1dc72b040d8500aa99fb994a72047e275fd1e91b84afaf903f3" Oct 09 14:06:20 crc kubenswrapper[4902]: I1009 14:06:20.528274 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ee7c6723-ebca-46a0-b74f-ad7322603a8c","Type":"ContainerDied","Data":"c4ef5b1ad614fd1c73eba24087444b87bdc73437b349bcca30597074a2d5f144"} Oct 09 14:06:20 crc kubenswrapper[4902]: I1009 14:06:20.548194 4902 generic.go:334] "Generic (PLEG): container finished" podID="b125855d-ef08-4d86-b4ca-a89b06964590" containerID="d65f098b3327a5bc6b4c231a31a7ae200c84332a143b46c70378caac2afd7525" exitCode=0 Oct 09 14:06:20 crc kubenswrapper[4902]: I1009 14:06:20.548304 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-fce1-account-create-8hwdj" event={"ID":"b125855d-ef08-4d86-b4ca-a89b06964590","Type":"ContainerDied","Data":"d65f098b3327a5bc6b4c231a31a7ae200c84332a143b46c70378caac2afd7525"} Oct 09 14:06:20 crc kubenswrapper[4902]: I1009 14:06:20.550364 4902 generic.go:334] "Generic (PLEG): container finished" podID="453bc6c7-4789-47fb-a83d-21062c6069dd" containerID="0d4a3a589386a9cd13bfb09da356a1cfd0ca577ef1346af683beed1ceea21658" exitCode=0 Oct 09 14:06:20 crc kubenswrapper[4902]: I1009 14:06:20.550452 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-4963-account-create-z5jmr" event={"ID":"453bc6c7-4789-47fb-a83d-21062c6069dd","Type":"ContainerDied","Data":"0d4a3a589386a9cd13bfb09da356a1cfd0ca577ef1346af683beed1ceea21658"} Oct 09 14:06:20 crc kubenswrapper[4902]: I1009 14:06:20.553699 4902 generic.go:334] "Generic (PLEG): container finished" 
podID="cf729a68-4f96-4839-8c65-8be1543d04da" containerID="54b7945ee3849ebd9c01e5165f770199cb6deab03768d690792ab8b6342d1f91" exitCode=0 Oct 09 14:06:20 crc kubenswrapper[4902]: I1009 14:06:20.554454 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-97aa-account-create-tzndb" event={"ID":"cf729a68-4f96-4839-8c65-8be1543d04da","Type":"ContainerDied","Data":"54b7945ee3849ebd9c01e5165f770199cb6deab03768d690792ab8b6342d1f91"} Oct 09 14:06:20 crc kubenswrapper[4902]: I1009 14:06:20.580868 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=6.580844183 podStartE2EDuration="6.580844183s" podCreationTimestamp="2025-10-09 14:06:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 14:06:20.568155728 +0000 UTC m=+927.766014802" watchObservedRunningTime="2025-10-09 14:06:20.580844183 +0000 UTC m=+927.778703247" Oct 09 14:06:20 crc kubenswrapper[4902]: I1009 14:06:20.600572 4902 scope.go:117] "RemoveContainer" containerID="f2e4229d74660d52f6a02cea7cfcb304ad45b5289e0c34d5b1565b99bd542a60" Oct 09 14:06:20 crc kubenswrapper[4902]: I1009 14:06:20.630844 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 09 14:06:20 crc kubenswrapper[4902]: I1009 14:06:20.646119 4902 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 09 14:06:20 crc kubenswrapper[4902]: I1009 14:06:20.684713 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 09 14:06:20 crc kubenswrapper[4902]: E1009 14:06:20.685791 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee7c6723-ebca-46a0-b74f-ad7322603a8c" containerName="glance-httpd" Oct 09 14:06:20 crc kubenswrapper[4902]: I1009 14:06:20.685816 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee7c6723-ebca-46a0-b74f-ad7322603a8c" containerName="glance-httpd" Oct 09 14:06:20 crc kubenswrapper[4902]: E1009 14:06:20.685833 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb082784-fdb2-4a03-b069-0ab81e675535" containerName="init" Oct 09 14:06:20 crc kubenswrapper[4902]: I1009 14:06:20.685841 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb082784-fdb2-4a03-b069-0ab81e675535" containerName="init" Oct 09 14:06:20 crc kubenswrapper[4902]: E1009 14:06:20.685873 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b3f279a-1c66-4a4b-a4a9-8ebbe329a658" containerName="init" Oct 09 14:06:20 crc kubenswrapper[4902]: I1009 14:06:20.685880 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b3f279a-1c66-4a4b-a4a9-8ebbe329a658" containerName="init" Oct 09 14:06:20 crc kubenswrapper[4902]: E1009 14:06:20.685914 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee7c6723-ebca-46a0-b74f-ad7322603a8c" containerName="glance-log" Oct 09 14:06:20 crc kubenswrapper[4902]: I1009 14:06:20.685921 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee7c6723-ebca-46a0-b74f-ad7322603a8c" containerName="glance-log" Oct 09 14:06:20 crc kubenswrapper[4902]: I1009 14:06:20.686125 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee7c6723-ebca-46a0-b74f-ad7322603a8c" containerName="glance-log" Oct 09 14:06:20 crc kubenswrapper[4902]: I1009 14:06:20.686150 4902 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="8b3f279a-1c66-4a4b-a4a9-8ebbe329a658" containerName="init" Oct 09 14:06:20 crc kubenswrapper[4902]: I1009 14:06:20.686167 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb082784-fdb2-4a03-b069-0ab81e675535" containerName="init" Oct 09 14:06:20 crc kubenswrapper[4902]: I1009 14:06:20.686179 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee7c6723-ebca-46a0-b74f-ad7322603a8c" containerName="glance-httpd" Oct 09 14:06:20 crc kubenswrapper[4902]: I1009 14:06:20.687536 4902 scope.go:117] "RemoveContainer" containerID="a4574e049717c1dc72b040d8500aa99fb994a72047e275fd1e91b84afaf903f3" Oct 09 14:06:20 crc kubenswrapper[4902]: I1009 14:06:20.687705 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 09 14:06:20 crc kubenswrapper[4902]: E1009 14:06:20.689211 4902 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a4574e049717c1dc72b040d8500aa99fb994a72047e275fd1e91b84afaf903f3\": container with ID starting with a4574e049717c1dc72b040d8500aa99fb994a72047e275fd1e91b84afaf903f3 not found: ID does not exist" containerID="a4574e049717c1dc72b040d8500aa99fb994a72047e275fd1e91b84afaf903f3" Oct 09 14:06:20 crc kubenswrapper[4902]: I1009 14:06:20.689255 4902 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a4574e049717c1dc72b040d8500aa99fb994a72047e275fd1e91b84afaf903f3"} err="failed to get container status \"a4574e049717c1dc72b040d8500aa99fb994a72047e275fd1e91b84afaf903f3\": rpc error: code = NotFound desc = could not find container \"a4574e049717c1dc72b040d8500aa99fb994a72047e275fd1e91b84afaf903f3\": container with ID starting with a4574e049717c1dc72b040d8500aa99fb994a72047e275fd1e91b84afaf903f3 not found: ID does not exist" Oct 09 14:06:20 crc kubenswrapper[4902]: I1009 14:06:20.689297 4902 scope.go:117] "RemoveContainer" containerID="f2e4229d74660d52f6a02cea7cfcb304ad45b5289e0c34d5b1565b99bd542a60" Oct 09 14:06:20 crc kubenswrapper[4902]: E1009 14:06:20.689713 4902 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f2e4229d74660d52f6a02cea7cfcb304ad45b5289e0c34d5b1565b99bd542a60\": container with ID starting with f2e4229d74660d52f6a02cea7cfcb304ad45b5289e0c34d5b1565b99bd542a60 not found: ID does not exist" containerID="f2e4229d74660d52f6a02cea7cfcb304ad45b5289e0c34d5b1565b99bd542a60" Oct 09 14:06:20 crc kubenswrapper[4902]: I1009 14:06:20.689755 4902 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f2e4229d74660d52f6a02cea7cfcb304ad45b5289e0c34d5b1565b99bd542a60"} err="failed to get container status \"f2e4229d74660d52f6a02cea7cfcb304ad45b5289e0c34d5b1565b99bd542a60\": rpc error: code = NotFound desc = could not find container \"f2e4229d74660d52f6a02cea7cfcb304ad45b5289e0c34d5b1565b99bd542a60\": container with ID starting with f2e4229d74660d52f6a02cea7cfcb304ad45b5289e0c34d5b1565b99bd542a60 not found: ID does not exist" Oct 09 14:06:20 crc kubenswrapper[4902]: I1009 14:06:20.689791 4902 scope.go:117] "RemoveContainer" containerID="a4574e049717c1dc72b040d8500aa99fb994a72047e275fd1e91b84afaf903f3" Oct 09 14:06:20 crc kubenswrapper[4902]: I1009 14:06:20.693033 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 09 14:06:20 crc kubenswrapper[4902]: I1009 14:06:20.693460 4902 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a4574e049717c1dc72b040d8500aa99fb994a72047e275fd1e91b84afaf903f3"} err="failed to get container status \"a4574e049717c1dc72b040d8500aa99fb994a72047e275fd1e91b84afaf903f3\": rpc error: code = NotFound desc = could not find container \"a4574e049717c1dc72b040d8500aa99fb994a72047e275fd1e91b84afaf903f3\": container with ID starting with a4574e049717c1dc72b040d8500aa99fb994a72047e275fd1e91b84afaf903f3 not found: ID does not exist" Oct 09 14:06:20 crc kubenswrapper[4902]: I1009 14:06:20.693507 4902 scope.go:117] "RemoveContainer" containerID="f2e4229d74660d52f6a02cea7cfcb304ad45b5289e0c34d5b1565b99bd542a60" Oct 09 14:06:20 crc kubenswrapper[4902]: I1009 14:06:20.694179 4902 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f2e4229d74660d52f6a02cea7cfcb304ad45b5289e0c34d5b1565b99bd542a60"} err="failed to get container status \"f2e4229d74660d52f6a02cea7cfcb304ad45b5289e0c34d5b1565b99bd542a60\": rpc error: code = NotFound desc = could not find container \"f2e4229d74660d52f6a02cea7cfcb304ad45b5289e0c34d5b1565b99bd542a60\": container with ID starting with f2e4229d74660d52f6a02cea7cfcb304ad45b5289e0c34d5b1565b99bd542a60 not found: ID does not exist" Oct 09 14:06:20 crc kubenswrapper[4902]: I1009 14:06:20.704590 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 09 14:06:20 crc kubenswrapper[4902]: I1009 14:06:20.859266 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/56e3bd81-2ed5-4cdc-a302-90e5c3e6bb37-logs\") pod \"glance-default-external-api-0\" (UID: \"56e3bd81-2ed5-4cdc-a302-90e5c3e6bb37\") " pod="openstack/glance-default-external-api-0" Oct 09 14:06:20 crc kubenswrapper[4902]: I1009 14:06:20.859314 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56e3bd81-2ed5-4cdc-a302-90e5c3e6bb37-config-data\") pod \"glance-default-external-api-0\" (UID: \"56e3bd81-2ed5-4cdc-a302-90e5c3e6bb37\") " pod="openstack/glance-default-external-api-0" Oct 09 14:06:20 crc kubenswrapper[4902]: I1009 14:06:20.859359 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56e3bd81-2ed5-4cdc-a302-90e5c3e6bb37-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"56e3bd81-2ed5-4cdc-a302-90e5c3e6bb37\") " pod="openstack/glance-default-external-api-0" Oct 09 14:06:20 crc kubenswrapper[4902]: I1009 14:06:20.859576 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/56e3bd81-2ed5-4cdc-a302-90e5c3e6bb37-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"56e3bd81-2ed5-4cdc-a302-90e5c3e6bb37\") " pod="openstack/glance-default-external-api-0" Oct 09 14:06:20 crc kubenswrapper[4902]: I1009 14:06:20.859630 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"56e3bd81-2ed5-4cdc-a302-90e5c3e6bb37\") " pod="openstack/glance-default-external-api-0" Oct 09 14:06:20 crc kubenswrapper[4902]: I1009 14:06:20.859682 4902 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/56e3bd81-2ed5-4cdc-a302-90e5c3e6bb37-scripts\") pod \"glance-default-external-api-0\" (UID: \"56e3bd81-2ed5-4cdc-a302-90e5c3e6bb37\") " pod="openstack/glance-default-external-api-0" Oct 09 14:06:20 crc kubenswrapper[4902]: I1009 14:06:20.859719 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lq9jf\" (UniqueName: \"kubernetes.io/projected/56e3bd81-2ed5-4cdc-a302-90e5c3e6bb37-kube-api-access-lq9jf\") pod \"glance-default-external-api-0\" (UID: \"56e3bd81-2ed5-4cdc-a302-90e5c3e6bb37\") " pod="openstack/glance-default-external-api-0" Oct 09 14:06:20 crc kubenswrapper[4902]: I1009 14:06:20.961571 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/56e3bd81-2ed5-4cdc-a302-90e5c3e6bb37-logs\") pod \"glance-default-external-api-0\" (UID: \"56e3bd81-2ed5-4cdc-a302-90e5c3e6bb37\") " pod="openstack/glance-default-external-api-0" Oct 09 14:06:20 crc kubenswrapper[4902]: I1009 14:06:20.961684 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56e3bd81-2ed5-4cdc-a302-90e5c3e6bb37-config-data\") pod \"glance-default-external-api-0\" (UID: \"56e3bd81-2ed5-4cdc-a302-90e5c3e6bb37\") " pod="openstack/glance-default-external-api-0" Oct 09 14:06:20 crc kubenswrapper[4902]: I1009 14:06:20.962253 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/56e3bd81-2ed5-4cdc-a302-90e5c3e6bb37-logs\") pod \"glance-default-external-api-0\" (UID: \"56e3bd81-2ed5-4cdc-a302-90e5c3e6bb37\") " pod="openstack/glance-default-external-api-0" Oct 09 14:06:20 crc kubenswrapper[4902]: I1009 14:06:20.962588 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56e3bd81-2ed5-4cdc-a302-90e5c3e6bb37-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"56e3bd81-2ed5-4cdc-a302-90e5c3e6bb37\") " pod="openstack/glance-default-external-api-0" Oct 09 14:06:20 crc kubenswrapper[4902]: I1009 14:06:20.962681 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/56e3bd81-2ed5-4cdc-a302-90e5c3e6bb37-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"56e3bd81-2ed5-4cdc-a302-90e5c3e6bb37\") " pod="openstack/glance-default-external-api-0" Oct 09 14:06:20 crc kubenswrapper[4902]: I1009 14:06:20.962710 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"56e3bd81-2ed5-4cdc-a302-90e5c3e6bb37\") " pod="openstack/glance-default-external-api-0" Oct 09 14:06:20 crc kubenswrapper[4902]: I1009 14:06:20.962742 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/56e3bd81-2ed5-4cdc-a302-90e5c3e6bb37-scripts\") pod \"glance-default-external-api-0\" (UID: \"56e3bd81-2ed5-4cdc-a302-90e5c3e6bb37\") " pod="openstack/glance-default-external-api-0" Oct 09 14:06:20 crc kubenswrapper[4902]: I1009 14:06:20.962775 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lq9jf\" (UniqueName: 
\"kubernetes.io/projected/56e3bd81-2ed5-4cdc-a302-90e5c3e6bb37-kube-api-access-lq9jf\") pod \"glance-default-external-api-0\" (UID: \"56e3bd81-2ed5-4cdc-a302-90e5c3e6bb37\") " pod="openstack/glance-default-external-api-0" Oct 09 14:06:20 crc kubenswrapper[4902]: I1009 14:06:20.963476 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/56e3bd81-2ed5-4cdc-a302-90e5c3e6bb37-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"56e3bd81-2ed5-4cdc-a302-90e5c3e6bb37\") " pod="openstack/glance-default-external-api-0" Oct 09 14:06:20 crc kubenswrapper[4902]: I1009 14:06:20.963675 4902 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"56e3bd81-2ed5-4cdc-a302-90e5c3e6bb37\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/glance-default-external-api-0" Oct 09 14:06:20 crc kubenswrapper[4902]: I1009 14:06:20.968068 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/56e3bd81-2ed5-4cdc-a302-90e5c3e6bb37-scripts\") pod \"glance-default-external-api-0\" (UID: \"56e3bd81-2ed5-4cdc-a302-90e5c3e6bb37\") " pod="openstack/glance-default-external-api-0" Oct 09 14:06:20 crc kubenswrapper[4902]: I1009 14:06:20.968085 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56e3bd81-2ed5-4cdc-a302-90e5c3e6bb37-config-data\") pod \"glance-default-external-api-0\" (UID: \"56e3bd81-2ed5-4cdc-a302-90e5c3e6bb37\") " pod="openstack/glance-default-external-api-0" Oct 09 14:06:20 crc kubenswrapper[4902]: I1009 14:06:20.969532 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56e3bd81-2ed5-4cdc-a302-90e5c3e6bb37-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"56e3bd81-2ed5-4cdc-a302-90e5c3e6bb37\") " pod="openstack/glance-default-external-api-0" Oct 09 14:06:20 crc kubenswrapper[4902]: I1009 14:06:20.986566 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lq9jf\" (UniqueName: \"kubernetes.io/projected/56e3bd81-2ed5-4cdc-a302-90e5c3e6bb37-kube-api-access-lq9jf\") pod \"glance-default-external-api-0\" (UID: \"56e3bd81-2ed5-4cdc-a302-90e5c3e6bb37\") " pod="openstack/glance-default-external-api-0" Oct 09 14:06:21 crc kubenswrapper[4902]: I1009 14:06:21.008458 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"56e3bd81-2ed5-4cdc-a302-90e5c3e6bb37\") " pod="openstack/glance-default-external-api-0" Oct 09 14:06:21 crc kubenswrapper[4902]: I1009 14:06:21.093854 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 09 14:06:21 crc kubenswrapper[4902]: I1009 14:06:21.524685 4902 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8b3f279a-1c66-4a4b-a4a9-8ebbe329a658" path="/var/lib/kubelet/pods/8b3f279a-1c66-4a4b-a4a9-8ebbe329a658/volumes" Oct 09 14:06:21 crc kubenswrapper[4902]: I1009 14:06:21.525819 4902 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee7c6723-ebca-46a0-b74f-ad7322603a8c" path="/var/lib/kubelet/pods/ee7c6723-ebca-46a0-b74f-ad7322603a8c/volumes" Oct 09 14:06:21 crc kubenswrapper[4902]: I1009 14:06:21.526451 4902 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb082784-fdb2-4a03-b069-0ab81e675535" path="/var/lib/kubelet/pods/fb082784-fdb2-4a03-b069-0ab81e675535/volumes" Oct 09 14:06:21 crc kubenswrapper[4902]: I1009 14:06:21.564837 4902 generic.go:334] "Generic (PLEG): container finished" podID="605240c8-9c62-4813-9954-57e1e5b2c742" containerID="a6f8b8fcf57192356ceb344ff307f2ffb9f0284c957cf29a6e1a23f3d5243907" exitCode=0 Oct 09 14:06:21 crc kubenswrapper[4902]: I1009 14:06:21.564868 4902 generic.go:334] "Generic (PLEG): container finished" podID="605240c8-9c62-4813-9954-57e1e5b2c742" containerID="9c405260b4dd192a2567f78ac5d8ba0557f3df5b6284600546dd13bf12f31707" exitCode=143 Oct 09 14:06:21 crc kubenswrapper[4902]: I1009 14:06:21.565345 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"605240c8-9c62-4813-9954-57e1e5b2c742","Type":"ContainerDied","Data":"a6f8b8fcf57192356ceb344ff307f2ffb9f0284c957cf29a6e1a23f3d5243907"} Oct 09 14:06:21 crc kubenswrapper[4902]: I1009 14:06:21.565431 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"605240c8-9c62-4813-9954-57e1e5b2c742","Type":"ContainerDied","Data":"9c405260b4dd192a2567f78ac5d8ba0557f3df5b6284600546dd13bf12f31707"} Oct 09 14:06:25 crc kubenswrapper[4902]: I1009 14:06:25.753566 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-8b5c85b87-pfr9v" Oct 09 14:06:25 crc kubenswrapper[4902]: I1009 14:06:25.823036 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-wtqhb"] Oct 09 14:06:25 crc kubenswrapper[4902]: I1009 14:06:25.823308 4902 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-77585f5f8c-wtqhb" podUID="120030af-2560-4346-b479-fbecbad7c23e" containerName="dnsmasq-dns" containerID="cri-o://c754d8e263589a40ba5c58c0a76717ee00ec54a76618684145984a8f063cbfa1" gracePeriod=10 Oct 09 14:06:26 crc kubenswrapper[4902]: I1009 14:06:26.636019 4902 generic.go:334] "Generic (PLEG): container finished" podID="120030af-2560-4346-b479-fbecbad7c23e" containerID="c754d8e263589a40ba5c58c0a76717ee00ec54a76618684145984a8f063cbfa1" exitCode=0 Oct 09 14:06:26 crc kubenswrapper[4902]: I1009 14:06:26.636079 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77585f5f8c-wtqhb" event={"ID":"120030af-2560-4346-b479-fbecbad7c23e","Type":"ContainerDied","Data":"c754d8e263589a40ba5c58c0a76717ee00ec54a76618684145984a8f063cbfa1"} Oct 09 14:06:27 crc kubenswrapper[4902]: I1009 14:06:27.760197 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 09 14:06:28 crc kubenswrapper[4902]: I1009 14:06:28.112672 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/horizon-6f8f55c9df-4zdc6"] Oct 09 14:06:28 crc kubenswrapper[4902]: I1009 14:06:28.146963 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-5b9b5fc8fb-nlcpz"] Oct 09 14:06:28 crc kubenswrapper[4902]: I1009 14:06:28.149020 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5b9b5fc8fb-nlcpz" Oct 09 14:06:28 crc kubenswrapper[4902]: I1009 14:06:28.152345 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc" Oct 09 14:06:28 crc kubenswrapper[4902]: I1009 14:06:28.175824 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5b9b5fc8fb-nlcpz"] Oct 09 14:06:28 crc kubenswrapper[4902]: I1009 14:06:28.244314 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6c8d64bf69-wprv5"] Oct 09 14:06:28 crc kubenswrapper[4902]: I1009 14:06:28.286028 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-779d95f9fb-tfjvq"] Oct 09 14:06:28 crc kubenswrapper[4902]: I1009 14:06:28.287956 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-779d95f9fb-tfjvq" Oct 09 14:06:28 crc kubenswrapper[4902]: I1009 14:06:28.308978 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-779d95f9fb-tfjvq"] Oct 09 14:06:28 crc kubenswrapper[4902]: I1009 14:06:28.340355 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j9b9t\" (UniqueName: \"kubernetes.io/projected/ba9ae197-8325-4a07-a174-31f7f2e29978-kube-api-access-j9b9t\") pod \"horizon-5b9b5fc8fb-nlcpz\" (UID: \"ba9ae197-8325-4a07-a174-31f7f2e29978\") " pod="openstack/horizon-5b9b5fc8fb-nlcpz" Oct 09 14:06:28 crc kubenswrapper[4902]: I1009 14:06:28.340439 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ba9ae197-8325-4a07-a174-31f7f2e29978-scripts\") pod \"horizon-5b9b5fc8fb-nlcpz\" (UID: \"ba9ae197-8325-4a07-a174-31f7f2e29978\") " pod="openstack/horizon-5b9b5fc8fb-nlcpz" Oct 09 14:06:28 crc kubenswrapper[4902]: I1009 14:06:28.340521 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ba9ae197-8325-4a07-a174-31f7f2e29978-logs\") pod \"horizon-5b9b5fc8fb-nlcpz\" (UID: \"ba9ae197-8325-4a07-a174-31f7f2e29978\") " pod="openstack/horizon-5b9b5fc8fb-nlcpz" Oct 09 14:06:28 crc kubenswrapper[4902]: I1009 14:06:28.340617 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ba9ae197-8325-4a07-a174-31f7f2e29978-config-data\") pod \"horizon-5b9b5fc8fb-nlcpz\" (UID: \"ba9ae197-8325-4a07-a174-31f7f2e29978\") " pod="openstack/horizon-5b9b5fc8fb-nlcpz" Oct 09 14:06:28 crc kubenswrapper[4902]: I1009 14:06:28.340648 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba9ae197-8325-4a07-a174-31f7f2e29978-combined-ca-bundle\") pod \"horizon-5b9b5fc8fb-nlcpz\" (UID: \"ba9ae197-8325-4a07-a174-31f7f2e29978\") " pod="openstack/horizon-5b9b5fc8fb-nlcpz" Oct 09 14:06:28 crc kubenswrapper[4902]: I1009 14:06:28.340685 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/ba9ae197-8325-4a07-a174-31f7f2e29978-horizon-tls-certs\") pod \"horizon-5b9b5fc8fb-nlcpz\" (UID: \"ba9ae197-8325-4a07-a174-31f7f2e29978\") " pod="openstack/horizon-5b9b5fc8fb-nlcpz" Oct 09 14:06:28 crc kubenswrapper[4902]: I1009 14:06:28.340782 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ba9ae197-8325-4a07-a174-31f7f2e29978-horizon-secret-key\") pod \"horizon-5b9b5fc8fb-nlcpz\" (UID: \"ba9ae197-8325-4a07-a174-31f7f2e29978\") " pod="openstack/horizon-5b9b5fc8fb-nlcpz" Oct 09 14:06:28 crc kubenswrapper[4902]: I1009 14:06:28.442115 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/40e0f94d-30a4-456b-bfd4-7da1453facc4-scripts\") pod \"horizon-779d95f9fb-tfjvq\" (UID: \"40e0f94d-30a4-456b-bfd4-7da1453facc4\") " pod="openstack/horizon-779d95f9fb-tfjvq" Oct 09 14:06:28 crc kubenswrapper[4902]: I1009 14:06:28.442483 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j9b9t\" (UniqueName: \"kubernetes.io/projected/ba9ae197-8325-4a07-a174-31f7f2e29978-kube-api-access-j9b9t\") pod \"horizon-5b9b5fc8fb-nlcpz\" (UID: \"ba9ae197-8325-4a07-a174-31f7f2e29978\") " pod="openstack/horizon-5b9b5fc8fb-nlcpz" Oct 09 14:06:28 crc kubenswrapper[4902]: I1009 14:06:28.442505 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ba9ae197-8325-4a07-a174-31f7f2e29978-scripts\") pod \"horizon-5b9b5fc8fb-nlcpz\" (UID: \"ba9ae197-8325-4a07-a174-31f7f2e29978\") " pod="openstack/horizon-5b9b5fc8fb-nlcpz" Oct 09 14:06:28 crc kubenswrapper[4902]: I1009 14:06:28.442531 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/40e0f94d-30a4-456b-bfd4-7da1453facc4-logs\") pod \"horizon-779d95f9fb-tfjvq\" (UID: \"40e0f94d-30a4-456b-bfd4-7da1453facc4\") " pod="openstack/horizon-779d95f9fb-tfjvq" Oct 09 14:06:28 crc kubenswrapper[4902]: I1009 14:06:28.442558 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n4kw4\" (UniqueName: \"kubernetes.io/projected/40e0f94d-30a4-456b-bfd4-7da1453facc4-kube-api-access-n4kw4\") pod \"horizon-779d95f9fb-tfjvq\" (UID: \"40e0f94d-30a4-456b-bfd4-7da1453facc4\") " pod="openstack/horizon-779d95f9fb-tfjvq" Oct 09 14:06:28 crc kubenswrapper[4902]: I1009 14:06:28.442618 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ba9ae197-8325-4a07-a174-31f7f2e29978-logs\") pod \"horizon-5b9b5fc8fb-nlcpz\" (UID: \"ba9ae197-8325-4a07-a174-31f7f2e29978\") " pod="openstack/horizon-5b9b5fc8fb-nlcpz" Oct 09 14:06:28 crc kubenswrapper[4902]: I1009 14:06:28.442682 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ba9ae197-8325-4a07-a174-31f7f2e29978-config-data\") pod \"horizon-5b9b5fc8fb-nlcpz\" (UID: \"ba9ae197-8325-4a07-a174-31f7f2e29978\") " pod="openstack/horizon-5b9b5fc8fb-nlcpz" Oct 09 14:06:28 crc kubenswrapper[4902]: I1009 14:06:28.442733 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/40e0f94d-30a4-456b-bfd4-7da1453facc4-config-data\") pod \"horizon-779d95f9fb-tfjvq\" (UID: \"40e0f94d-30a4-456b-bfd4-7da1453facc4\") " pod="openstack/horizon-779d95f9fb-tfjvq" Oct 09 14:06:28 crc kubenswrapper[4902]: I1009 14:06:28.442783 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba9ae197-8325-4a07-a174-31f7f2e29978-combined-ca-bundle\") pod \"horizon-5b9b5fc8fb-nlcpz\" (UID: \"ba9ae197-8325-4a07-a174-31f7f2e29978\") " pod="openstack/horizon-5b9b5fc8fb-nlcpz" Oct 09 14:06:28 crc kubenswrapper[4902]: I1009 14:06:28.442821 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40e0f94d-30a4-456b-bfd4-7da1453facc4-combined-ca-bundle\") pod \"horizon-779d95f9fb-tfjvq\" (UID: \"40e0f94d-30a4-456b-bfd4-7da1453facc4\") " pod="openstack/horizon-779d95f9fb-tfjvq" Oct 09 14:06:28 crc kubenswrapper[4902]: I1009 14:06:28.443271 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/ba9ae197-8325-4a07-a174-31f7f2e29978-horizon-tls-certs\") pod \"horizon-5b9b5fc8fb-nlcpz\" (UID: \"ba9ae197-8325-4a07-a174-31f7f2e29978\") " pod="openstack/horizon-5b9b5fc8fb-nlcpz" Oct 09 14:06:28 crc kubenswrapper[4902]: I1009 14:06:28.443319 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ba9ae197-8325-4a07-a174-31f7f2e29978-logs\") pod \"horizon-5b9b5fc8fb-nlcpz\" (UID: \"ba9ae197-8325-4a07-a174-31f7f2e29978\") " pod="openstack/horizon-5b9b5fc8fb-nlcpz" Oct 09 14:06:28 crc kubenswrapper[4902]: I1009 14:06:28.444236 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ba9ae197-8325-4a07-a174-31f7f2e29978-scripts\") pod \"horizon-5b9b5fc8fb-nlcpz\" (UID: \"ba9ae197-8325-4a07-a174-31f7f2e29978\") " pod="openstack/horizon-5b9b5fc8fb-nlcpz" Oct 09 14:06:28 crc kubenswrapper[4902]: I1009 14:06:28.444716 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/40e0f94d-30a4-456b-bfd4-7da1453facc4-horizon-tls-certs\") pod \"horizon-779d95f9fb-tfjvq\" (UID: \"40e0f94d-30a4-456b-bfd4-7da1453facc4\") " pod="openstack/horizon-779d95f9fb-tfjvq" Oct 09 14:06:28 crc kubenswrapper[4902]: I1009 14:06:28.444930 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ba9ae197-8325-4a07-a174-31f7f2e29978-horizon-secret-key\") pod \"horizon-5b9b5fc8fb-nlcpz\" (UID: \"ba9ae197-8325-4a07-a174-31f7f2e29978\") " pod="openstack/horizon-5b9b5fc8fb-nlcpz" Oct 09 14:06:28 crc kubenswrapper[4902]: I1009 14:06:28.445043 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/40e0f94d-30a4-456b-bfd4-7da1453facc4-horizon-secret-key\") pod \"horizon-779d95f9fb-tfjvq\" (UID: \"40e0f94d-30a4-456b-bfd4-7da1453facc4\") " pod="openstack/horizon-779d95f9fb-tfjvq" Oct 09 14:06:28 crc kubenswrapper[4902]: I1009 14:06:28.445646 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ba9ae197-8325-4a07-a174-31f7f2e29978-config-data\") pod 
\"horizon-5b9b5fc8fb-nlcpz\" (UID: \"ba9ae197-8325-4a07-a174-31f7f2e29978\") " pod="openstack/horizon-5b9b5fc8fb-nlcpz" Oct 09 14:06:28 crc kubenswrapper[4902]: I1009 14:06:28.450107 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ba9ae197-8325-4a07-a174-31f7f2e29978-horizon-secret-key\") pod \"horizon-5b9b5fc8fb-nlcpz\" (UID: \"ba9ae197-8325-4a07-a174-31f7f2e29978\") " pod="openstack/horizon-5b9b5fc8fb-nlcpz" Oct 09 14:06:28 crc kubenswrapper[4902]: I1009 14:06:28.450487 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/ba9ae197-8325-4a07-a174-31f7f2e29978-horizon-tls-certs\") pod \"horizon-5b9b5fc8fb-nlcpz\" (UID: \"ba9ae197-8325-4a07-a174-31f7f2e29978\") " pod="openstack/horizon-5b9b5fc8fb-nlcpz" Oct 09 14:06:28 crc kubenswrapper[4902]: I1009 14:06:28.454394 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba9ae197-8325-4a07-a174-31f7f2e29978-combined-ca-bundle\") pod \"horizon-5b9b5fc8fb-nlcpz\" (UID: \"ba9ae197-8325-4a07-a174-31f7f2e29978\") " pod="openstack/horizon-5b9b5fc8fb-nlcpz" Oct 09 14:06:28 crc kubenswrapper[4902]: I1009 14:06:28.461879 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j9b9t\" (UniqueName: \"kubernetes.io/projected/ba9ae197-8325-4a07-a174-31f7f2e29978-kube-api-access-j9b9t\") pod \"horizon-5b9b5fc8fb-nlcpz\" (UID: \"ba9ae197-8325-4a07-a174-31f7f2e29978\") " pod="openstack/horizon-5b9b5fc8fb-nlcpz" Oct 09 14:06:28 crc kubenswrapper[4902]: I1009 14:06:28.479331 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5b9b5fc8fb-nlcpz" Oct 09 14:06:28 crc kubenswrapper[4902]: I1009 14:06:28.547039 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/40e0f94d-30a4-456b-bfd4-7da1453facc4-scripts\") pod \"horizon-779d95f9fb-tfjvq\" (UID: \"40e0f94d-30a4-456b-bfd4-7da1453facc4\") " pod="openstack/horizon-779d95f9fb-tfjvq" Oct 09 14:06:28 crc kubenswrapper[4902]: I1009 14:06:28.547109 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/40e0f94d-30a4-456b-bfd4-7da1453facc4-logs\") pod \"horizon-779d95f9fb-tfjvq\" (UID: \"40e0f94d-30a4-456b-bfd4-7da1453facc4\") " pod="openstack/horizon-779d95f9fb-tfjvq" Oct 09 14:06:28 crc kubenswrapper[4902]: I1009 14:06:28.547140 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n4kw4\" (UniqueName: \"kubernetes.io/projected/40e0f94d-30a4-456b-bfd4-7da1453facc4-kube-api-access-n4kw4\") pod \"horizon-779d95f9fb-tfjvq\" (UID: \"40e0f94d-30a4-456b-bfd4-7da1453facc4\") " pod="openstack/horizon-779d95f9fb-tfjvq" Oct 09 14:06:28 crc kubenswrapper[4902]: I1009 14:06:28.547202 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/40e0f94d-30a4-456b-bfd4-7da1453facc4-config-data\") pod \"horizon-779d95f9fb-tfjvq\" (UID: \"40e0f94d-30a4-456b-bfd4-7da1453facc4\") " pod="openstack/horizon-779d95f9fb-tfjvq" Oct 09 14:06:28 crc kubenswrapper[4902]: I1009 14:06:28.547227 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/40e0f94d-30a4-456b-bfd4-7da1453facc4-combined-ca-bundle\") pod \"horizon-779d95f9fb-tfjvq\" (UID: \"40e0f94d-30a4-456b-bfd4-7da1453facc4\") " pod="openstack/horizon-779d95f9fb-tfjvq" Oct 09 14:06:28 crc kubenswrapper[4902]: I1009 14:06:28.547273 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/40e0f94d-30a4-456b-bfd4-7da1453facc4-horizon-tls-certs\") pod \"horizon-779d95f9fb-tfjvq\" (UID: \"40e0f94d-30a4-456b-bfd4-7da1453facc4\") " pod="openstack/horizon-779d95f9fb-tfjvq" Oct 09 14:06:28 crc kubenswrapper[4902]: I1009 14:06:28.547303 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/40e0f94d-30a4-456b-bfd4-7da1453facc4-horizon-secret-key\") pod \"horizon-779d95f9fb-tfjvq\" (UID: \"40e0f94d-30a4-456b-bfd4-7da1453facc4\") " pod="openstack/horizon-779d95f9fb-tfjvq" Oct 09 14:06:28 crc kubenswrapper[4902]: I1009 14:06:28.548599 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/40e0f94d-30a4-456b-bfd4-7da1453facc4-logs\") pod \"horizon-779d95f9fb-tfjvq\" (UID: \"40e0f94d-30a4-456b-bfd4-7da1453facc4\") " pod="openstack/horizon-779d95f9fb-tfjvq" Oct 09 14:06:28 crc kubenswrapper[4902]: I1009 14:06:28.549365 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/40e0f94d-30a4-456b-bfd4-7da1453facc4-config-data\") pod \"horizon-779d95f9fb-tfjvq\" (UID: \"40e0f94d-30a4-456b-bfd4-7da1453facc4\") " pod="openstack/horizon-779d95f9fb-tfjvq" Oct 09 14:06:28 crc kubenswrapper[4902]: I1009 14:06:28.549764 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/40e0f94d-30a4-456b-bfd4-7da1453facc4-scripts\") pod \"horizon-779d95f9fb-tfjvq\" (UID: \"40e0f94d-30a4-456b-bfd4-7da1453facc4\") " pod="openstack/horizon-779d95f9fb-tfjvq" Oct 09 14:06:28 crc kubenswrapper[4902]: I1009 14:06:28.553106 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40e0f94d-30a4-456b-bfd4-7da1453facc4-combined-ca-bundle\") pod \"horizon-779d95f9fb-tfjvq\" (UID: \"40e0f94d-30a4-456b-bfd4-7da1453facc4\") " pod="openstack/horizon-779d95f9fb-tfjvq" Oct 09 14:06:28 crc kubenswrapper[4902]: I1009 14:06:28.553117 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/40e0f94d-30a4-456b-bfd4-7da1453facc4-horizon-secret-key\") pod \"horizon-779d95f9fb-tfjvq\" (UID: \"40e0f94d-30a4-456b-bfd4-7da1453facc4\") " pod="openstack/horizon-779d95f9fb-tfjvq" Oct 09 14:06:28 crc kubenswrapper[4902]: I1009 14:06:28.553426 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/40e0f94d-30a4-456b-bfd4-7da1453facc4-horizon-tls-certs\") pod \"horizon-779d95f9fb-tfjvq\" (UID: \"40e0f94d-30a4-456b-bfd4-7da1453facc4\") " pod="openstack/horizon-779d95f9fb-tfjvq" Oct 09 14:06:28 crc kubenswrapper[4902]: I1009 14:06:28.568756 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n4kw4\" (UniqueName: \"kubernetes.io/projected/40e0f94d-30a4-456b-bfd4-7da1453facc4-kube-api-access-n4kw4\") pod \"horizon-779d95f9fb-tfjvq\" (UID: \"40e0f94d-30a4-456b-bfd4-7da1453facc4\") " 
pod="openstack/horizon-779d95f9fb-tfjvq" Oct 09 14:06:28 crc kubenswrapper[4902]: I1009 14:06:28.605829 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-779d95f9fb-tfjvq" Oct 09 14:06:30 crc kubenswrapper[4902]: I1009 14:06:30.297288 4902 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-77585f5f8c-wtqhb" podUID="120030af-2560-4346-b479-fbecbad7c23e" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.126:5353: connect: connection refused" Oct 09 14:06:30 crc kubenswrapper[4902]: I1009 14:06:30.430527 4902 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-64rsx" Oct 09 14:06:30 crc kubenswrapper[4902]: I1009 14:06:30.580698 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d872d3cb-00ca-449c-98e2-69eb864ee9cb-credential-keys\") pod \"d872d3cb-00ca-449c-98e2-69eb864ee9cb\" (UID: \"d872d3cb-00ca-449c-98e2-69eb864ee9cb\") " Oct 09 14:06:30 crc kubenswrapper[4902]: I1009 14:06:30.580765 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d872d3cb-00ca-449c-98e2-69eb864ee9cb-combined-ca-bundle\") pod \"d872d3cb-00ca-449c-98e2-69eb864ee9cb\" (UID: \"d872d3cb-00ca-449c-98e2-69eb864ee9cb\") " Oct 09 14:06:30 crc kubenswrapper[4902]: I1009 14:06:30.580838 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p5648\" (UniqueName: \"kubernetes.io/projected/d872d3cb-00ca-449c-98e2-69eb864ee9cb-kube-api-access-p5648\") pod \"d872d3cb-00ca-449c-98e2-69eb864ee9cb\" (UID: \"d872d3cb-00ca-449c-98e2-69eb864ee9cb\") " Oct 09 14:06:30 crc kubenswrapper[4902]: I1009 14:06:30.580918 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d872d3cb-00ca-449c-98e2-69eb864ee9cb-scripts\") pod \"d872d3cb-00ca-449c-98e2-69eb864ee9cb\" (UID: \"d872d3cb-00ca-449c-98e2-69eb864ee9cb\") " Oct 09 14:06:30 crc kubenswrapper[4902]: I1009 14:06:30.580959 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d872d3cb-00ca-449c-98e2-69eb864ee9cb-config-data\") pod \"d872d3cb-00ca-449c-98e2-69eb864ee9cb\" (UID: \"d872d3cb-00ca-449c-98e2-69eb864ee9cb\") " Oct 09 14:06:30 crc kubenswrapper[4902]: I1009 14:06:30.580981 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d872d3cb-00ca-449c-98e2-69eb864ee9cb-fernet-keys\") pod \"d872d3cb-00ca-449c-98e2-69eb864ee9cb\" (UID: \"d872d3cb-00ca-449c-98e2-69eb864ee9cb\") " Oct 09 14:06:30 crc kubenswrapper[4902]: I1009 14:06:30.596538 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d872d3cb-00ca-449c-98e2-69eb864ee9cb-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "d872d3cb-00ca-449c-98e2-69eb864ee9cb" (UID: "d872d3cb-00ca-449c-98e2-69eb864ee9cb"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:06:30 crc kubenswrapper[4902]: I1009 14:06:30.613697 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d872d3cb-00ca-449c-98e2-69eb864ee9cb-kube-api-access-p5648" (OuterVolumeSpecName: "kube-api-access-p5648") pod "d872d3cb-00ca-449c-98e2-69eb864ee9cb" (UID: "d872d3cb-00ca-449c-98e2-69eb864ee9cb"). InnerVolumeSpecName "kube-api-access-p5648". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 14:06:30 crc kubenswrapper[4902]: I1009 14:06:30.617976 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d872d3cb-00ca-449c-98e2-69eb864ee9cb-scripts" (OuterVolumeSpecName: "scripts") pod "d872d3cb-00ca-449c-98e2-69eb864ee9cb" (UID: "d872d3cb-00ca-449c-98e2-69eb864ee9cb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:06:30 crc kubenswrapper[4902]: I1009 14:06:30.618353 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d872d3cb-00ca-449c-98e2-69eb864ee9cb-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "d872d3cb-00ca-449c-98e2-69eb864ee9cb" (UID: "d872d3cb-00ca-449c-98e2-69eb864ee9cb"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:06:30 crc kubenswrapper[4902]: I1009 14:06:30.645314 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d872d3cb-00ca-449c-98e2-69eb864ee9cb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d872d3cb-00ca-449c-98e2-69eb864ee9cb" (UID: "d872d3cb-00ca-449c-98e2-69eb864ee9cb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:06:30 crc kubenswrapper[4902]: I1009 14:06:30.660004 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d872d3cb-00ca-449c-98e2-69eb864ee9cb-config-data" (OuterVolumeSpecName: "config-data") pod "d872d3cb-00ca-449c-98e2-69eb864ee9cb" (UID: "d872d3cb-00ca-449c-98e2-69eb864ee9cb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:06:30 crc kubenswrapper[4902]: I1009 14:06:30.675104 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-64rsx" event={"ID":"d872d3cb-00ca-449c-98e2-69eb864ee9cb","Type":"ContainerDied","Data":"f3593f7e59a1b72e244971490c3f0bc9b8c2fbc1dcafc0ef6069966b92578b68"} Oct 09 14:06:30 crc kubenswrapper[4902]: I1009 14:06:30.675450 4902 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f3593f7e59a1b72e244971490c3f0bc9b8c2fbc1dcafc0ef6069966b92578b68" Oct 09 14:06:30 crc kubenswrapper[4902]: I1009 14:06:30.675157 4902 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-64rsx" Oct 09 14:06:30 crc kubenswrapper[4902]: I1009 14:06:30.682950 4902 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d872d3cb-00ca-449c-98e2-69eb864ee9cb-credential-keys\") on node \"crc\" DevicePath \"\"" Oct 09 14:06:30 crc kubenswrapper[4902]: I1009 14:06:30.682983 4902 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d872d3cb-00ca-449c-98e2-69eb864ee9cb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 14:06:30 crc kubenswrapper[4902]: I1009 14:06:30.682994 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p5648\" (UniqueName: \"kubernetes.io/projected/d872d3cb-00ca-449c-98e2-69eb864ee9cb-kube-api-access-p5648\") on node \"crc\" DevicePath \"\"" Oct 09 14:06:30 crc kubenswrapper[4902]: I1009 14:06:30.683005 4902 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d872d3cb-00ca-449c-98e2-69eb864ee9cb-scripts\") on node \"crc\" DevicePath \"\"" Oct 09 14:06:30 crc kubenswrapper[4902]: I1009 14:06:30.683014 4902 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d872d3cb-00ca-449c-98e2-69eb864ee9cb-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 14:06:30 crc kubenswrapper[4902]: I1009 14:06:30.683024 4902 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d872d3cb-00ca-449c-98e2-69eb864ee9cb-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 09 14:06:31 crc kubenswrapper[4902]: I1009 14:06:31.626829 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-64rsx"] Oct 09 14:06:31 crc kubenswrapper[4902]: I1009 14:06:31.634512 4902 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-64rsx"] Oct 09 14:06:31 crc kubenswrapper[4902]: I1009 14:06:31.722100 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-74gvb"] Oct 09 14:06:31 crc kubenswrapper[4902]: E1009 14:06:31.722783 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d872d3cb-00ca-449c-98e2-69eb864ee9cb" containerName="keystone-bootstrap" Oct 09 14:06:31 crc kubenswrapper[4902]: I1009 14:06:31.722802 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="d872d3cb-00ca-449c-98e2-69eb864ee9cb" containerName="keystone-bootstrap" Oct 09 14:06:31 crc kubenswrapper[4902]: I1009 14:06:31.722971 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="d872d3cb-00ca-449c-98e2-69eb864ee9cb" containerName="keystone-bootstrap" Oct 09 14:06:31 crc kubenswrapper[4902]: I1009 14:06:31.726554 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-74gvb" Oct 09 14:06:31 crc kubenswrapper[4902]: I1009 14:06:31.730563 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 09 14:06:31 crc kubenswrapper[4902]: I1009 14:06:31.730812 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 09 14:06:31 crc kubenswrapper[4902]: I1009 14:06:31.730956 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 09 14:06:31 crc kubenswrapper[4902]: I1009 14:06:31.730953 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-ws99n" Oct 09 14:06:31 crc kubenswrapper[4902]: I1009 14:06:31.737274 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-74gvb"] Oct 09 14:06:31 crc kubenswrapper[4902]: I1009 14:06:31.809185 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f9fd56c3-4837-4fad-b343-b2edc61b0605-fernet-keys\") pod \"keystone-bootstrap-74gvb\" (UID: \"f9fd56c3-4837-4fad-b343-b2edc61b0605\") " pod="openstack/keystone-bootstrap-74gvb" Oct 09 14:06:31 crc kubenswrapper[4902]: I1009 14:06:31.809249 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f9fd56c3-4837-4fad-b343-b2edc61b0605-scripts\") pod \"keystone-bootstrap-74gvb\" (UID: \"f9fd56c3-4837-4fad-b343-b2edc61b0605\") " pod="openstack/keystone-bootstrap-74gvb" Oct 09 14:06:31 crc kubenswrapper[4902]: I1009 14:06:31.809270 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f9fd56c3-4837-4fad-b343-b2edc61b0605-credential-keys\") pod \"keystone-bootstrap-74gvb\" (UID: \"f9fd56c3-4837-4fad-b343-b2edc61b0605\") " pod="openstack/keystone-bootstrap-74gvb" Oct 09 14:06:31 crc kubenswrapper[4902]: I1009 14:06:31.809315 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jmst5\" (UniqueName: \"kubernetes.io/projected/f9fd56c3-4837-4fad-b343-b2edc61b0605-kube-api-access-jmst5\") pod \"keystone-bootstrap-74gvb\" (UID: \"f9fd56c3-4837-4fad-b343-b2edc61b0605\") " pod="openstack/keystone-bootstrap-74gvb" Oct 09 14:06:31 crc kubenswrapper[4902]: I1009 14:06:31.809634 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9fd56c3-4837-4fad-b343-b2edc61b0605-combined-ca-bundle\") pod \"keystone-bootstrap-74gvb\" (UID: \"f9fd56c3-4837-4fad-b343-b2edc61b0605\") " pod="openstack/keystone-bootstrap-74gvb" Oct 09 14:06:31 crc kubenswrapper[4902]: I1009 14:06:31.809682 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9fd56c3-4837-4fad-b343-b2edc61b0605-config-data\") pod \"keystone-bootstrap-74gvb\" (UID: \"f9fd56c3-4837-4fad-b343-b2edc61b0605\") " pod="openstack/keystone-bootstrap-74gvb" Oct 09 14:06:31 crc kubenswrapper[4902]: I1009 14:06:31.911677 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f9fd56c3-4837-4fad-b343-b2edc61b0605-scripts\") pod \"keystone-bootstrap-74gvb\" (UID: 
\"f9fd56c3-4837-4fad-b343-b2edc61b0605\") " pod="openstack/keystone-bootstrap-74gvb" Oct 09 14:06:31 crc kubenswrapper[4902]: I1009 14:06:31.911727 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f9fd56c3-4837-4fad-b343-b2edc61b0605-credential-keys\") pod \"keystone-bootstrap-74gvb\" (UID: \"f9fd56c3-4837-4fad-b343-b2edc61b0605\") " pod="openstack/keystone-bootstrap-74gvb" Oct 09 14:06:31 crc kubenswrapper[4902]: I1009 14:06:31.911779 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jmst5\" (UniqueName: \"kubernetes.io/projected/f9fd56c3-4837-4fad-b343-b2edc61b0605-kube-api-access-jmst5\") pod \"keystone-bootstrap-74gvb\" (UID: \"f9fd56c3-4837-4fad-b343-b2edc61b0605\") " pod="openstack/keystone-bootstrap-74gvb" Oct 09 14:06:31 crc kubenswrapper[4902]: I1009 14:06:31.911853 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9fd56c3-4837-4fad-b343-b2edc61b0605-combined-ca-bundle\") pod \"keystone-bootstrap-74gvb\" (UID: \"f9fd56c3-4837-4fad-b343-b2edc61b0605\") " pod="openstack/keystone-bootstrap-74gvb" Oct 09 14:06:31 crc kubenswrapper[4902]: I1009 14:06:31.911870 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9fd56c3-4837-4fad-b343-b2edc61b0605-config-data\") pod \"keystone-bootstrap-74gvb\" (UID: \"f9fd56c3-4837-4fad-b343-b2edc61b0605\") " pod="openstack/keystone-bootstrap-74gvb" Oct 09 14:06:31 crc kubenswrapper[4902]: I1009 14:06:31.911900 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f9fd56c3-4837-4fad-b343-b2edc61b0605-fernet-keys\") pod \"keystone-bootstrap-74gvb\" (UID: \"f9fd56c3-4837-4fad-b343-b2edc61b0605\") " pod="openstack/keystone-bootstrap-74gvb" Oct 09 14:06:31 crc kubenswrapper[4902]: I1009 14:06:31.917359 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f9fd56c3-4837-4fad-b343-b2edc61b0605-scripts\") pod \"keystone-bootstrap-74gvb\" (UID: \"f9fd56c3-4837-4fad-b343-b2edc61b0605\") " pod="openstack/keystone-bootstrap-74gvb" Oct 09 14:06:31 crc kubenswrapper[4902]: I1009 14:06:31.917606 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f9fd56c3-4837-4fad-b343-b2edc61b0605-fernet-keys\") pod \"keystone-bootstrap-74gvb\" (UID: \"f9fd56c3-4837-4fad-b343-b2edc61b0605\") " pod="openstack/keystone-bootstrap-74gvb" Oct 09 14:06:31 crc kubenswrapper[4902]: I1009 14:06:31.918022 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9fd56c3-4837-4fad-b343-b2edc61b0605-combined-ca-bundle\") pod \"keystone-bootstrap-74gvb\" (UID: \"f9fd56c3-4837-4fad-b343-b2edc61b0605\") " pod="openstack/keystone-bootstrap-74gvb" Oct 09 14:06:31 crc kubenswrapper[4902]: I1009 14:06:31.919045 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f9fd56c3-4837-4fad-b343-b2edc61b0605-credential-keys\") pod \"keystone-bootstrap-74gvb\" (UID: \"f9fd56c3-4837-4fad-b343-b2edc61b0605\") " pod="openstack/keystone-bootstrap-74gvb" Oct 09 14:06:31 crc kubenswrapper[4902]: I1009 14:06:31.927439 4902 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jmst5\" (UniqueName: \"kubernetes.io/projected/f9fd56c3-4837-4fad-b343-b2edc61b0605-kube-api-access-jmst5\") pod \"keystone-bootstrap-74gvb\" (UID: \"f9fd56c3-4837-4fad-b343-b2edc61b0605\") " pod="openstack/keystone-bootstrap-74gvb" Oct 09 14:06:31 crc kubenswrapper[4902]: I1009 14:06:31.927453 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9fd56c3-4837-4fad-b343-b2edc61b0605-config-data\") pod \"keystone-bootstrap-74gvb\" (UID: \"f9fd56c3-4837-4fad-b343-b2edc61b0605\") " pod="openstack/keystone-bootstrap-74gvb" Oct 09 14:06:32 crc kubenswrapper[4902]: I1009 14:06:32.052275 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-74gvb" Oct 09 14:06:32 crc kubenswrapper[4902]: E1009 14:06:32.193061 4902 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-placement-api:current-podified" Oct 09 14:06:32 crc kubenswrapper[4902]: E1009 14:06:32.193404 4902 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:placement-db-sync,Image:quay.io/podified-antelope-centos9/openstack-placement-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/placement,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:placement-dbsync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-87l49,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42482,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-db-sync-jqrks_openstack(d3545e73-7c0f-4e1a-b012-da9e7f35a0b5): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" 
Oct 09 14:06:32 crc kubenswrapper[4902]: E1009 14:06:32.194572 4902 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/placement-db-sync-jqrks" podUID="d3545e73-7c0f-4e1a-b012-da9e7f35a0b5" Oct 09 14:06:32 crc kubenswrapper[4902]: E1009 14:06:32.205175 4902 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Oct 09 14:06:32 crc kubenswrapper[4902]: E1009 14:06:32.205345 4902 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F /var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n95hf8h557h567h658h685h68bh5dchbfh97h98h55fh57ch5bdhb7hbfh85h594h54h575h54fh7h674h566h656h64dh5cfh9h5bdh585h664h54q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qqlvt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-659b68cf89-l9cwg_openstack(2bfb704d-cf57-4181-b6d1-c5884492984a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 09 14:06:32 crc kubenswrapper[4902]: I1009 14:06:32.354626 4902 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 09 14:06:32 crc kubenswrapper[4902]: I1009 14:06:32.368864 4902 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-fce1-account-create-8hwdj" Oct 09 14:06:32 crc kubenswrapper[4902]: I1009 14:06:32.368968 4902 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-97aa-account-create-tzndb" Oct 09 14:06:32 crc kubenswrapper[4902]: I1009 14:06:32.387314 4902 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-4963-account-create-z5jmr" Oct 09 14:06:32 crc kubenswrapper[4902]: I1009 14:06:32.390578 4902 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77585f5f8c-wtqhb" Oct 09 14:06:32 crc kubenswrapper[4902]: I1009 14:06:32.535292 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/605240c8-9c62-4813-9954-57e1e5b2c742-scripts\") pod \"605240c8-9c62-4813-9954-57e1e5b2c742\" (UID: \"605240c8-9c62-4813-9954-57e1e5b2c742\") " Oct 09 14:06:32 crc kubenswrapper[4902]: I1009 14:06:32.535422 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bhzxs\" (UniqueName: \"kubernetes.io/projected/b125855d-ef08-4d86-b4ca-a89b06964590-kube-api-access-bhzxs\") pod \"b125855d-ef08-4d86-b4ca-a89b06964590\" (UID: \"b125855d-ef08-4d86-b4ca-a89b06964590\") " Oct 09 14:06:32 crc kubenswrapper[4902]: I1009 14:06:32.535474 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/120030af-2560-4346-b479-fbecbad7c23e-ovsdbserver-nb\") pod \"120030af-2560-4346-b479-fbecbad7c23e\" (UID: \"120030af-2560-4346-b479-fbecbad7c23e\") " Oct 09 14:06:32 crc kubenswrapper[4902]: I1009 14:06:32.535566 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/605240c8-9c62-4813-9954-57e1e5b2c742-config-data\") pod \"605240c8-9c62-4813-9954-57e1e5b2c742\" (UID: \"605240c8-9c62-4813-9954-57e1e5b2c742\") " Oct 09 14:06:32 crc kubenswrapper[4902]: I1009 14:06:32.535603 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/120030af-2560-4346-b479-fbecbad7c23e-config\") pod \"120030af-2560-4346-b479-fbecbad7c23e\" (UID: \"120030af-2560-4346-b479-fbecbad7c23e\") " Oct 09 14:06:32 crc kubenswrapper[4902]: I1009 14:06:32.535627 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/605240c8-9c62-4813-9954-57e1e5b2c742-combined-ca-bundle\") pod \"605240c8-9c62-4813-9954-57e1e5b2c742\" (UID: \"605240c8-9c62-4813-9954-57e1e5b2c742\") " Oct 09 14:06:32 crc kubenswrapper[4902]: I1009 14:06:32.535663 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8rhxv\" (UniqueName: \"kubernetes.io/projected/120030af-2560-4346-b479-fbecbad7c23e-kube-api-access-8rhxv\") pod \"120030af-2560-4346-b479-fbecbad7c23e\" (UID: \"120030af-2560-4346-b479-fbecbad7c23e\") " Oct 09 14:06:32 crc kubenswrapper[4902]: I1009 14:06:32.535941 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"605240c8-9c62-4813-9954-57e1e5b2c742\" (UID: \"605240c8-9c62-4813-9954-57e1e5b2c742\") " Oct 09 14:06:32 crc kubenswrapper[4902]: I1009 14:06:32.536005 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/605240c8-9c62-4813-9954-57e1e5b2c742-httpd-run\") pod \"605240c8-9c62-4813-9954-57e1e5b2c742\" (UID: \"605240c8-9c62-4813-9954-57e1e5b2c742\") " Oct 09 14:06:32 crc kubenswrapper[4902]: I1009 14:06:32.536031 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" 
(UniqueName: \"kubernetes.io/configmap/120030af-2560-4346-b479-fbecbad7c23e-dns-svc\") pod \"120030af-2560-4346-b479-fbecbad7c23e\" (UID: \"120030af-2560-4346-b479-fbecbad7c23e\") " Oct 09 14:06:32 crc kubenswrapper[4902]: I1009 14:06:32.536054 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/120030af-2560-4346-b479-fbecbad7c23e-ovsdbserver-sb\") pod \"120030af-2560-4346-b479-fbecbad7c23e\" (UID: \"120030af-2560-4346-b479-fbecbad7c23e\") " Oct 09 14:06:32 crc kubenswrapper[4902]: I1009 14:06:32.536079 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/120030af-2560-4346-b479-fbecbad7c23e-dns-swift-storage-0\") pod \"120030af-2560-4346-b479-fbecbad7c23e\" (UID: \"120030af-2560-4346-b479-fbecbad7c23e\") " Oct 09 14:06:32 crc kubenswrapper[4902]: I1009 14:06:32.536118 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lckg8\" (UniqueName: \"kubernetes.io/projected/453bc6c7-4789-47fb-a83d-21062c6069dd-kube-api-access-lckg8\") pod \"453bc6c7-4789-47fb-a83d-21062c6069dd\" (UID: \"453bc6c7-4789-47fb-a83d-21062c6069dd\") " Oct 09 14:06:32 crc kubenswrapper[4902]: I1009 14:06:32.536155 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/605240c8-9c62-4813-9954-57e1e5b2c742-logs\") pod \"605240c8-9c62-4813-9954-57e1e5b2c742\" (UID: \"605240c8-9c62-4813-9954-57e1e5b2c742\") " Oct 09 14:06:32 crc kubenswrapper[4902]: I1009 14:06:32.536204 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7qdvn\" (UniqueName: \"kubernetes.io/projected/605240c8-9c62-4813-9954-57e1e5b2c742-kube-api-access-7qdvn\") pod \"605240c8-9c62-4813-9954-57e1e5b2c742\" (UID: \"605240c8-9c62-4813-9954-57e1e5b2c742\") " Oct 09 14:06:32 crc kubenswrapper[4902]: I1009 14:06:32.536307 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2tf2b\" (UniqueName: \"kubernetes.io/projected/cf729a68-4f96-4839-8c65-8be1543d04da-kube-api-access-2tf2b\") pod \"cf729a68-4f96-4839-8c65-8be1543d04da\" (UID: \"cf729a68-4f96-4839-8c65-8be1543d04da\") " Oct 09 14:06:32 crc kubenswrapper[4902]: I1009 14:06:32.537880 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/605240c8-9c62-4813-9954-57e1e5b2c742-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "605240c8-9c62-4813-9954-57e1e5b2c742" (UID: "605240c8-9c62-4813-9954-57e1e5b2c742"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 14:06:32 crc kubenswrapper[4902]: I1009 14:06:32.542265 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/605240c8-9c62-4813-9954-57e1e5b2c742-logs" (OuterVolumeSpecName: "logs") pod "605240c8-9c62-4813-9954-57e1e5b2c742" (UID: "605240c8-9c62-4813-9954-57e1e5b2c742"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 14:06:32 crc kubenswrapper[4902]: I1009 14:06:32.570805 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf729a68-4f96-4839-8c65-8be1543d04da-kube-api-access-2tf2b" (OuterVolumeSpecName: "kube-api-access-2tf2b") pod "cf729a68-4f96-4839-8c65-8be1543d04da" (UID: "cf729a68-4f96-4839-8c65-8be1543d04da"). 
InnerVolumeSpecName "kube-api-access-2tf2b". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 14:06:32 crc kubenswrapper[4902]: I1009 14:06:32.572330 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/605240c8-9c62-4813-9954-57e1e5b2c742-scripts" (OuterVolumeSpecName: "scripts") pod "605240c8-9c62-4813-9954-57e1e5b2c742" (UID: "605240c8-9c62-4813-9954-57e1e5b2c742"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:06:32 crc kubenswrapper[4902]: I1009 14:06:32.573660 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b125855d-ef08-4d86-b4ca-a89b06964590-kube-api-access-bhzxs" (OuterVolumeSpecName: "kube-api-access-bhzxs") pod "b125855d-ef08-4d86-b4ca-a89b06964590" (UID: "b125855d-ef08-4d86-b4ca-a89b06964590"). InnerVolumeSpecName "kube-api-access-bhzxs". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 14:06:32 crc kubenswrapper[4902]: I1009 14:06:32.575579 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/453bc6c7-4789-47fb-a83d-21062c6069dd-kube-api-access-lckg8" (OuterVolumeSpecName: "kube-api-access-lckg8") pod "453bc6c7-4789-47fb-a83d-21062c6069dd" (UID: "453bc6c7-4789-47fb-a83d-21062c6069dd"). InnerVolumeSpecName "kube-api-access-lckg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 14:06:32 crc kubenswrapper[4902]: I1009 14:06:32.575664 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/120030af-2560-4346-b479-fbecbad7c23e-kube-api-access-8rhxv" (OuterVolumeSpecName: "kube-api-access-8rhxv") pod "120030af-2560-4346-b479-fbecbad7c23e" (UID: "120030af-2560-4346-b479-fbecbad7c23e"). InnerVolumeSpecName "kube-api-access-8rhxv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 14:06:32 crc kubenswrapper[4902]: I1009 14:06:32.606590 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "glance") pod "605240c8-9c62-4813-9954-57e1e5b2c742" (UID: "605240c8-9c62-4813-9954-57e1e5b2c742"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 09 14:06:32 crc kubenswrapper[4902]: I1009 14:06:32.606765 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/605240c8-9c62-4813-9954-57e1e5b2c742-kube-api-access-7qdvn" (OuterVolumeSpecName: "kube-api-access-7qdvn") pod "605240c8-9c62-4813-9954-57e1e5b2c742" (UID: "605240c8-9c62-4813-9954-57e1e5b2c742"). InnerVolumeSpecName "kube-api-access-7qdvn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 14:06:32 crc kubenswrapper[4902]: I1009 14:06:32.639107 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7qdvn\" (UniqueName: \"kubernetes.io/projected/605240c8-9c62-4813-9954-57e1e5b2c742-kube-api-access-7qdvn\") on node \"crc\" DevicePath \"\"" Oct 09 14:06:32 crc kubenswrapper[4902]: I1009 14:06:32.639160 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2tf2b\" (UniqueName: \"kubernetes.io/projected/cf729a68-4f96-4839-8c65-8be1543d04da-kube-api-access-2tf2b\") on node \"crc\" DevicePath \"\"" Oct 09 14:06:32 crc kubenswrapper[4902]: I1009 14:06:32.639178 4902 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/605240c8-9c62-4813-9954-57e1e5b2c742-scripts\") on node \"crc\" DevicePath \"\"" Oct 09 14:06:32 crc kubenswrapper[4902]: I1009 14:06:32.639191 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bhzxs\" (UniqueName: \"kubernetes.io/projected/b125855d-ef08-4d86-b4ca-a89b06964590-kube-api-access-bhzxs\") on node \"crc\" DevicePath \"\"" Oct 09 14:06:32 crc kubenswrapper[4902]: I1009 14:06:32.639204 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8rhxv\" (UniqueName: \"kubernetes.io/projected/120030af-2560-4346-b479-fbecbad7c23e-kube-api-access-8rhxv\") on node \"crc\" DevicePath \"\"" Oct 09 14:06:32 crc kubenswrapper[4902]: I1009 14:06:32.639241 4902 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Oct 09 14:06:32 crc kubenswrapper[4902]: I1009 14:06:32.639256 4902 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/605240c8-9c62-4813-9954-57e1e5b2c742-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 09 14:06:32 crc kubenswrapper[4902]: I1009 14:06:32.639269 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lckg8\" (UniqueName: \"kubernetes.io/projected/453bc6c7-4789-47fb-a83d-21062c6069dd-kube-api-access-lckg8\") on node \"crc\" DevicePath \"\"" Oct 09 14:06:32 crc kubenswrapper[4902]: I1009 14:06:32.639283 4902 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/605240c8-9c62-4813-9954-57e1e5b2c742-logs\") on node \"crc\" DevicePath \"\"" Oct 09 14:06:32 crc kubenswrapper[4902]: I1009 14:06:32.694428 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-97aa-account-create-tzndb" event={"ID":"cf729a68-4f96-4839-8c65-8be1543d04da","Type":"ContainerDied","Data":"c8444a6eb6a349b148d3661001d75858e47b487ff30a8e3bac3333c1c2e8ae94"} Oct 09 14:06:32 crc kubenswrapper[4902]: I1009 14:06:32.694468 4902 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-97aa-account-create-tzndb" Oct 09 14:06:32 crc kubenswrapper[4902]: I1009 14:06:32.694475 4902 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c8444a6eb6a349b148d3661001d75858e47b487ff30a8e3bac3333c1c2e8ae94" Oct 09 14:06:32 crc kubenswrapper[4902]: I1009 14:06:32.701519 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77585f5f8c-wtqhb" event={"ID":"120030af-2560-4346-b479-fbecbad7c23e","Type":"ContainerDied","Data":"24ed8c3e6528ff836cc0dc6bc94c078c71a21ce577e67027846621788d19a2c6"} Oct 09 14:06:32 crc kubenswrapper[4902]: I1009 14:06:32.701566 4902 scope.go:117] "RemoveContainer" containerID="c754d8e263589a40ba5c58c0a76717ee00ec54a76618684145984a8f063cbfa1" Oct 09 14:06:32 crc kubenswrapper[4902]: I1009 14:06:32.701629 4902 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77585f5f8c-wtqhb" Oct 09 14:06:32 crc kubenswrapper[4902]: I1009 14:06:32.709378 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"605240c8-9c62-4813-9954-57e1e5b2c742","Type":"ContainerDied","Data":"7dfd23fe9907835f15fd309d55a0987174739b8c2893a28e10681489365e2dd4"} Oct 09 14:06:32 crc kubenswrapper[4902]: I1009 14:06:32.710989 4902 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 09 14:06:32 crc kubenswrapper[4902]: I1009 14:06:32.715097 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-fce1-account-create-8hwdj" event={"ID":"b125855d-ef08-4d86-b4ca-a89b06964590","Type":"ContainerDied","Data":"030348d56284994032db5fee8c54ac4375db733e175359e550e9e6d954b94427"} Oct 09 14:06:32 crc kubenswrapper[4902]: I1009 14:06:32.715132 4902 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="030348d56284994032db5fee8c54ac4375db733e175359e550e9e6d954b94427" Oct 09 14:06:32 crc kubenswrapper[4902]: I1009 14:06:32.715190 4902 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-fce1-account-create-8hwdj" Oct 09 14:06:32 crc kubenswrapper[4902]: I1009 14:06:32.720872 4902 scope.go:117] "RemoveContainer" containerID="cc14b467e64783e2792c4a46e15f04af63c08c2a482f554958ad79d0cc4f5bfb" Oct 09 14:06:32 crc kubenswrapper[4902]: I1009 14:06:32.721813 4902 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-4963-account-create-z5jmr" Oct 09 14:06:32 crc kubenswrapper[4902]: I1009 14:06:32.721927 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-4963-account-create-z5jmr" event={"ID":"453bc6c7-4789-47fb-a83d-21062c6069dd","Type":"ContainerDied","Data":"b37de7d03070bb177ec93d05b5fc8e1b79a02e727934be7862ea3b3da2b13ce9"} Oct 09 14:06:32 crc kubenswrapper[4902]: I1009 14:06:32.721976 4902 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b37de7d03070bb177ec93d05b5fc8e1b79a02e727934be7862ea3b3da2b13ce9" Oct 09 14:06:32 crc kubenswrapper[4902]: E1009 14:06:32.731118 4902 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-placement-api:current-podified\\\"\"" pod="openstack/placement-db-sync-jqrks" podUID="d3545e73-7c0f-4e1a-b012-da9e7f35a0b5" Oct 09 14:06:32 crc kubenswrapper[4902]: E1009 14:06:32.738717 4902 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/horizon-659b68cf89-l9cwg" podUID="2bfb704d-cf57-4181-b6d1-c5884492984a" Oct 09 14:06:32 crc kubenswrapper[4902]: I1009 14:06:32.747930 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/605240c8-9c62-4813-9954-57e1e5b2c742-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "605240c8-9c62-4813-9954-57e1e5b2c742" (UID: "605240c8-9c62-4813-9954-57e1e5b2c742"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:06:32 crc kubenswrapper[4902]: I1009 14:06:32.755254 4902 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Oct 09 14:06:32 crc kubenswrapper[4902]: I1009 14:06:32.758166 4902 scope.go:117] "RemoveContainer" containerID="a6f8b8fcf57192356ceb344ff307f2ffb9f0284c957cf29a6e1a23f3d5243907" Oct 09 14:06:32 crc kubenswrapper[4902]: I1009 14:06:32.771922 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/120030af-2560-4346-b479-fbecbad7c23e-config" (OuterVolumeSpecName: "config") pod "120030af-2560-4346-b479-fbecbad7c23e" (UID: "120030af-2560-4346-b479-fbecbad7c23e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 14:06:32 crc kubenswrapper[4902]: I1009 14:06:32.788034 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/120030af-2560-4346-b479-fbecbad7c23e-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "120030af-2560-4346-b479-fbecbad7c23e" (UID: "120030af-2560-4346-b479-fbecbad7c23e"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 14:06:32 crc kubenswrapper[4902]: I1009 14:06:32.790616 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/120030af-2560-4346-b479-fbecbad7c23e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "120030af-2560-4346-b479-fbecbad7c23e" (UID: "120030af-2560-4346-b479-fbecbad7c23e"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 14:06:32 crc kubenswrapper[4902]: I1009 14:06:32.792589 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/120030af-2560-4346-b479-fbecbad7c23e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "120030af-2560-4346-b479-fbecbad7c23e" (UID: "120030af-2560-4346-b479-fbecbad7c23e"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 14:06:32 crc kubenswrapper[4902]: I1009 14:06:32.797361 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/605240c8-9c62-4813-9954-57e1e5b2c742-config-data" (OuterVolumeSpecName: "config-data") pod "605240c8-9c62-4813-9954-57e1e5b2c742" (UID: "605240c8-9c62-4813-9954-57e1e5b2c742"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:06:32 crc kubenswrapper[4902]: I1009 14:06:32.802298 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/120030af-2560-4346-b479-fbecbad7c23e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "120030af-2560-4346-b479-fbecbad7c23e" (UID: "120030af-2560-4346-b479-fbecbad7c23e"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 14:06:32 crc kubenswrapper[4902]: I1009 14:06:32.804186 4902 scope.go:117] "RemoveContainer" containerID="9c405260b4dd192a2567f78ac5d8ba0557f3df5b6284600546dd13bf12f31707" Oct 09 14:06:32 crc kubenswrapper[4902]: I1009 14:06:32.839430 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-779d95f9fb-tfjvq"] Oct 09 14:06:32 crc kubenswrapper[4902]: I1009 14:06:32.842092 4902 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Oct 09 14:06:32 crc kubenswrapper[4902]: I1009 14:06:32.842132 4902 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/120030af-2560-4346-b479-fbecbad7c23e-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 09 14:06:32 crc kubenswrapper[4902]: I1009 14:06:32.842145 4902 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/120030af-2560-4346-b479-fbecbad7c23e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 09 14:06:32 crc kubenswrapper[4902]: I1009 14:06:32.842156 4902 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/120030af-2560-4346-b479-fbecbad7c23e-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 09 14:06:32 crc kubenswrapper[4902]: I1009 14:06:32.842168 4902 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/120030af-2560-4346-b479-fbecbad7c23e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 09 14:06:32 crc kubenswrapper[4902]: I1009 14:06:32.842201 4902 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/605240c8-9c62-4813-9954-57e1e5b2c742-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 14:06:32 crc kubenswrapper[4902]: I1009 14:06:32.842215 4902 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/120030af-2560-4346-b479-fbecbad7c23e-config\") on node \"crc\" DevicePath \"\"" Oct 09 14:06:32 crc 
kubenswrapper[4902]: I1009 14:06:32.842227 4902 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/605240c8-9c62-4813-9954-57e1e5b2c742-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 14:06:32 crc kubenswrapper[4902]: I1009 14:06:32.928490 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 09 14:06:32 crc kubenswrapper[4902]: I1009 14:06:32.967779 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5b9b5fc8fb-nlcpz"] Oct 09 14:06:32 crc kubenswrapper[4902]: I1009 14:06:32.974954 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-74gvb"] Oct 09 14:06:32 crc kubenswrapper[4902]: W1009 14:06:32.991157 4902 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf9fd56c3_4837_4fad_b343_b2edc61b0605.slice/crio-67dfce0b0170926c59e820577a840bea6a046f8a5832d9173698585b7d4fe6c4 WatchSource:0}: Error finding container 67dfce0b0170926c59e820577a840bea6a046f8a5832d9173698585b7d4fe6c4: Status 404 returned error can't find the container with id 67dfce0b0170926c59e820577a840bea6a046f8a5832d9173698585b7d4fe6c4 Oct 09 14:06:33 crc kubenswrapper[4902]: I1009 14:06:33.209442 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-wtqhb"] Oct 09 14:06:33 crc kubenswrapper[4902]: I1009 14:06:33.237005 4902 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-wtqhb"] Oct 09 14:06:33 crc kubenswrapper[4902]: I1009 14:06:33.257175 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 09 14:06:33 crc kubenswrapper[4902]: I1009 14:06:33.265374 4902 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 09 14:06:33 crc kubenswrapper[4902]: I1009 14:06:33.275085 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 09 14:06:33 crc kubenswrapper[4902]: E1009 14:06:33.275495 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="453bc6c7-4789-47fb-a83d-21062c6069dd" containerName="mariadb-account-create" Oct 09 14:06:33 crc kubenswrapper[4902]: I1009 14:06:33.275507 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="453bc6c7-4789-47fb-a83d-21062c6069dd" containerName="mariadb-account-create" Oct 09 14:06:33 crc kubenswrapper[4902]: E1009 14:06:33.275523 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf729a68-4f96-4839-8c65-8be1543d04da" containerName="mariadb-account-create" Oct 09 14:06:33 crc kubenswrapper[4902]: I1009 14:06:33.275528 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf729a68-4f96-4839-8c65-8be1543d04da" containerName="mariadb-account-create" Oct 09 14:06:33 crc kubenswrapper[4902]: E1009 14:06:33.275540 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="605240c8-9c62-4813-9954-57e1e5b2c742" containerName="glance-log" Oct 09 14:06:33 crc kubenswrapper[4902]: I1009 14:06:33.275547 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="605240c8-9c62-4813-9954-57e1e5b2c742" containerName="glance-log" Oct 09 14:06:33 crc kubenswrapper[4902]: E1009 14:06:33.275560 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="120030af-2560-4346-b479-fbecbad7c23e" containerName="init" Oct 09 14:06:33 crc kubenswrapper[4902]: I1009 
14:06:33.275566 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="120030af-2560-4346-b479-fbecbad7c23e" containerName="init" Oct 09 14:06:33 crc kubenswrapper[4902]: E1009 14:06:33.275581 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="120030af-2560-4346-b479-fbecbad7c23e" containerName="dnsmasq-dns" Oct 09 14:06:33 crc kubenswrapper[4902]: I1009 14:06:33.275586 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="120030af-2560-4346-b479-fbecbad7c23e" containerName="dnsmasq-dns" Oct 09 14:06:33 crc kubenswrapper[4902]: E1009 14:06:33.275598 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b125855d-ef08-4d86-b4ca-a89b06964590" containerName="mariadb-account-create" Oct 09 14:06:33 crc kubenswrapper[4902]: I1009 14:06:33.275604 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="b125855d-ef08-4d86-b4ca-a89b06964590" containerName="mariadb-account-create" Oct 09 14:06:33 crc kubenswrapper[4902]: E1009 14:06:33.275616 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="605240c8-9c62-4813-9954-57e1e5b2c742" containerName="glance-httpd" Oct 09 14:06:33 crc kubenswrapper[4902]: I1009 14:06:33.275621 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="605240c8-9c62-4813-9954-57e1e5b2c742" containerName="glance-httpd" Oct 09 14:06:33 crc kubenswrapper[4902]: I1009 14:06:33.275781 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="605240c8-9c62-4813-9954-57e1e5b2c742" containerName="glance-log" Oct 09 14:06:33 crc kubenswrapper[4902]: I1009 14:06:33.275797 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="b125855d-ef08-4d86-b4ca-a89b06964590" containerName="mariadb-account-create" Oct 09 14:06:33 crc kubenswrapper[4902]: I1009 14:06:33.275807 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf729a68-4f96-4839-8c65-8be1543d04da" containerName="mariadb-account-create" Oct 09 14:06:33 crc kubenswrapper[4902]: I1009 14:06:33.275818 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="120030af-2560-4346-b479-fbecbad7c23e" containerName="dnsmasq-dns" Oct 09 14:06:33 crc kubenswrapper[4902]: I1009 14:06:33.275830 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="605240c8-9c62-4813-9954-57e1e5b2c742" containerName="glance-httpd" Oct 09 14:06:33 crc kubenswrapper[4902]: I1009 14:06:33.275841 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="453bc6c7-4789-47fb-a83d-21062c6069dd" containerName="mariadb-account-create" Oct 09 14:06:33 crc kubenswrapper[4902]: I1009 14:06:33.276758 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 09 14:06:33 crc kubenswrapper[4902]: I1009 14:06:33.278502 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Oct 09 14:06:33 crc kubenswrapper[4902]: I1009 14:06:33.279927 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Oct 09 14:06:33 crc kubenswrapper[4902]: I1009 14:06:33.283628 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 09 14:06:33 crc kubenswrapper[4902]: I1009 14:06:33.452901 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4731e204-78a9-4d3c-8763-ace5d7d97cf7-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"4731e204-78a9-4d3c-8763-ace5d7d97cf7\") " pod="openstack/glance-default-internal-api-0" Oct 09 14:06:33 crc kubenswrapper[4902]: I1009 14:06:33.453012 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4731e204-78a9-4d3c-8763-ace5d7d97cf7-scripts\") pod \"glance-default-internal-api-0\" (UID: \"4731e204-78a9-4d3c-8763-ace5d7d97cf7\") " pod="openstack/glance-default-internal-api-0" Oct 09 14:06:33 crc kubenswrapper[4902]: I1009 14:06:33.453054 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4731e204-78a9-4d3c-8763-ace5d7d97cf7-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"4731e204-78a9-4d3c-8763-ace5d7d97cf7\") " pod="openstack/glance-default-internal-api-0" Oct 09 14:06:33 crc kubenswrapper[4902]: I1009 14:06:33.453082 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4731e204-78a9-4d3c-8763-ace5d7d97cf7-logs\") pod \"glance-default-internal-api-0\" (UID: \"4731e204-78a9-4d3c-8763-ace5d7d97cf7\") " pod="openstack/glance-default-internal-api-0" Oct 09 14:06:33 crc kubenswrapper[4902]: I1009 14:06:33.453141 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wqt7\" (UniqueName: \"kubernetes.io/projected/4731e204-78a9-4d3c-8763-ace5d7d97cf7-kube-api-access-4wqt7\") pod \"glance-default-internal-api-0\" (UID: \"4731e204-78a9-4d3c-8763-ace5d7d97cf7\") " pod="openstack/glance-default-internal-api-0" Oct 09 14:06:33 crc kubenswrapper[4902]: I1009 14:06:33.453199 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4731e204-78a9-4d3c-8763-ace5d7d97cf7-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"4731e204-78a9-4d3c-8763-ace5d7d97cf7\") " pod="openstack/glance-default-internal-api-0" Oct 09 14:06:33 crc kubenswrapper[4902]: I1009 14:06:33.453221 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4731e204-78a9-4d3c-8763-ace5d7d97cf7-config-data\") pod \"glance-default-internal-api-0\" (UID: \"4731e204-78a9-4d3c-8763-ace5d7d97cf7\") " pod="openstack/glance-default-internal-api-0" Oct 09 14:06:33 crc kubenswrapper[4902]: I1009 14:06:33.453241 4902 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"4731e204-78a9-4d3c-8763-ace5d7d97cf7\") " pod="openstack/glance-default-internal-api-0" Oct 09 14:06:33 crc kubenswrapper[4902]: I1009 14:06:33.538971 4902 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="120030af-2560-4346-b479-fbecbad7c23e" path="/var/lib/kubelet/pods/120030af-2560-4346-b479-fbecbad7c23e/volumes" Oct 09 14:06:33 crc kubenswrapper[4902]: I1009 14:06:33.542571 4902 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="605240c8-9c62-4813-9954-57e1e5b2c742" path="/var/lib/kubelet/pods/605240c8-9c62-4813-9954-57e1e5b2c742/volumes" Oct 09 14:06:33 crc kubenswrapper[4902]: I1009 14:06:33.544551 4902 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d872d3cb-00ca-449c-98e2-69eb864ee9cb" path="/var/lib/kubelet/pods/d872d3cb-00ca-449c-98e2-69eb864ee9cb/volumes" Oct 09 14:06:33 crc kubenswrapper[4902]: I1009 14:06:33.555618 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4731e204-78a9-4d3c-8763-ace5d7d97cf7-config-data\") pod \"glance-default-internal-api-0\" (UID: \"4731e204-78a9-4d3c-8763-ace5d7d97cf7\") " pod="openstack/glance-default-internal-api-0" Oct 09 14:06:33 crc kubenswrapper[4902]: I1009 14:06:33.555707 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"4731e204-78a9-4d3c-8763-ace5d7d97cf7\") " pod="openstack/glance-default-internal-api-0" Oct 09 14:06:33 crc kubenswrapper[4902]: I1009 14:06:33.555799 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4731e204-78a9-4d3c-8763-ace5d7d97cf7-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"4731e204-78a9-4d3c-8763-ace5d7d97cf7\") " pod="openstack/glance-default-internal-api-0" Oct 09 14:06:33 crc kubenswrapper[4902]: I1009 14:06:33.556094 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4731e204-78a9-4d3c-8763-ace5d7d97cf7-scripts\") pod \"glance-default-internal-api-0\" (UID: \"4731e204-78a9-4d3c-8763-ace5d7d97cf7\") " pod="openstack/glance-default-internal-api-0" Oct 09 14:06:33 crc kubenswrapper[4902]: I1009 14:06:33.556263 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4731e204-78a9-4d3c-8763-ace5d7d97cf7-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"4731e204-78a9-4d3c-8763-ace5d7d97cf7\") " pod="openstack/glance-default-internal-api-0" Oct 09 14:06:33 crc kubenswrapper[4902]: I1009 14:06:33.556327 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4731e204-78a9-4d3c-8763-ace5d7d97cf7-logs\") pod \"glance-default-internal-api-0\" (UID: \"4731e204-78a9-4d3c-8763-ace5d7d97cf7\") " pod="openstack/glance-default-internal-api-0" Oct 09 14:06:33 crc kubenswrapper[4902]: I1009 14:06:33.556424 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4wqt7\" (UniqueName: 
\"kubernetes.io/projected/4731e204-78a9-4d3c-8763-ace5d7d97cf7-kube-api-access-4wqt7\") pod \"glance-default-internal-api-0\" (UID: \"4731e204-78a9-4d3c-8763-ace5d7d97cf7\") " pod="openstack/glance-default-internal-api-0" Oct 09 14:06:33 crc kubenswrapper[4902]: I1009 14:06:33.556529 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4731e204-78a9-4d3c-8763-ace5d7d97cf7-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"4731e204-78a9-4d3c-8763-ace5d7d97cf7\") " pod="openstack/glance-default-internal-api-0" Oct 09 14:06:33 crc kubenswrapper[4902]: I1009 14:06:33.557778 4902 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"4731e204-78a9-4d3c-8763-ace5d7d97cf7\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/glance-default-internal-api-0" Oct 09 14:06:33 crc kubenswrapper[4902]: I1009 14:06:33.561384 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4731e204-78a9-4d3c-8763-ace5d7d97cf7-logs\") pod \"glance-default-internal-api-0\" (UID: \"4731e204-78a9-4d3c-8763-ace5d7d97cf7\") " pod="openstack/glance-default-internal-api-0" Oct 09 14:06:33 crc kubenswrapper[4902]: I1009 14:06:33.561729 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4731e204-78a9-4d3c-8763-ace5d7d97cf7-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"4731e204-78a9-4d3c-8763-ace5d7d97cf7\") " pod="openstack/glance-default-internal-api-0" Oct 09 14:06:33 crc kubenswrapper[4902]: I1009 14:06:33.565791 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4731e204-78a9-4d3c-8763-ace5d7d97cf7-scripts\") pod \"glance-default-internal-api-0\" (UID: \"4731e204-78a9-4d3c-8763-ace5d7d97cf7\") " pod="openstack/glance-default-internal-api-0" Oct 09 14:06:33 crc kubenswrapper[4902]: I1009 14:06:33.572918 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4731e204-78a9-4d3c-8763-ace5d7d97cf7-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"4731e204-78a9-4d3c-8763-ace5d7d97cf7\") " pod="openstack/glance-default-internal-api-0" Oct 09 14:06:33 crc kubenswrapper[4902]: I1009 14:06:33.575616 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4731e204-78a9-4d3c-8763-ace5d7d97cf7-config-data\") pod \"glance-default-internal-api-0\" (UID: \"4731e204-78a9-4d3c-8763-ace5d7d97cf7\") " pod="openstack/glance-default-internal-api-0" Oct 09 14:06:33 crc kubenswrapper[4902]: I1009 14:06:33.588373 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4731e204-78a9-4d3c-8763-ace5d7d97cf7-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"4731e204-78a9-4d3c-8763-ace5d7d97cf7\") " pod="openstack/glance-default-internal-api-0" Oct 09 14:06:33 crc kubenswrapper[4902]: I1009 14:06:33.588977 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wqt7\" (UniqueName: \"kubernetes.io/projected/4731e204-78a9-4d3c-8763-ace5d7d97cf7-kube-api-access-4wqt7\") pod 
\"glance-default-internal-api-0\" (UID: \"4731e204-78a9-4d3c-8763-ace5d7d97cf7\") " pod="openstack/glance-default-internal-api-0" Oct 09 14:06:33 crc kubenswrapper[4902]: I1009 14:06:33.633633 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"4731e204-78a9-4d3c-8763-ace5d7d97cf7\") " pod="openstack/glance-default-internal-api-0" Oct 09 14:06:33 crc kubenswrapper[4902]: I1009 14:06:33.741912 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"56e3bd81-2ed5-4cdc-a302-90e5c3e6bb37","Type":"ContainerStarted","Data":"7648107539a17a6742111598981f92afad08f59daebf801ec90d14ee396c2bc4"} Oct 09 14:06:33 crc kubenswrapper[4902]: I1009 14:06:33.741967 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"56e3bd81-2ed5-4cdc-a302-90e5c3e6bb37","Type":"ContainerStarted","Data":"c484c0b6ad59a404ab18f5630eef3d7e80e2a387470536ba00e3f93e3573f913"} Oct 09 14:06:33 crc kubenswrapper[4902]: I1009 14:06:33.750823 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6c8d64bf69-wprv5" event={"ID":"de054665-327f-4e4f-b23c-d2f2ebc3bd04","Type":"ContainerStarted","Data":"85ec85e5a9c2d8fc19be2078040e1e8b57cda146fb710e97c4707c25a486e6c3"} Oct 09 14:06:33 crc kubenswrapper[4902]: I1009 14:06:33.750874 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6c8d64bf69-wprv5" event={"ID":"de054665-327f-4e4f-b23c-d2f2ebc3bd04","Type":"ContainerStarted","Data":"d32f94923cc95a2e0f28f7ecd26583aaa7c93c89987d48381807c1c01d46f019"} Oct 09 14:06:33 crc kubenswrapper[4902]: I1009 14:06:33.751010 4902 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-6c8d64bf69-wprv5" podUID="de054665-327f-4e4f-b23c-d2f2ebc3bd04" containerName="horizon-log" containerID="cri-o://d32f94923cc95a2e0f28f7ecd26583aaa7c93c89987d48381807c1c01d46f019" gracePeriod=30 Oct 09 14:06:33 crc kubenswrapper[4902]: I1009 14:06:33.751459 4902 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-6c8d64bf69-wprv5" podUID="de054665-327f-4e4f-b23c-d2f2ebc3bd04" containerName="horizon" containerID="cri-o://85ec85e5a9c2d8fc19be2078040e1e8b57cda146fb710e97c4707c25a486e6c3" gracePeriod=30 Oct 09 14:06:33 crc kubenswrapper[4902]: I1009 14:06:33.761965 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-779d95f9fb-tfjvq" event={"ID":"40e0f94d-30a4-456b-bfd4-7da1453facc4","Type":"ContainerStarted","Data":"d3c72d608c7c8a26cd8f6a93ce945044e640a6541a801f28149bf3473c76109a"} Oct 09 14:06:33 crc kubenswrapper[4902]: I1009 14:06:33.762018 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-779d95f9fb-tfjvq" event={"ID":"40e0f94d-30a4-456b-bfd4-7da1453facc4","Type":"ContainerStarted","Data":"761f20a2a372b9d00ccd72730a020e160ef99cc22477f5f7c36ba08c3211c523"} Oct 09 14:06:33 crc kubenswrapper[4902]: I1009 14:06:33.762062 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-779d95f9fb-tfjvq" event={"ID":"40e0f94d-30a4-456b-bfd4-7da1453facc4","Type":"ContainerStarted","Data":"b068510829f9b116b2cc56d733f17ce80aa588f25416408907ea9c59d03ddb9f"} Oct 09 14:06:33 crc kubenswrapper[4902]: I1009 14:06:33.768733 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"8fe438bd-0b47-4495-93a8-590e4019a7c6","Type":"ContainerStarted","Data":"119944c304e40ad4b30208df944e3587aea6322eb5c80031e9db64894b4ec2b9"} Oct 09 14:06:33 crc kubenswrapper[4902]: I1009 14:06:33.783845 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5b9b5fc8fb-nlcpz" event={"ID":"ba9ae197-8325-4a07-a174-31f7f2e29978","Type":"ContainerStarted","Data":"26401dfab58bf8d5306f89cdd551a4189699bc71c0c67bffa6d8975b03e98b74"} Oct 09 14:06:33 crc kubenswrapper[4902]: I1009 14:06:33.783896 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5b9b5fc8fb-nlcpz" event={"ID":"ba9ae197-8325-4a07-a174-31f7f2e29978","Type":"ContainerStarted","Data":"c56bd3765dfec2fab257f46ddb8a13f723af38fda399398302589b4e0d3a8b34"} Oct 09 14:06:33 crc kubenswrapper[4902]: I1009 14:06:33.783910 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5b9b5fc8fb-nlcpz" event={"ID":"ba9ae197-8325-4a07-a174-31f7f2e29978","Type":"ContainerStarted","Data":"9aaba01f0eb82ab28a90d2532ad6a660ee07e81b9aea35e9408c28c258ee2103"} Oct 09 14:06:33 crc kubenswrapper[4902]: I1009 14:06:33.786284 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-6c8d64bf69-wprv5" podStartSLOduration=3.676768232 podStartE2EDuration="16.786266751s" podCreationTimestamp="2025-10-09 14:06:17 +0000 UTC" firstStartedPulling="2025-10-09 14:06:19.219856945 +0000 UTC m=+926.417716009" lastFinishedPulling="2025-10-09 14:06:32.329355454 +0000 UTC m=+939.527214528" observedRunningTime="2025-10-09 14:06:33.777736933 +0000 UTC m=+940.975596007" watchObservedRunningTime="2025-10-09 14:06:33.786266751 +0000 UTC m=+940.984125835" Oct 09 14:06:33 crc kubenswrapper[4902]: I1009 14:06:33.790313 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-659b68cf89-l9cwg" event={"ID":"2bfb704d-cf57-4181-b6d1-c5884492984a","Type":"ContainerStarted","Data":"112d547199273642f0137b5c314d1495df860c7a46c5089daf5caaba0a916bdd"} Oct 09 14:06:33 crc kubenswrapper[4902]: I1009 14:06:33.790489 4902 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-659b68cf89-l9cwg" podUID="2bfb704d-cf57-4181-b6d1-c5884492984a" containerName="horizon" containerID="cri-o://112d547199273642f0137b5c314d1495df860c7a46c5089daf5caaba0a916bdd" gracePeriod=30 Oct 09 14:06:33 crc kubenswrapper[4902]: I1009 14:06:33.807393 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-74gvb" event={"ID":"f9fd56c3-4837-4fad-b343-b2edc61b0605","Type":"ContainerStarted","Data":"2d4ad1f7db7bac1656d7fa585602c537d6b102be11fcf14fc440f3379faa01c9"} Oct 09 14:06:33 crc kubenswrapper[4902]: I1009 14:06:33.808334 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-74gvb" event={"ID":"f9fd56c3-4837-4fad-b343-b2edc61b0605","Type":"ContainerStarted","Data":"67dfce0b0170926c59e820577a840bea6a046f8a5832d9173698585b7d4fe6c4"} Oct 09 14:06:33 crc kubenswrapper[4902]: I1009 14:06:33.813484 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-779d95f9fb-tfjvq" podStartSLOduration=5.813464206 podStartE2EDuration="5.813464206s" podCreationTimestamp="2025-10-09 14:06:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 14:06:33.805174685 +0000 UTC m=+941.003033779" watchObservedRunningTime="2025-10-09 14:06:33.813464206 +0000 UTC m=+941.011323270" 
Oct 09 14:06:33 crc kubenswrapper[4902]: I1009 14:06:33.836985 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6f8f55c9df-4zdc6" event={"ID":"cf5b0e71-d4ef-4d63-b6ee-15ebdcb82812","Type":"ContainerStarted","Data":"4d770277fa0754de0f4f9239e672a17537f94165530ef07bbc2e40a706ddf375"} Oct 09 14:06:33 crc kubenswrapper[4902]: I1009 14:06:33.837044 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6f8f55c9df-4zdc6" event={"ID":"cf5b0e71-d4ef-4d63-b6ee-15ebdcb82812","Type":"ContainerStarted","Data":"706c5e6c2db4698ef619ee48cbd03331704890ccac523b93eae571f762390181"} Oct 09 14:06:33 crc kubenswrapper[4902]: I1009 14:06:33.837201 4902 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-6f8f55c9df-4zdc6" podUID="cf5b0e71-d4ef-4d63-b6ee-15ebdcb82812" containerName="horizon-log" containerID="cri-o://706c5e6c2db4698ef619ee48cbd03331704890ccac523b93eae571f762390181" gracePeriod=30 Oct 09 14:06:33 crc kubenswrapper[4902]: I1009 14:06:33.837371 4902 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-6f8f55c9df-4zdc6" podUID="cf5b0e71-d4ef-4d63-b6ee-15ebdcb82812" containerName="horizon" containerID="cri-o://4d770277fa0754de0f4f9239e672a17537f94165530ef07bbc2e40a706ddf375" gracePeriod=30 Oct 09 14:06:33 crc kubenswrapper[4902]: I1009 14:06:33.907158 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 09 14:06:33 crc kubenswrapper[4902]: I1009 14:06:33.907957 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-5b9b5fc8fb-nlcpz" podStartSLOduration=5.907933331 podStartE2EDuration="5.907933331s" podCreationTimestamp="2025-10-09 14:06:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 14:06:33.874205169 +0000 UTC m=+941.072064243" watchObservedRunningTime="2025-10-09 14:06:33.907933331 +0000 UTC m=+941.105792415" Oct 09 14:06:33 crc kubenswrapper[4902]: I1009 14:06:33.908968 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-74gvb" podStartSLOduration=2.908962193 podStartE2EDuration="2.908962193s" podCreationTimestamp="2025-10-09 14:06:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 14:06:33.898103673 +0000 UTC m=+941.095962737" watchObservedRunningTime="2025-10-09 14:06:33.908962193 +0000 UTC m=+941.106821277" Oct 09 14:06:34 crc kubenswrapper[4902]: I1009 14:06:34.541449 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-6f8f55c9df-4zdc6" podStartSLOduration=4.133948431 podStartE2EDuration="20.541424645s" podCreationTimestamp="2025-10-09 14:06:14 +0000 UTC" firstStartedPulling="2025-10-09 14:06:15.930733259 +0000 UTC m=+923.128592323" lastFinishedPulling="2025-10-09 14:06:32.338209473 +0000 UTC m=+939.536068537" observedRunningTime="2025-10-09 14:06:33.918319496 +0000 UTC m=+941.116178590" watchObservedRunningTime="2025-10-09 14:06:34.541424645 +0000 UTC m=+941.739283729" Oct 09 14:06:34 crc kubenswrapper[4902]: I1009 14:06:34.546232 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 09 14:06:34 crc kubenswrapper[4902]: I1009 14:06:34.848553 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openstack/horizon-6f8f55c9df-4zdc6" Oct 09 14:06:34 crc kubenswrapper[4902]: I1009 14:06:34.855604 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"56e3bd81-2ed5-4cdc-a302-90e5c3e6bb37","Type":"ContainerStarted","Data":"09bbb399871ad74fe356399ef81c235f3d3e879f2efbc76a83d8ffb879158359"} Oct 09 14:06:34 crc kubenswrapper[4902]: I1009 14:06:34.855734 4902 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="56e3bd81-2ed5-4cdc-a302-90e5c3e6bb37" containerName="glance-log" containerID="cri-o://7648107539a17a6742111598981f92afad08f59daebf801ec90d14ee396c2bc4" gracePeriod=30 Oct 09 14:06:34 crc kubenswrapper[4902]: I1009 14:06:34.856199 4902 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="56e3bd81-2ed5-4cdc-a302-90e5c3e6bb37" containerName="glance-httpd" containerID="cri-o://09bbb399871ad74fe356399ef81c235f3d3e879f2efbc76a83d8ffb879158359" gracePeriod=30 Oct 09 14:06:34 crc kubenswrapper[4902]: I1009 14:06:34.859802 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"4731e204-78a9-4d3c-8763-ace5d7d97cf7","Type":"ContainerStarted","Data":"dd7442cde78646287e4f8b30a044e3446fd930c76c7f39ef9de1524abda013e8"} Oct 09 14:06:34 crc kubenswrapper[4902]: I1009 14:06:34.879022 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=14.878996853 podStartE2EDuration="14.878996853s" podCreationTimestamp="2025-10-09 14:06:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 14:06:34.87395311 +0000 UTC m=+942.071812194" watchObservedRunningTime="2025-10-09 14:06:34.878996853 +0000 UTC m=+942.076855937" Oct 09 14:06:35 crc kubenswrapper[4902]: I1009 14:06:35.745653 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-659b68cf89-l9cwg" Oct 09 14:06:35 crc kubenswrapper[4902]: I1009 14:06:35.874734 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"4731e204-78a9-4d3c-8763-ace5d7d97cf7","Type":"ContainerStarted","Data":"b130d1ec682a3095108f2f648767fe70d179728d61a4c2ba1e20093d4ba09cdf"} Oct 09 14:06:35 crc kubenswrapper[4902]: I1009 14:06:35.878008 4902 generic.go:334] "Generic (PLEG): container finished" podID="56e3bd81-2ed5-4cdc-a302-90e5c3e6bb37" containerID="09bbb399871ad74fe356399ef81c235f3d3e879f2efbc76a83d8ffb879158359" exitCode=143 Oct 09 14:06:35 crc kubenswrapper[4902]: I1009 14:06:35.878040 4902 generic.go:334] "Generic (PLEG): container finished" podID="56e3bd81-2ed5-4cdc-a302-90e5c3e6bb37" containerID="7648107539a17a6742111598981f92afad08f59daebf801ec90d14ee396c2bc4" exitCode=143 Oct 09 14:06:35 crc kubenswrapper[4902]: I1009 14:06:35.878062 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"56e3bd81-2ed5-4cdc-a302-90e5c3e6bb37","Type":"ContainerDied","Data":"09bbb399871ad74fe356399ef81c235f3d3e879f2efbc76a83d8ffb879158359"} Oct 09 14:06:35 crc kubenswrapper[4902]: I1009 14:06:35.878089 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"56e3bd81-2ed5-4cdc-a302-90e5c3e6bb37","Type":"ContainerDied","Data":"7648107539a17a6742111598981f92afad08f59daebf801ec90d14ee396c2bc4"} Oct 09 14:06:35 crc kubenswrapper[4902]: I1009 14:06:35.878099 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"56e3bd81-2ed5-4cdc-a302-90e5c3e6bb37","Type":"ContainerDied","Data":"c484c0b6ad59a404ab18f5630eef3d7e80e2a387470536ba00e3f93e3573f913"} Oct 09 14:06:35 crc kubenswrapper[4902]: I1009 14:06:35.878109 4902 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c484c0b6ad59a404ab18f5630eef3d7e80e2a387470536ba00e3f93e3573f913" Oct 09 14:06:35 crc kubenswrapper[4902]: I1009 14:06:35.991056 4902 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 09 14:06:36 crc kubenswrapper[4902]: I1009 14:06:36.104976 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/56e3bd81-2ed5-4cdc-a302-90e5c3e6bb37-httpd-run\") pod \"56e3bd81-2ed5-4cdc-a302-90e5c3e6bb37\" (UID: \"56e3bd81-2ed5-4cdc-a302-90e5c3e6bb37\") " Oct 09 14:06:36 crc kubenswrapper[4902]: I1009 14:06:36.105385 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/56e3bd81-2ed5-4cdc-a302-90e5c3e6bb37-scripts\") pod \"56e3bd81-2ed5-4cdc-a302-90e5c3e6bb37\" (UID: \"56e3bd81-2ed5-4cdc-a302-90e5c3e6bb37\") " Oct 09 14:06:36 crc kubenswrapper[4902]: I1009 14:06:36.105456 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56e3bd81-2ed5-4cdc-a302-90e5c3e6bb37-combined-ca-bundle\") pod \"56e3bd81-2ed5-4cdc-a302-90e5c3e6bb37\" (UID: \"56e3bd81-2ed5-4cdc-a302-90e5c3e6bb37\") " Oct 09 14:06:36 crc kubenswrapper[4902]: I1009 14:06:36.105494 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lq9jf\" (UniqueName: \"kubernetes.io/projected/56e3bd81-2ed5-4cdc-a302-90e5c3e6bb37-kube-api-access-lq9jf\") pod \"56e3bd81-2ed5-4cdc-a302-90e5c3e6bb37\" (UID: \"56e3bd81-2ed5-4cdc-a302-90e5c3e6bb37\") " Oct 09 14:06:36 crc kubenswrapper[4902]: I1009 14:06:36.105550 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"56e3bd81-2ed5-4cdc-a302-90e5c3e6bb37\" (UID: \"56e3bd81-2ed5-4cdc-a302-90e5c3e6bb37\") " Oct 09 14:06:36 crc kubenswrapper[4902]: I1009 14:06:36.105592 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56e3bd81-2ed5-4cdc-a302-90e5c3e6bb37-config-data\") pod \"56e3bd81-2ed5-4cdc-a302-90e5c3e6bb37\" (UID: \"56e3bd81-2ed5-4cdc-a302-90e5c3e6bb37\") " Oct 09 14:06:36 crc kubenswrapper[4902]: I1009 14:06:36.105620 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/56e3bd81-2ed5-4cdc-a302-90e5c3e6bb37-logs\") pod \"56e3bd81-2ed5-4cdc-a302-90e5c3e6bb37\" (UID: \"56e3bd81-2ed5-4cdc-a302-90e5c3e6bb37\") " Oct 09 14:06:36 crc kubenswrapper[4902]: I1009 14:06:36.106224 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/56e3bd81-2ed5-4cdc-a302-90e5c3e6bb37-logs" (OuterVolumeSpecName: "logs") pod 
"56e3bd81-2ed5-4cdc-a302-90e5c3e6bb37" (UID: "56e3bd81-2ed5-4cdc-a302-90e5c3e6bb37"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 14:06:36 crc kubenswrapper[4902]: I1009 14:06:36.106284 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/56e3bd81-2ed5-4cdc-a302-90e5c3e6bb37-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "56e3bd81-2ed5-4cdc-a302-90e5c3e6bb37" (UID: "56e3bd81-2ed5-4cdc-a302-90e5c3e6bb37"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 14:06:36 crc kubenswrapper[4902]: I1009 14:06:36.113251 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "glance") pod "56e3bd81-2ed5-4cdc-a302-90e5c3e6bb37" (UID: "56e3bd81-2ed5-4cdc-a302-90e5c3e6bb37"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 09 14:06:36 crc kubenswrapper[4902]: I1009 14:06:36.113311 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56e3bd81-2ed5-4cdc-a302-90e5c3e6bb37-scripts" (OuterVolumeSpecName: "scripts") pod "56e3bd81-2ed5-4cdc-a302-90e5c3e6bb37" (UID: "56e3bd81-2ed5-4cdc-a302-90e5c3e6bb37"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:06:36 crc kubenswrapper[4902]: I1009 14:06:36.114233 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56e3bd81-2ed5-4cdc-a302-90e5c3e6bb37-kube-api-access-lq9jf" (OuterVolumeSpecName: "kube-api-access-lq9jf") pod "56e3bd81-2ed5-4cdc-a302-90e5c3e6bb37" (UID: "56e3bd81-2ed5-4cdc-a302-90e5c3e6bb37"). InnerVolumeSpecName "kube-api-access-lq9jf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 14:06:36 crc kubenswrapper[4902]: I1009 14:06:36.167619 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56e3bd81-2ed5-4cdc-a302-90e5c3e6bb37-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "56e3bd81-2ed5-4cdc-a302-90e5c3e6bb37" (UID: "56e3bd81-2ed5-4cdc-a302-90e5c3e6bb37"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:06:36 crc kubenswrapper[4902]: I1009 14:06:36.169130 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56e3bd81-2ed5-4cdc-a302-90e5c3e6bb37-config-data" (OuterVolumeSpecName: "config-data") pod "56e3bd81-2ed5-4cdc-a302-90e5c3e6bb37" (UID: "56e3bd81-2ed5-4cdc-a302-90e5c3e6bb37"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:06:36 crc kubenswrapper[4902]: I1009 14:06:36.207880 4902 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/56e3bd81-2ed5-4cdc-a302-90e5c3e6bb37-logs\") on node \"crc\" DevicePath \"\"" Oct 09 14:06:36 crc kubenswrapper[4902]: I1009 14:06:36.207920 4902 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/56e3bd81-2ed5-4cdc-a302-90e5c3e6bb37-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 09 14:06:36 crc kubenswrapper[4902]: I1009 14:06:36.207931 4902 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/56e3bd81-2ed5-4cdc-a302-90e5c3e6bb37-scripts\") on node \"crc\" DevicePath \"\"" Oct 09 14:06:36 crc kubenswrapper[4902]: I1009 14:06:36.207943 4902 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56e3bd81-2ed5-4cdc-a302-90e5c3e6bb37-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 14:06:36 crc kubenswrapper[4902]: I1009 14:06:36.207966 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lq9jf\" (UniqueName: \"kubernetes.io/projected/56e3bd81-2ed5-4cdc-a302-90e5c3e6bb37-kube-api-access-lq9jf\") on node \"crc\" DevicePath \"\"" Oct 09 14:06:36 crc kubenswrapper[4902]: I1009 14:06:36.208008 4902 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Oct 09 14:06:36 crc kubenswrapper[4902]: I1009 14:06:36.208023 4902 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56e3bd81-2ed5-4cdc-a302-90e5c3e6bb37-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 14:06:36 crc kubenswrapper[4902]: I1009 14:06:36.234972 4902 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Oct 09 14:06:36 crc kubenswrapper[4902]: I1009 14:06:36.310178 4902 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Oct 09 14:06:36 crc kubenswrapper[4902]: I1009 14:06:36.891921 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8fe438bd-0b47-4495-93a8-590e4019a7c6","Type":"ContainerStarted","Data":"38966405f5eca6e013afe0970100993a4d7351ac70d474393e4e4495484ced29"} Oct 09 14:06:36 crc kubenswrapper[4902]: I1009 14:06:36.894891 4902 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 09 14:06:36 crc kubenswrapper[4902]: I1009 14:06:36.895024 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"4731e204-78a9-4d3c-8763-ace5d7d97cf7","Type":"ContainerStarted","Data":"dcc8d525cfd7ea074939c35aeeffd6b00b2a53c9c63c970bd438d28a81ac2e40"} Oct 09 14:06:36 crc kubenswrapper[4902]: I1009 14:06:36.928770 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.9287552999999997 podStartE2EDuration="3.9287553s" podCreationTimestamp="2025-10-09 14:06:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 14:06:36.925881513 +0000 UTC m=+944.123740587" watchObservedRunningTime="2025-10-09 14:06:36.9287553 +0000 UTC m=+944.126614364" Oct 09 14:06:36 crc kubenswrapper[4902]: I1009 14:06:36.961579 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 09 14:06:36 crc kubenswrapper[4902]: I1009 14:06:36.969681 4902 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 09 14:06:37 crc kubenswrapper[4902]: I1009 14:06:37.002494 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 09 14:06:37 crc kubenswrapper[4902]: E1009 14:06:37.002992 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56e3bd81-2ed5-4cdc-a302-90e5c3e6bb37" containerName="glance-httpd" Oct 09 14:06:37 crc kubenswrapper[4902]: I1009 14:06:37.003016 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="56e3bd81-2ed5-4cdc-a302-90e5c3e6bb37" containerName="glance-httpd" Oct 09 14:06:37 crc kubenswrapper[4902]: E1009 14:06:37.003036 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56e3bd81-2ed5-4cdc-a302-90e5c3e6bb37" containerName="glance-log" Oct 09 14:06:37 crc kubenswrapper[4902]: I1009 14:06:37.003044 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="56e3bd81-2ed5-4cdc-a302-90e5c3e6bb37" containerName="glance-log" Oct 09 14:06:37 crc kubenswrapper[4902]: I1009 14:06:37.003287 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="56e3bd81-2ed5-4cdc-a302-90e5c3e6bb37" containerName="glance-httpd" Oct 09 14:06:37 crc kubenswrapper[4902]: I1009 14:06:37.003312 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="56e3bd81-2ed5-4cdc-a302-90e5c3e6bb37" containerName="glance-log" Oct 09 14:06:37 crc kubenswrapper[4902]: I1009 14:06:37.006362 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 09 14:06:37 crc kubenswrapper[4902]: I1009 14:06:37.008507 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Oct 09 14:06:37 crc kubenswrapper[4902]: I1009 14:06:37.023587 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 09 14:06:37 crc kubenswrapper[4902]: I1009 14:06:37.026367 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 09 14:06:37 crc kubenswrapper[4902]: I1009 14:06:37.126017 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"6c5490ed-b380-45b8-b528-e9cab5c79e62\") " pod="openstack/glance-default-external-api-0" Oct 09 14:06:37 crc kubenswrapper[4902]: I1009 14:06:37.126073 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6c5490ed-b380-45b8-b528-e9cab5c79e62-scripts\") pod \"glance-default-external-api-0\" (UID: \"6c5490ed-b380-45b8-b528-e9cab5c79e62\") " pod="openstack/glance-default-external-api-0" Oct 09 14:06:37 crc kubenswrapper[4902]: I1009 14:06:37.126110 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cnhj2\" (UniqueName: \"kubernetes.io/projected/6c5490ed-b380-45b8-b528-e9cab5c79e62-kube-api-access-cnhj2\") pod \"glance-default-external-api-0\" (UID: \"6c5490ed-b380-45b8-b528-e9cab5c79e62\") " pod="openstack/glance-default-external-api-0" Oct 09 14:06:37 crc kubenswrapper[4902]: I1009 14:06:37.126144 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c5490ed-b380-45b8-b528-e9cab5c79e62-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"6c5490ed-b380-45b8-b528-e9cab5c79e62\") " pod="openstack/glance-default-external-api-0" Oct 09 14:06:37 crc kubenswrapper[4902]: I1009 14:06:37.126228 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c5490ed-b380-45b8-b528-e9cab5c79e62-config-data\") pod \"glance-default-external-api-0\" (UID: \"6c5490ed-b380-45b8-b528-e9cab5c79e62\") " pod="openstack/glance-default-external-api-0" Oct 09 14:06:37 crc kubenswrapper[4902]: I1009 14:06:37.126257 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6c5490ed-b380-45b8-b528-e9cab5c79e62-logs\") pod \"glance-default-external-api-0\" (UID: \"6c5490ed-b380-45b8-b528-e9cab5c79e62\") " pod="openstack/glance-default-external-api-0" Oct 09 14:06:37 crc kubenswrapper[4902]: I1009 14:06:37.126445 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6c5490ed-b380-45b8-b528-e9cab5c79e62-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"6c5490ed-b380-45b8-b528-e9cab5c79e62\") " pod="openstack/glance-default-external-api-0" Oct 09 14:06:37 crc kubenswrapper[4902]: I1009 14:06:37.126538 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6c5490ed-b380-45b8-b528-e9cab5c79e62-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"6c5490ed-b380-45b8-b528-e9cab5c79e62\") " pod="openstack/glance-default-external-api-0" Oct 09 14:06:37 crc kubenswrapper[4902]: I1009 14:06:37.132353 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-4fhrn"] Oct 09 14:06:37 crc kubenswrapper[4902]: I1009 14:06:37.133787 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-4fhrn" Oct 09 14:06:37 crc kubenswrapper[4902]: I1009 14:06:37.137039 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-5rqgj" Oct 09 14:06:37 crc kubenswrapper[4902]: I1009 14:06:37.138010 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Oct 09 14:06:37 crc kubenswrapper[4902]: I1009 14:06:37.139295 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Oct 09 14:06:37 crc kubenswrapper[4902]: I1009 14:06:37.140138 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-4fhrn"] Oct 09 14:06:37 crc kubenswrapper[4902]: I1009 14:06:37.228520 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e0ea72ba-e66d-4400-a1d9-8d4cb8a94e87-db-sync-config-data\") pod \"cinder-db-sync-4fhrn\" (UID: \"e0ea72ba-e66d-4400-a1d9-8d4cb8a94e87\") " pod="openstack/cinder-db-sync-4fhrn" Oct 09 14:06:37 crc kubenswrapper[4902]: I1009 14:06:37.228587 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e0ea72ba-e66d-4400-a1d9-8d4cb8a94e87-etc-machine-id\") pod \"cinder-db-sync-4fhrn\" (UID: \"e0ea72ba-e66d-4400-a1d9-8d4cb8a94e87\") " pod="openstack/cinder-db-sync-4fhrn" Oct 09 14:06:37 crc kubenswrapper[4902]: I1009 14:06:37.228692 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6c5490ed-b380-45b8-b528-e9cab5c79e62-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"6c5490ed-b380-45b8-b528-e9cab5c79e62\") " pod="openstack/glance-default-external-api-0" Oct 09 14:06:37 crc kubenswrapper[4902]: I1009 14:06:37.228729 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0ea72ba-e66d-4400-a1d9-8d4cb8a94e87-combined-ca-bundle\") pod \"cinder-db-sync-4fhrn\" (UID: \"e0ea72ba-e66d-4400-a1d9-8d4cb8a94e87\") " pod="openstack/cinder-db-sync-4fhrn" Oct 09 14:06:37 crc kubenswrapper[4902]: I1009 14:06:37.228755 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0ea72ba-e66d-4400-a1d9-8d4cb8a94e87-config-data\") pod \"cinder-db-sync-4fhrn\" (UID: \"e0ea72ba-e66d-4400-a1d9-8d4cb8a94e87\") " pod="openstack/cinder-db-sync-4fhrn" Oct 09 14:06:37 crc kubenswrapper[4902]: I1009 14:06:37.228925 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6c5490ed-b380-45b8-b528-e9cab5c79e62-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"6c5490ed-b380-45b8-b528-e9cab5c79e62\") " 
pod="openstack/glance-default-external-api-0" Oct 09 14:06:37 crc kubenswrapper[4902]: I1009 14:06:37.228983 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"6c5490ed-b380-45b8-b528-e9cab5c79e62\") " pod="openstack/glance-default-external-api-0" Oct 09 14:06:37 crc kubenswrapper[4902]: I1009 14:06:37.229050 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6c5490ed-b380-45b8-b528-e9cab5c79e62-scripts\") pod \"glance-default-external-api-0\" (UID: \"6c5490ed-b380-45b8-b528-e9cab5c79e62\") " pod="openstack/glance-default-external-api-0" Oct 09 14:06:37 crc kubenswrapper[4902]: I1009 14:06:37.229083 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2hpn2\" (UniqueName: \"kubernetes.io/projected/e0ea72ba-e66d-4400-a1d9-8d4cb8a94e87-kube-api-access-2hpn2\") pod \"cinder-db-sync-4fhrn\" (UID: \"e0ea72ba-e66d-4400-a1d9-8d4cb8a94e87\") " pod="openstack/cinder-db-sync-4fhrn" Oct 09 14:06:37 crc kubenswrapper[4902]: I1009 14:06:37.229122 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e0ea72ba-e66d-4400-a1d9-8d4cb8a94e87-scripts\") pod \"cinder-db-sync-4fhrn\" (UID: \"e0ea72ba-e66d-4400-a1d9-8d4cb8a94e87\") " pod="openstack/cinder-db-sync-4fhrn" Oct 09 14:06:37 crc kubenswrapper[4902]: I1009 14:06:37.229148 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cnhj2\" (UniqueName: \"kubernetes.io/projected/6c5490ed-b380-45b8-b528-e9cab5c79e62-kube-api-access-cnhj2\") pod \"glance-default-external-api-0\" (UID: \"6c5490ed-b380-45b8-b528-e9cab5c79e62\") " pod="openstack/glance-default-external-api-0" Oct 09 14:06:37 crc kubenswrapper[4902]: I1009 14:06:37.229237 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c5490ed-b380-45b8-b528-e9cab5c79e62-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"6c5490ed-b380-45b8-b528-e9cab5c79e62\") " pod="openstack/glance-default-external-api-0" Oct 09 14:06:37 crc kubenswrapper[4902]: I1009 14:06:37.229327 4902 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"6c5490ed-b380-45b8-b528-e9cab5c79e62\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/glance-default-external-api-0" Oct 09 14:06:37 crc kubenswrapper[4902]: I1009 14:06:37.229541 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c5490ed-b380-45b8-b528-e9cab5c79e62-config-data\") pod \"glance-default-external-api-0\" (UID: \"6c5490ed-b380-45b8-b528-e9cab5c79e62\") " pod="openstack/glance-default-external-api-0" Oct 09 14:06:37 crc kubenswrapper[4902]: I1009 14:06:37.229598 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6c5490ed-b380-45b8-b528-e9cab5c79e62-logs\") pod \"glance-default-external-api-0\" (UID: \"6c5490ed-b380-45b8-b528-e9cab5c79e62\") " pod="openstack/glance-default-external-api-0" Oct 09 14:06:37 crc 
kubenswrapper[4902]: I1009 14:06:37.230182 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6c5490ed-b380-45b8-b528-e9cab5c79e62-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"6c5490ed-b380-45b8-b528-e9cab5c79e62\") " pod="openstack/glance-default-external-api-0" Oct 09 14:06:37 crc kubenswrapper[4902]: I1009 14:06:37.230355 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6c5490ed-b380-45b8-b528-e9cab5c79e62-logs\") pod \"glance-default-external-api-0\" (UID: \"6c5490ed-b380-45b8-b528-e9cab5c79e62\") " pod="openstack/glance-default-external-api-0" Oct 09 14:06:37 crc kubenswrapper[4902]: I1009 14:06:37.234504 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c5490ed-b380-45b8-b528-e9cab5c79e62-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"6c5490ed-b380-45b8-b528-e9cab5c79e62\") " pod="openstack/glance-default-external-api-0" Oct 09 14:06:37 crc kubenswrapper[4902]: I1009 14:06:37.237021 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c5490ed-b380-45b8-b528-e9cab5c79e62-config-data\") pod \"glance-default-external-api-0\" (UID: \"6c5490ed-b380-45b8-b528-e9cab5c79e62\") " pod="openstack/glance-default-external-api-0" Oct 09 14:06:37 crc kubenswrapper[4902]: I1009 14:06:37.237975 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6c5490ed-b380-45b8-b528-e9cab5c79e62-scripts\") pod \"glance-default-external-api-0\" (UID: \"6c5490ed-b380-45b8-b528-e9cab5c79e62\") " pod="openstack/glance-default-external-api-0" Oct 09 14:06:37 crc kubenswrapper[4902]: I1009 14:06:37.238533 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6c5490ed-b380-45b8-b528-e9cab5c79e62-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"6c5490ed-b380-45b8-b528-e9cab5c79e62\") " pod="openstack/glance-default-external-api-0" Oct 09 14:06:37 crc kubenswrapper[4902]: I1009 14:06:37.250091 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cnhj2\" (UniqueName: \"kubernetes.io/projected/6c5490ed-b380-45b8-b528-e9cab5c79e62-kube-api-access-cnhj2\") pod \"glance-default-external-api-0\" (UID: \"6c5490ed-b380-45b8-b528-e9cab5c79e62\") " pod="openstack/glance-default-external-api-0" Oct 09 14:06:37 crc kubenswrapper[4902]: I1009 14:06:37.266561 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"6c5490ed-b380-45b8-b528-e9cab5c79e62\") " pod="openstack/glance-default-external-api-0" Oct 09 14:06:37 crc kubenswrapper[4902]: I1009 14:06:37.330555 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e0ea72ba-e66d-4400-a1d9-8d4cb8a94e87-db-sync-config-data\") pod \"cinder-db-sync-4fhrn\" (UID: \"e0ea72ba-e66d-4400-a1d9-8d4cb8a94e87\") " pod="openstack/cinder-db-sync-4fhrn" Oct 09 14:06:37 crc kubenswrapper[4902]: I1009 14:06:37.330598 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/e0ea72ba-e66d-4400-a1d9-8d4cb8a94e87-etc-machine-id\") pod \"cinder-db-sync-4fhrn\" (UID: \"e0ea72ba-e66d-4400-a1d9-8d4cb8a94e87\") " pod="openstack/cinder-db-sync-4fhrn" Oct 09 14:06:37 crc kubenswrapper[4902]: I1009 14:06:37.330642 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0ea72ba-e66d-4400-a1d9-8d4cb8a94e87-combined-ca-bundle\") pod \"cinder-db-sync-4fhrn\" (UID: \"e0ea72ba-e66d-4400-a1d9-8d4cb8a94e87\") " pod="openstack/cinder-db-sync-4fhrn" Oct 09 14:06:37 crc kubenswrapper[4902]: I1009 14:06:37.330661 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0ea72ba-e66d-4400-a1d9-8d4cb8a94e87-config-data\") pod \"cinder-db-sync-4fhrn\" (UID: \"e0ea72ba-e66d-4400-a1d9-8d4cb8a94e87\") " pod="openstack/cinder-db-sync-4fhrn" Oct 09 14:06:37 crc kubenswrapper[4902]: I1009 14:06:37.330725 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2hpn2\" (UniqueName: \"kubernetes.io/projected/e0ea72ba-e66d-4400-a1d9-8d4cb8a94e87-kube-api-access-2hpn2\") pod \"cinder-db-sync-4fhrn\" (UID: \"e0ea72ba-e66d-4400-a1d9-8d4cb8a94e87\") " pod="openstack/cinder-db-sync-4fhrn" Oct 09 14:06:37 crc kubenswrapper[4902]: I1009 14:06:37.330756 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e0ea72ba-e66d-4400-a1d9-8d4cb8a94e87-scripts\") pod \"cinder-db-sync-4fhrn\" (UID: \"e0ea72ba-e66d-4400-a1d9-8d4cb8a94e87\") " pod="openstack/cinder-db-sync-4fhrn" Oct 09 14:06:37 crc kubenswrapper[4902]: I1009 14:06:37.334924 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0ea72ba-e66d-4400-a1d9-8d4cb8a94e87-config-data\") pod \"cinder-db-sync-4fhrn\" (UID: \"e0ea72ba-e66d-4400-a1d9-8d4cb8a94e87\") " pod="openstack/cinder-db-sync-4fhrn" Oct 09 14:06:37 crc kubenswrapper[4902]: I1009 14:06:37.335019 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e0ea72ba-e66d-4400-a1d9-8d4cb8a94e87-etc-machine-id\") pod \"cinder-db-sync-4fhrn\" (UID: \"e0ea72ba-e66d-4400-a1d9-8d4cb8a94e87\") " pod="openstack/cinder-db-sync-4fhrn" Oct 09 14:06:37 crc kubenswrapper[4902]: I1009 14:06:37.338764 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e0ea72ba-e66d-4400-a1d9-8d4cb8a94e87-db-sync-config-data\") pod \"cinder-db-sync-4fhrn\" (UID: \"e0ea72ba-e66d-4400-a1d9-8d4cb8a94e87\") " pod="openstack/cinder-db-sync-4fhrn" Oct 09 14:06:37 crc kubenswrapper[4902]: I1009 14:06:37.341147 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 09 14:06:37 crc kubenswrapper[4902]: I1009 14:06:37.348585 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0ea72ba-e66d-4400-a1d9-8d4cb8a94e87-combined-ca-bundle\") pod \"cinder-db-sync-4fhrn\" (UID: \"e0ea72ba-e66d-4400-a1d9-8d4cb8a94e87\") " pod="openstack/cinder-db-sync-4fhrn" Oct 09 14:06:37 crc kubenswrapper[4902]: I1009 14:06:37.348944 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e0ea72ba-e66d-4400-a1d9-8d4cb8a94e87-scripts\") pod \"cinder-db-sync-4fhrn\" (UID: \"e0ea72ba-e66d-4400-a1d9-8d4cb8a94e87\") " pod="openstack/cinder-db-sync-4fhrn" Oct 09 14:06:37 crc kubenswrapper[4902]: I1009 14:06:37.348974 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-92hj5"] Oct 09 14:06:37 crc kubenswrapper[4902]: I1009 14:06:37.350004 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-92hj5" Oct 09 14:06:37 crc kubenswrapper[4902]: I1009 14:06:37.355636 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Oct 09 14:06:37 crc kubenswrapper[4902]: I1009 14:06:37.356203 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-92hj5"] Oct 09 14:06:37 crc kubenswrapper[4902]: I1009 14:06:37.356442 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-xcghf" Oct 09 14:06:37 crc kubenswrapper[4902]: I1009 14:06:37.365430 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2hpn2\" (UniqueName: \"kubernetes.io/projected/e0ea72ba-e66d-4400-a1d9-8d4cb8a94e87-kube-api-access-2hpn2\") pod \"cinder-db-sync-4fhrn\" (UID: \"e0ea72ba-e66d-4400-a1d9-8d4cb8a94e87\") " pod="openstack/cinder-db-sync-4fhrn" Oct 09 14:06:37 crc kubenswrapper[4902]: I1009 14:06:37.437935 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4015da27-1c19-4eb1-af33-74e182b53aa3-combined-ca-bundle\") pod \"barbican-db-sync-92hj5\" (UID: \"4015da27-1c19-4eb1-af33-74e182b53aa3\") " pod="openstack/barbican-db-sync-92hj5" Oct 09 14:06:37 crc kubenswrapper[4902]: I1009 14:06:37.437992 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4015da27-1c19-4eb1-af33-74e182b53aa3-db-sync-config-data\") pod \"barbican-db-sync-92hj5\" (UID: \"4015da27-1c19-4eb1-af33-74e182b53aa3\") " pod="openstack/barbican-db-sync-92hj5" Oct 09 14:06:37 crc kubenswrapper[4902]: I1009 14:06:37.438122 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4kjsx\" (UniqueName: \"kubernetes.io/projected/4015da27-1c19-4eb1-af33-74e182b53aa3-kube-api-access-4kjsx\") pod \"barbican-db-sync-92hj5\" (UID: \"4015da27-1c19-4eb1-af33-74e182b53aa3\") " pod="openstack/barbican-db-sync-92hj5" Oct 09 14:06:37 crc kubenswrapper[4902]: I1009 14:06:37.448256 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-r8v96"] Oct 09 14:06:37 crc kubenswrapper[4902]: I1009 14:06:37.450476 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-4fhrn" Oct 09 14:06:37 crc kubenswrapper[4902]: I1009 14:06:37.501577 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-r8v96"] Oct 09 14:06:37 crc kubenswrapper[4902]: I1009 14:06:37.502217 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-r8v96" Oct 09 14:06:37 crc kubenswrapper[4902]: I1009 14:06:37.510277 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Oct 09 14:06:37 crc kubenswrapper[4902]: I1009 14:06:37.510501 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Oct 09 14:06:37 crc kubenswrapper[4902]: I1009 14:06:37.512023 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-zr62m" Oct 09 14:06:37 crc kubenswrapper[4902]: I1009 14:06:37.567087 4902 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="56e3bd81-2ed5-4cdc-a302-90e5c3e6bb37" path="/var/lib/kubelet/pods/56e3bd81-2ed5-4cdc-a302-90e5c3e6bb37/volumes" Oct 09 14:06:37 crc kubenswrapper[4902]: I1009 14:06:37.569504 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nmt6p\" (UniqueName: \"kubernetes.io/projected/0b9c3435-39c9-4af3-bbf9-70faafc22a3e-kube-api-access-nmt6p\") pod \"neutron-db-sync-r8v96\" (UID: \"0b9c3435-39c9-4af3-bbf9-70faafc22a3e\") " pod="openstack/neutron-db-sync-r8v96" Oct 09 14:06:37 crc kubenswrapper[4902]: I1009 14:06:37.569548 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4015da27-1c19-4eb1-af33-74e182b53aa3-combined-ca-bundle\") pod \"barbican-db-sync-92hj5\" (UID: \"4015da27-1c19-4eb1-af33-74e182b53aa3\") " pod="openstack/barbican-db-sync-92hj5" Oct 09 14:06:37 crc kubenswrapper[4902]: I1009 14:06:37.569579 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b9c3435-39c9-4af3-bbf9-70faafc22a3e-combined-ca-bundle\") pod \"neutron-db-sync-r8v96\" (UID: \"0b9c3435-39c9-4af3-bbf9-70faafc22a3e\") " pod="openstack/neutron-db-sync-r8v96" Oct 09 14:06:37 crc kubenswrapper[4902]: I1009 14:06:37.569605 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4015da27-1c19-4eb1-af33-74e182b53aa3-db-sync-config-data\") pod \"barbican-db-sync-92hj5\" (UID: \"4015da27-1c19-4eb1-af33-74e182b53aa3\") " pod="openstack/barbican-db-sync-92hj5" Oct 09 14:06:37 crc kubenswrapper[4902]: I1009 14:06:37.569692 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/0b9c3435-39c9-4af3-bbf9-70faafc22a3e-config\") pod \"neutron-db-sync-r8v96\" (UID: \"0b9c3435-39c9-4af3-bbf9-70faafc22a3e\") " pod="openstack/neutron-db-sync-r8v96" Oct 09 14:06:37 crc kubenswrapper[4902]: I1009 14:06:37.569722 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4kjsx\" (UniqueName: \"kubernetes.io/projected/4015da27-1c19-4eb1-af33-74e182b53aa3-kube-api-access-4kjsx\") pod \"barbican-db-sync-92hj5\" (UID: \"4015da27-1c19-4eb1-af33-74e182b53aa3\") " pod="openstack/barbican-db-sync-92hj5" Oct 09 14:06:37 crc kubenswrapper[4902]: I1009 14:06:37.598876 4902 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4015da27-1c19-4eb1-af33-74e182b53aa3-db-sync-config-data\") pod \"barbican-db-sync-92hj5\" (UID: \"4015da27-1c19-4eb1-af33-74e182b53aa3\") " pod="openstack/barbican-db-sync-92hj5" Oct 09 14:06:37 crc kubenswrapper[4902]: I1009 14:06:37.603436 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4kjsx\" (UniqueName: \"kubernetes.io/projected/4015da27-1c19-4eb1-af33-74e182b53aa3-kube-api-access-4kjsx\") pod \"barbican-db-sync-92hj5\" (UID: \"4015da27-1c19-4eb1-af33-74e182b53aa3\") " pod="openstack/barbican-db-sync-92hj5" Oct 09 14:06:37 crc kubenswrapper[4902]: I1009 14:06:37.619226 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4015da27-1c19-4eb1-af33-74e182b53aa3-combined-ca-bundle\") pod \"barbican-db-sync-92hj5\" (UID: \"4015da27-1c19-4eb1-af33-74e182b53aa3\") " pod="openstack/barbican-db-sync-92hj5" Oct 09 14:06:37 crc kubenswrapper[4902]: I1009 14:06:37.659512 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-6c8d64bf69-wprv5" Oct 09 14:06:37 crc kubenswrapper[4902]: I1009 14:06:37.672627 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/0b9c3435-39c9-4af3-bbf9-70faafc22a3e-config\") pod \"neutron-db-sync-r8v96\" (UID: \"0b9c3435-39c9-4af3-bbf9-70faafc22a3e\") " pod="openstack/neutron-db-sync-r8v96" Oct 09 14:06:37 crc kubenswrapper[4902]: I1009 14:06:37.672997 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nmt6p\" (UniqueName: \"kubernetes.io/projected/0b9c3435-39c9-4af3-bbf9-70faafc22a3e-kube-api-access-nmt6p\") pod \"neutron-db-sync-r8v96\" (UID: \"0b9c3435-39c9-4af3-bbf9-70faafc22a3e\") " pod="openstack/neutron-db-sync-r8v96" Oct 09 14:06:37 crc kubenswrapper[4902]: I1009 14:06:37.673546 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b9c3435-39c9-4af3-bbf9-70faafc22a3e-combined-ca-bundle\") pod \"neutron-db-sync-r8v96\" (UID: \"0b9c3435-39c9-4af3-bbf9-70faafc22a3e\") " pod="openstack/neutron-db-sync-r8v96" Oct 09 14:06:37 crc kubenswrapper[4902]: I1009 14:06:37.678086 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b9c3435-39c9-4af3-bbf9-70faafc22a3e-combined-ca-bundle\") pod \"neutron-db-sync-r8v96\" (UID: \"0b9c3435-39c9-4af3-bbf9-70faafc22a3e\") " pod="openstack/neutron-db-sync-r8v96" Oct 09 14:06:37 crc kubenswrapper[4902]: I1009 14:06:37.708563 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/0b9c3435-39c9-4af3-bbf9-70faafc22a3e-config\") pod \"neutron-db-sync-r8v96\" (UID: \"0b9c3435-39c9-4af3-bbf9-70faafc22a3e\") " pod="openstack/neutron-db-sync-r8v96" Oct 09 14:06:37 crc kubenswrapper[4902]: I1009 14:06:37.718123 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nmt6p\" (UniqueName: \"kubernetes.io/projected/0b9c3435-39c9-4af3-bbf9-70faafc22a3e-kube-api-access-nmt6p\") pod \"neutron-db-sync-r8v96\" (UID: \"0b9c3435-39c9-4af3-bbf9-70faafc22a3e\") " pod="openstack/neutron-db-sync-r8v96" Oct 09 14:06:37 crc kubenswrapper[4902]: I1009 14:06:37.777871 4902 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-r8v96" Oct 09 14:06:37 crc kubenswrapper[4902]: I1009 14:06:37.803816 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-92hj5" Oct 09 14:06:38 crc kubenswrapper[4902]: I1009 14:06:38.056857 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 09 14:06:38 crc kubenswrapper[4902]: I1009 14:06:38.099881 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-4fhrn"] Oct 09 14:06:38 crc kubenswrapper[4902]: I1009 14:06:38.317751 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-r8v96"] Oct 09 14:06:38 crc kubenswrapper[4902]: I1009 14:06:38.436707 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-92hj5"] Oct 09 14:06:38 crc kubenswrapper[4902]: I1009 14:06:38.480710 4902 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-5b9b5fc8fb-nlcpz" Oct 09 14:06:38 crc kubenswrapper[4902]: I1009 14:06:38.480759 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-5b9b5fc8fb-nlcpz" Oct 09 14:06:38 crc kubenswrapper[4902]: I1009 14:06:38.607580 4902 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-779d95f9fb-tfjvq" Oct 09 14:06:38 crc kubenswrapper[4902]: I1009 14:06:38.608522 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-779d95f9fb-tfjvq" Oct 09 14:06:38 crc kubenswrapper[4902]: E1009 14:06:38.818985 4902 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf9fd56c3_4837_4fad_b343_b2edc61b0605.slice/crio-2d4ad1f7db7bac1656d7fa585602c537d6b102be11fcf14fc440f3379faa01c9.scope\": RecentStats: unable to find data in memory cache]" Oct 09 14:06:38 crc kubenswrapper[4902]: I1009 14:06:38.950402 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6c5490ed-b380-45b8-b528-e9cab5c79e62","Type":"ContainerStarted","Data":"80f885796e73ca0ce8be0415482addcd608699b6b71094b9fa453f29baecca83"} Oct 09 14:06:38 crc kubenswrapper[4902]: I1009 14:06:38.952257 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-4fhrn" event={"ID":"e0ea72ba-e66d-4400-a1d9-8d4cb8a94e87","Type":"ContainerStarted","Data":"270b18a11addca0ddd7926e5b50aed580dfaa03d49aef744edd3cd7c516e33ab"} Oct 09 14:06:38 crc kubenswrapper[4902]: I1009 14:06:38.961203 4902 generic.go:334] "Generic (PLEG): container finished" podID="f9fd56c3-4837-4fad-b343-b2edc61b0605" containerID="2d4ad1f7db7bac1656d7fa585602c537d6b102be11fcf14fc440f3379faa01c9" exitCode=0 Oct 09 14:06:38 crc kubenswrapper[4902]: I1009 14:06:38.961270 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-74gvb" event={"ID":"f9fd56c3-4837-4fad-b343-b2edc61b0605","Type":"ContainerDied","Data":"2d4ad1f7db7bac1656d7fa585602c537d6b102be11fcf14fc440f3379faa01c9"} Oct 09 14:06:38 crc kubenswrapper[4902]: I1009 14:06:38.966286 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-r8v96" event={"ID":"0b9c3435-39c9-4af3-bbf9-70faafc22a3e","Type":"ContainerStarted","Data":"3699212da17a1fe693ce60b6829ec80bec60e69618797d80f1b8a3a680dab960"} Oct 09 14:06:38 crc 
kubenswrapper[4902]: I1009 14:06:38.966335 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-r8v96" event={"ID":"0b9c3435-39c9-4af3-bbf9-70faafc22a3e","Type":"ContainerStarted","Data":"d26c3ed389cd65f67b558798772b042e90479c50331d6672350bf197e1110fc8"} Oct 09 14:06:38 crc kubenswrapper[4902]: I1009 14:06:38.969060 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-92hj5" event={"ID":"4015da27-1c19-4eb1-af33-74e182b53aa3","Type":"ContainerStarted","Data":"29a14620bdf17086809d40ebdbd6e2b116828656e7ab347c4b5b6782b29aab8c"} Oct 09 14:06:39 crc kubenswrapper[4902]: I1009 14:06:39.013249 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-r8v96" podStartSLOduration=2.013230761 podStartE2EDuration="2.013230761s" podCreationTimestamp="2025-10-09 14:06:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 14:06:39.01090021 +0000 UTC m=+946.208759284" watchObservedRunningTime="2025-10-09 14:06:39.013230761 +0000 UTC m=+946.211089825" Oct 09 14:06:40 crc kubenswrapper[4902]: I1009 14:06:40.002933 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6c5490ed-b380-45b8-b528-e9cab5c79e62","Type":"ContainerStarted","Data":"f7475656ccb20336e3b423389b7188ec718c5ae07b185134adb6e5e9966b5384"} Oct 09 14:06:40 crc kubenswrapper[4902]: I1009 14:06:40.003218 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6c5490ed-b380-45b8-b528-e9cab5c79e62","Type":"ContainerStarted","Data":"171b1007fe2c2f721edd193b56bde992a201c79388afa8b60146353e7c33b6c5"} Oct 09 14:06:40 crc kubenswrapper[4902]: I1009 14:06:40.485260 4902 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-74gvb" Oct 09 14:06:40 crc kubenswrapper[4902]: I1009 14:06:40.539328 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9fd56c3-4837-4fad-b343-b2edc61b0605-combined-ca-bundle\") pod \"f9fd56c3-4837-4fad-b343-b2edc61b0605\" (UID: \"f9fd56c3-4837-4fad-b343-b2edc61b0605\") " Oct 09 14:06:40 crc kubenswrapper[4902]: I1009 14:06:40.539459 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f9fd56c3-4837-4fad-b343-b2edc61b0605-scripts\") pod \"f9fd56c3-4837-4fad-b343-b2edc61b0605\" (UID: \"f9fd56c3-4837-4fad-b343-b2edc61b0605\") " Oct 09 14:06:40 crc kubenswrapper[4902]: I1009 14:06:40.539531 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jmst5\" (UniqueName: \"kubernetes.io/projected/f9fd56c3-4837-4fad-b343-b2edc61b0605-kube-api-access-jmst5\") pod \"f9fd56c3-4837-4fad-b343-b2edc61b0605\" (UID: \"f9fd56c3-4837-4fad-b343-b2edc61b0605\") " Oct 09 14:06:40 crc kubenswrapper[4902]: I1009 14:06:40.539646 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f9fd56c3-4837-4fad-b343-b2edc61b0605-fernet-keys\") pod \"f9fd56c3-4837-4fad-b343-b2edc61b0605\" (UID: \"f9fd56c3-4837-4fad-b343-b2edc61b0605\") " Oct 09 14:06:40 crc kubenswrapper[4902]: I1009 14:06:40.539676 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f9fd56c3-4837-4fad-b343-b2edc61b0605-credential-keys\") pod \"f9fd56c3-4837-4fad-b343-b2edc61b0605\" (UID: \"f9fd56c3-4837-4fad-b343-b2edc61b0605\") " Oct 09 14:06:40 crc kubenswrapper[4902]: I1009 14:06:40.539727 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9fd56c3-4837-4fad-b343-b2edc61b0605-config-data\") pod \"f9fd56c3-4837-4fad-b343-b2edc61b0605\" (UID: \"f9fd56c3-4837-4fad-b343-b2edc61b0605\") " Oct 09 14:06:40 crc kubenswrapper[4902]: I1009 14:06:40.550107 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9fd56c3-4837-4fad-b343-b2edc61b0605-kube-api-access-jmst5" (OuterVolumeSpecName: "kube-api-access-jmst5") pod "f9fd56c3-4837-4fad-b343-b2edc61b0605" (UID: "f9fd56c3-4837-4fad-b343-b2edc61b0605"). InnerVolumeSpecName "kube-api-access-jmst5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 14:06:40 crc kubenswrapper[4902]: I1009 14:06:40.554364 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9fd56c3-4837-4fad-b343-b2edc61b0605-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "f9fd56c3-4837-4fad-b343-b2edc61b0605" (UID: "f9fd56c3-4837-4fad-b343-b2edc61b0605"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:06:40 crc kubenswrapper[4902]: I1009 14:06:40.554579 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9fd56c3-4837-4fad-b343-b2edc61b0605-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "f9fd56c3-4837-4fad-b343-b2edc61b0605" (UID: "f9fd56c3-4837-4fad-b343-b2edc61b0605"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:06:40 crc kubenswrapper[4902]: I1009 14:06:40.557088 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9fd56c3-4837-4fad-b343-b2edc61b0605-scripts" (OuterVolumeSpecName: "scripts") pod "f9fd56c3-4837-4fad-b343-b2edc61b0605" (UID: "f9fd56c3-4837-4fad-b343-b2edc61b0605"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:06:40 crc kubenswrapper[4902]: I1009 14:06:40.588599 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9fd56c3-4837-4fad-b343-b2edc61b0605-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f9fd56c3-4837-4fad-b343-b2edc61b0605" (UID: "f9fd56c3-4837-4fad-b343-b2edc61b0605"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:06:40 crc kubenswrapper[4902]: I1009 14:06:40.600275 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9fd56c3-4837-4fad-b343-b2edc61b0605-config-data" (OuterVolumeSpecName: "config-data") pod "f9fd56c3-4837-4fad-b343-b2edc61b0605" (UID: "f9fd56c3-4837-4fad-b343-b2edc61b0605"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:06:40 crc kubenswrapper[4902]: I1009 14:06:40.642485 4902 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f9fd56c3-4837-4fad-b343-b2edc61b0605-scripts\") on node \"crc\" DevicePath \"\"" Oct 09 14:06:40 crc kubenswrapper[4902]: I1009 14:06:40.642532 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jmst5\" (UniqueName: \"kubernetes.io/projected/f9fd56c3-4837-4fad-b343-b2edc61b0605-kube-api-access-jmst5\") on node \"crc\" DevicePath \"\"" Oct 09 14:06:40 crc kubenswrapper[4902]: I1009 14:06:40.642548 4902 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f9fd56c3-4837-4fad-b343-b2edc61b0605-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 09 14:06:40 crc kubenswrapper[4902]: I1009 14:06:40.642564 4902 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f9fd56c3-4837-4fad-b343-b2edc61b0605-credential-keys\") on node \"crc\" DevicePath \"\"" Oct 09 14:06:40 crc kubenswrapper[4902]: I1009 14:06:40.642576 4902 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9fd56c3-4837-4fad-b343-b2edc61b0605-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 14:06:40 crc kubenswrapper[4902]: I1009 14:06:40.642588 4902 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9fd56c3-4837-4fad-b343-b2edc61b0605-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 14:06:41 crc kubenswrapper[4902]: I1009 14:06:41.033772 4902 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-74gvb" Oct 09 14:06:41 crc kubenswrapper[4902]: I1009 14:06:41.034029 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-74gvb" event={"ID":"f9fd56c3-4837-4fad-b343-b2edc61b0605","Type":"ContainerDied","Data":"67dfce0b0170926c59e820577a840bea6a046f8a5832d9173698585b7d4fe6c4"} Oct 09 14:06:41 crc kubenswrapper[4902]: I1009 14:06:41.034072 4902 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="67dfce0b0170926c59e820577a840bea6a046f8a5832d9173698585b7d4fe6c4" Oct 09 14:06:41 crc kubenswrapper[4902]: I1009 14:06:41.059172 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=5.059150231 podStartE2EDuration="5.059150231s" podCreationTimestamp="2025-10-09 14:06:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 14:06:41.058807841 +0000 UTC m=+948.256666915" watchObservedRunningTime="2025-10-09 14:06:41.059150231 +0000 UTC m=+948.257009295" Oct 09 14:06:41 crc kubenswrapper[4902]: I1009 14:06:41.109299 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-6d55545c5f-lff8v"] Oct 09 14:06:41 crc kubenswrapper[4902]: E1009 14:06:41.109687 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9fd56c3-4837-4fad-b343-b2edc61b0605" containerName="keystone-bootstrap" Oct 09 14:06:41 crc kubenswrapper[4902]: I1009 14:06:41.109703 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9fd56c3-4837-4fad-b343-b2edc61b0605" containerName="keystone-bootstrap" Oct 09 14:06:41 crc kubenswrapper[4902]: I1009 14:06:41.109875 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9fd56c3-4837-4fad-b343-b2edc61b0605" containerName="keystone-bootstrap" Oct 09 14:06:41 crc kubenswrapper[4902]: I1009 14:06:41.110458 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-6d55545c5f-lff8v" Oct 09 14:06:41 crc kubenswrapper[4902]: I1009 14:06:41.118942 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 09 14:06:41 crc kubenswrapper[4902]: I1009 14:06:41.119141 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 09 14:06:41 crc kubenswrapper[4902]: I1009 14:06:41.119151 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Oct 09 14:06:41 crc kubenswrapper[4902]: I1009 14:06:41.119327 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Oct 09 14:06:41 crc kubenswrapper[4902]: I1009 14:06:41.119475 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 09 14:06:41 crc kubenswrapper[4902]: I1009 14:06:41.119549 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-ws99n" Oct 09 14:06:41 crc kubenswrapper[4902]: I1009 14:06:41.134393 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-6d55545c5f-lff8v"] Oct 09 14:06:41 crc kubenswrapper[4902]: I1009 14:06:41.150130 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b32b124f-e090-4e6c-b2b7-138e1059b680-fernet-keys\") pod \"keystone-6d55545c5f-lff8v\" (UID: \"b32b124f-e090-4e6c-b2b7-138e1059b680\") " pod="openstack/keystone-6d55545c5f-lff8v" Oct 09 14:06:41 crc kubenswrapper[4902]: I1009 14:06:41.150180 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b32b124f-e090-4e6c-b2b7-138e1059b680-scripts\") pod \"keystone-6d55545c5f-lff8v\" (UID: \"b32b124f-e090-4e6c-b2b7-138e1059b680\") " pod="openstack/keystone-6d55545c5f-lff8v" Oct 09 14:06:41 crc kubenswrapper[4902]: I1009 14:06:41.150223 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b32b124f-e090-4e6c-b2b7-138e1059b680-config-data\") pod \"keystone-6d55545c5f-lff8v\" (UID: \"b32b124f-e090-4e6c-b2b7-138e1059b680\") " pod="openstack/keystone-6d55545c5f-lff8v" Oct 09 14:06:41 crc kubenswrapper[4902]: I1009 14:06:41.150249 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b32b124f-e090-4e6c-b2b7-138e1059b680-internal-tls-certs\") pod \"keystone-6d55545c5f-lff8v\" (UID: \"b32b124f-e090-4e6c-b2b7-138e1059b680\") " pod="openstack/keystone-6d55545c5f-lff8v" Oct 09 14:06:41 crc kubenswrapper[4902]: I1009 14:06:41.150316 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cjrdq\" (UniqueName: \"kubernetes.io/projected/b32b124f-e090-4e6c-b2b7-138e1059b680-kube-api-access-cjrdq\") pod \"keystone-6d55545c5f-lff8v\" (UID: \"b32b124f-e090-4e6c-b2b7-138e1059b680\") " pod="openstack/keystone-6d55545c5f-lff8v" Oct 09 14:06:41 crc kubenswrapper[4902]: I1009 14:06:41.150357 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b32b124f-e090-4e6c-b2b7-138e1059b680-combined-ca-bundle\") pod \"keystone-6d55545c5f-lff8v\" (UID: 
\"b32b124f-e090-4e6c-b2b7-138e1059b680\") " pod="openstack/keystone-6d55545c5f-lff8v" Oct 09 14:06:41 crc kubenswrapper[4902]: I1009 14:06:41.150431 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b32b124f-e090-4e6c-b2b7-138e1059b680-public-tls-certs\") pod \"keystone-6d55545c5f-lff8v\" (UID: \"b32b124f-e090-4e6c-b2b7-138e1059b680\") " pod="openstack/keystone-6d55545c5f-lff8v" Oct 09 14:06:41 crc kubenswrapper[4902]: I1009 14:06:41.150506 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b32b124f-e090-4e6c-b2b7-138e1059b680-credential-keys\") pod \"keystone-6d55545c5f-lff8v\" (UID: \"b32b124f-e090-4e6c-b2b7-138e1059b680\") " pod="openstack/keystone-6d55545c5f-lff8v" Oct 09 14:06:41 crc kubenswrapper[4902]: I1009 14:06:41.251841 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b32b124f-e090-4e6c-b2b7-138e1059b680-fernet-keys\") pod \"keystone-6d55545c5f-lff8v\" (UID: \"b32b124f-e090-4e6c-b2b7-138e1059b680\") " pod="openstack/keystone-6d55545c5f-lff8v" Oct 09 14:06:41 crc kubenswrapper[4902]: I1009 14:06:41.251904 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b32b124f-e090-4e6c-b2b7-138e1059b680-scripts\") pod \"keystone-6d55545c5f-lff8v\" (UID: \"b32b124f-e090-4e6c-b2b7-138e1059b680\") " pod="openstack/keystone-6d55545c5f-lff8v" Oct 09 14:06:41 crc kubenswrapper[4902]: I1009 14:06:41.251939 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b32b124f-e090-4e6c-b2b7-138e1059b680-config-data\") pod \"keystone-6d55545c5f-lff8v\" (UID: \"b32b124f-e090-4e6c-b2b7-138e1059b680\") " pod="openstack/keystone-6d55545c5f-lff8v" Oct 09 14:06:41 crc kubenswrapper[4902]: I1009 14:06:41.251967 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b32b124f-e090-4e6c-b2b7-138e1059b680-internal-tls-certs\") pod \"keystone-6d55545c5f-lff8v\" (UID: \"b32b124f-e090-4e6c-b2b7-138e1059b680\") " pod="openstack/keystone-6d55545c5f-lff8v" Oct 09 14:06:41 crc kubenswrapper[4902]: I1009 14:06:41.252034 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cjrdq\" (UniqueName: \"kubernetes.io/projected/b32b124f-e090-4e6c-b2b7-138e1059b680-kube-api-access-cjrdq\") pod \"keystone-6d55545c5f-lff8v\" (UID: \"b32b124f-e090-4e6c-b2b7-138e1059b680\") " pod="openstack/keystone-6d55545c5f-lff8v" Oct 09 14:06:41 crc kubenswrapper[4902]: I1009 14:06:41.252078 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b32b124f-e090-4e6c-b2b7-138e1059b680-combined-ca-bundle\") pod \"keystone-6d55545c5f-lff8v\" (UID: \"b32b124f-e090-4e6c-b2b7-138e1059b680\") " pod="openstack/keystone-6d55545c5f-lff8v" Oct 09 14:06:41 crc kubenswrapper[4902]: I1009 14:06:41.252142 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b32b124f-e090-4e6c-b2b7-138e1059b680-public-tls-certs\") pod \"keystone-6d55545c5f-lff8v\" (UID: \"b32b124f-e090-4e6c-b2b7-138e1059b680\") " 
pod="openstack/keystone-6d55545c5f-lff8v" Oct 09 14:06:41 crc kubenswrapper[4902]: I1009 14:06:41.252169 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b32b124f-e090-4e6c-b2b7-138e1059b680-credential-keys\") pod \"keystone-6d55545c5f-lff8v\" (UID: \"b32b124f-e090-4e6c-b2b7-138e1059b680\") " pod="openstack/keystone-6d55545c5f-lff8v" Oct 09 14:06:41 crc kubenswrapper[4902]: I1009 14:06:41.258608 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b32b124f-e090-4e6c-b2b7-138e1059b680-internal-tls-certs\") pod \"keystone-6d55545c5f-lff8v\" (UID: \"b32b124f-e090-4e6c-b2b7-138e1059b680\") " pod="openstack/keystone-6d55545c5f-lff8v" Oct 09 14:06:41 crc kubenswrapper[4902]: I1009 14:06:41.258829 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b32b124f-e090-4e6c-b2b7-138e1059b680-config-data\") pod \"keystone-6d55545c5f-lff8v\" (UID: \"b32b124f-e090-4e6c-b2b7-138e1059b680\") " pod="openstack/keystone-6d55545c5f-lff8v" Oct 09 14:06:41 crc kubenswrapper[4902]: I1009 14:06:41.263440 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b32b124f-e090-4e6c-b2b7-138e1059b680-scripts\") pod \"keystone-6d55545c5f-lff8v\" (UID: \"b32b124f-e090-4e6c-b2b7-138e1059b680\") " pod="openstack/keystone-6d55545c5f-lff8v" Oct 09 14:06:41 crc kubenswrapper[4902]: I1009 14:06:41.264860 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b32b124f-e090-4e6c-b2b7-138e1059b680-credential-keys\") pod \"keystone-6d55545c5f-lff8v\" (UID: \"b32b124f-e090-4e6c-b2b7-138e1059b680\") " pod="openstack/keystone-6d55545c5f-lff8v" Oct 09 14:06:41 crc kubenswrapper[4902]: I1009 14:06:41.265082 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b32b124f-e090-4e6c-b2b7-138e1059b680-public-tls-certs\") pod \"keystone-6d55545c5f-lff8v\" (UID: \"b32b124f-e090-4e6c-b2b7-138e1059b680\") " pod="openstack/keystone-6d55545c5f-lff8v" Oct 09 14:06:41 crc kubenswrapper[4902]: I1009 14:06:41.266744 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b32b124f-e090-4e6c-b2b7-138e1059b680-combined-ca-bundle\") pod \"keystone-6d55545c5f-lff8v\" (UID: \"b32b124f-e090-4e6c-b2b7-138e1059b680\") " pod="openstack/keystone-6d55545c5f-lff8v" Oct 09 14:06:41 crc kubenswrapper[4902]: I1009 14:06:41.280104 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cjrdq\" (UniqueName: \"kubernetes.io/projected/b32b124f-e090-4e6c-b2b7-138e1059b680-kube-api-access-cjrdq\") pod \"keystone-6d55545c5f-lff8v\" (UID: \"b32b124f-e090-4e6c-b2b7-138e1059b680\") " pod="openstack/keystone-6d55545c5f-lff8v" Oct 09 14:06:41 crc kubenswrapper[4902]: I1009 14:06:41.280473 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b32b124f-e090-4e6c-b2b7-138e1059b680-fernet-keys\") pod \"keystone-6d55545c5f-lff8v\" (UID: \"b32b124f-e090-4e6c-b2b7-138e1059b680\") " pod="openstack/keystone-6d55545c5f-lff8v" Oct 09 14:06:41 crc kubenswrapper[4902]: I1009 14:06:41.465108 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-6d55545c5f-lff8v" Oct 09 14:06:41 crc kubenswrapper[4902]: I1009 14:06:41.881094 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-6d55545c5f-lff8v"] Oct 09 14:06:43 crc kubenswrapper[4902]: I1009 14:06:43.909120 4902 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 09 14:06:43 crc kubenswrapper[4902]: I1009 14:06:43.909445 4902 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 09 14:06:43 crc kubenswrapper[4902]: I1009 14:06:43.946138 4902 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 09 14:06:43 crc kubenswrapper[4902]: I1009 14:06:43.963753 4902 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 09 14:06:44 crc kubenswrapper[4902]: I1009 14:06:44.069281 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 09 14:06:44 crc kubenswrapper[4902]: I1009 14:06:44.069316 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 09 14:06:46 crc kubenswrapper[4902]: I1009 14:06:46.102834 4902 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 09 14:06:46 crc kubenswrapper[4902]: I1009 14:06:46.103157 4902 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 09 14:06:46 crc kubenswrapper[4902]: I1009 14:06:46.529162 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 09 14:06:46 crc kubenswrapper[4902]: I1009 14:06:46.684342 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 09 14:06:47 crc kubenswrapper[4902]: I1009 14:06:47.341941 4902 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 09 14:06:47 crc kubenswrapper[4902]: I1009 14:06:47.341993 4902 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 09 14:06:47 crc kubenswrapper[4902]: I1009 14:06:47.380093 4902 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 09 14:06:47 crc kubenswrapper[4902]: I1009 14:06:47.385881 4902 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 09 14:06:48 crc kubenswrapper[4902]: I1009 14:06:48.119402 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 09 14:06:48 crc kubenswrapper[4902]: I1009 14:06:48.119745 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 09 14:06:48 crc kubenswrapper[4902]: I1009 14:06:48.482114 4902 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-5b9b5fc8fb-nlcpz" podUID="ba9ae197-8325-4a07-a174-31f7f2e29978" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.146:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.146:8443: connect: connection refused" Oct 09 14:06:48 crc kubenswrapper[4902]: I1009 14:06:48.608208 4902 prober.go:107] "Probe failed" 
probeType="Startup" pod="openstack/horizon-779d95f9fb-tfjvq" podUID="40e0f94d-30a4-456b-bfd4-7da1453facc4" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.147:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.147:8443: connect: connection refused" Oct 09 14:06:48 crc kubenswrapper[4902]: W1009 14:06:48.725284 4902 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb32b124f_e090_4e6c_b2b7_138e1059b680.slice/crio-7608c810cd92d69b3c8df1f89c09e4241d166130f65f6988cb5a288eb5cd4636 WatchSource:0}: Error finding container 7608c810cd92d69b3c8df1f89c09e4241d166130f65f6988cb5a288eb5cd4636: Status 404 returned error can't find the container with id 7608c810cd92d69b3c8df1f89c09e4241d166130f65f6988cb5a288eb5cd4636 Oct 09 14:06:49 crc kubenswrapper[4902]: I1009 14:06:49.134749 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-6d55545c5f-lff8v" event={"ID":"b32b124f-e090-4e6c-b2b7-138e1059b680","Type":"ContainerStarted","Data":"7608c810cd92d69b3c8df1f89c09e4241d166130f65f6988cb5a288eb5cd4636"} Oct 09 14:06:50 crc kubenswrapper[4902]: I1009 14:06:50.078091 4902 patch_prober.go:28] interesting pod/machine-config-daemon-gbt7s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 14:06:50 crc kubenswrapper[4902]: I1009 14:06:50.078425 4902 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" podUID="6cfbac91-e798-4e5e-9f3c-f454ea6f457e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 14:06:50 crc kubenswrapper[4902]: I1009 14:06:50.078475 4902 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" Oct 09 14:06:50 crc kubenswrapper[4902]: I1009 14:06:50.079125 4902 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5284f8d311bb4c3e2f0e528d6bcb33bd4828ef1536e55afe39f7116e6e98c726"} pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 09 14:06:50 crc kubenswrapper[4902]: I1009 14:06:50.079186 4902 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" podUID="6cfbac91-e798-4e5e-9f3c-f454ea6f457e" containerName="machine-config-daemon" containerID="cri-o://5284f8d311bb4c3e2f0e528d6bcb33bd4828ef1536e55afe39f7116e6e98c726" gracePeriod=600 Oct 09 14:06:50 crc kubenswrapper[4902]: I1009 14:06:50.442034 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 09 14:06:50 crc kubenswrapper[4902]: I1009 14:06:50.442118 4902 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 09 14:06:50 crc kubenswrapper[4902]: I1009 14:06:50.446323 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 09 14:06:51 crc kubenswrapper[4902]: I1009 14:06:51.157320 4902 generic.go:334] "Generic (PLEG): container finished" 
podID="6cfbac91-e798-4e5e-9f3c-f454ea6f457e" containerID="5284f8d311bb4c3e2f0e528d6bcb33bd4828ef1536e55afe39f7116e6e98c726" exitCode=0 Oct 09 14:06:51 crc kubenswrapper[4902]: I1009 14:06:51.158117 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" event={"ID":"6cfbac91-e798-4e5e-9f3c-f454ea6f457e","Type":"ContainerDied","Data":"5284f8d311bb4c3e2f0e528d6bcb33bd4828ef1536e55afe39f7116e6e98c726"} Oct 09 14:06:51 crc kubenswrapper[4902]: I1009 14:06:51.158155 4902 scope.go:117] "RemoveContainer" containerID="e23319de8b44d2e9bc647fc2e977cf773ec98bd13c09e93b25a2e7f2c57468fd" Oct 09 14:06:53 crc kubenswrapper[4902]: E1009 14:06:53.169912 4902 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified" Oct 09 14:06:53 crc kubenswrapper[4902]: E1009 14:06:53.171261 4902 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,Command:[/bin/bash],Args:[-c barbican-manage db upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4kjsx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-92hj5_openstack(4015da27-1c19-4eb1-af33-74e182b53aa3): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 09 14:06:53 crc kubenswrapper[4902]: E1009 14:06:53.172552 4902 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-92hj5" podUID="4015da27-1c19-4eb1-af33-74e182b53aa3" Oct 09 14:06:54 crc kubenswrapper[4902]: E1009 14:06:54.196690 4902 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified\\\"\"" 
pod="openstack/barbican-db-sync-92hj5" podUID="4015da27-1c19-4eb1-af33-74e182b53aa3" Oct 09 14:07:00 crc kubenswrapper[4902]: I1009 14:07:00.376068 4902 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-5b9b5fc8fb-nlcpz" Oct 09 14:07:00 crc kubenswrapper[4902]: I1009 14:07:00.498350 4902 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-779d95f9fb-tfjvq" Oct 09 14:07:01 crc kubenswrapper[4902]: E1009 14:07:01.316392 4902 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Oct 09 14:07:01 crc kubenswrapper[4902]: E1009 14:07:01.316784 4902 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2hpn2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-4fhrn_openstack(e0ea72ba-e66d-4400-a1d9-8d4cb8a94e87): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 09 14:07:01 crc kubenswrapper[4902]: E1009 14:07:01.319762 4902 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-4fhrn" podUID="e0ea72ba-e66d-4400-a1d9-8d4cb8a94e87" Oct 09 14:07:02 crc kubenswrapper[4902]: I1009 14:07:02.144099 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-5b9b5fc8fb-nlcpz" Oct 09 14:07:02 crc kubenswrapper[4902]: I1009 14:07:02.269068 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" event={"ID":"6cfbac91-e798-4e5e-9f3c-f454ea6f457e","Type":"ContainerStarted","Data":"269c5fb340ffd90e9e77aaecfbd73abe66b2736b0d8ca2d63ca9d236f4e7c4a7"} Oct 09 14:07:02 crc kubenswrapper[4902]: I1009 14:07:02.270701 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-6d55545c5f-lff8v" event={"ID":"b32b124f-e090-4e6c-b2b7-138e1059b680","Type":"ContainerStarted","Data":"570aa65cccb25b5e862b50a04c8d28f004cecd56eec7a9250370530a5d679d28"} Oct 09 14:07:02 crc kubenswrapper[4902]: I1009 14:07:02.270866 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-6d55545c5f-lff8v" Oct 09 14:07:02 crc kubenswrapper[4902]: I1009 14:07:02.271977 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-jqrks" event={"ID":"d3545e73-7c0f-4e1a-b012-da9e7f35a0b5","Type":"ContainerStarted","Data":"3c7bb102af997ffe7fa43482ff49863637e171d1dfcf85b433bb21af2e67ed44"} Oct 09 14:07:02 crc kubenswrapper[4902]: I1009 14:07:02.275006 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8fe438bd-0b47-4495-93a8-590e4019a7c6","Type":"ContainerStarted","Data":"9e1f7058fd69a09f0c3076a99e1c0e6b9dea0b8cb7bdd80dc51a8ad8ffcb3259"} Oct 09 14:07:02 crc kubenswrapper[4902]: E1009 14:07:02.276587 4902 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-4fhrn" podUID="e0ea72ba-e66d-4400-a1d9-8d4cb8a94e87" Oct 09 14:07:02 crc kubenswrapper[4902]: I1009 14:07:02.312555 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-jqrks" podStartSLOduration=3.38709658 podStartE2EDuration="48.312536578s" podCreationTimestamp="2025-10-09 14:06:14 +0000 UTC" firstStartedPulling="2025-10-09 14:06:16.411117379 +0000 UTC m=+923.608976443" lastFinishedPulling="2025-10-09 14:07:01.336557377 +0000 UTC m=+968.534416441" observedRunningTime="2025-10-09 14:07:02.31030852 +0000 UTC m=+969.508167594" watchObservedRunningTime="2025-10-09 14:07:02.312536578 +0000 UTC m=+969.510395642" Oct 09 14:07:02 crc kubenswrapper[4902]: I1009 14:07:02.333497 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-6d55545c5f-lff8v" podStartSLOduration=21.333480843 podStartE2EDuration="21.333480843s" podCreationTimestamp="2025-10-09 14:06:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 14:07:02.330305937 +0000 UTC m=+969.528165001" watchObservedRunningTime="2025-10-09 14:07:02.333480843 +0000 UTC m=+969.531339907" Oct 09 14:07:02 crc kubenswrapper[4902]: I1009 14:07:02.363001 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/horizon-779d95f9fb-tfjvq" Oct 09 14:07:02 crc kubenswrapper[4902]: I1009 14:07:02.425051 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5b9b5fc8fb-nlcpz"] Oct 09 14:07:02 crc kubenswrapper[4902]: I1009 14:07:02.425346 4902 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-5b9b5fc8fb-nlcpz" podUID="ba9ae197-8325-4a07-a174-31f7f2e29978" containerName="horizon-log" containerID="cri-o://c56bd3765dfec2fab257f46ddb8a13f723af38fda399398302589b4e0d3a8b34" gracePeriod=30 Oct 09 14:07:02 crc kubenswrapper[4902]: I1009 14:07:02.425854 4902 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-5b9b5fc8fb-nlcpz" podUID="ba9ae197-8325-4a07-a174-31f7f2e29978" containerName="horizon" containerID="cri-o://26401dfab58bf8d5306f89cdd551a4189699bc71c0c67bffa6d8975b03e98b74" gracePeriod=30 Oct 09 14:07:04 crc kubenswrapper[4902]: E1009 14:07:04.111896 4902 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcf5b0e71_d4ef_4d63_b6ee_15ebdcb82812.slice/crio-conmon-4d770277fa0754de0f4f9239e672a17537f94165530ef07bbc2e40a706ddf375.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcf5b0e71_d4ef_4d63_b6ee_15ebdcb82812.slice/crio-4d770277fa0754de0f4f9239e672a17537f94165530ef07bbc2e40a706ddf375.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcf5b0e71_d4ef_4d63_b6ee_15ebdcb82812.slice/crio-conmon-706c5e6c2db4698ef619ee48cbd03331704890ccac523b93eae571f762390181.scope\": RecentStats: unable to find data in memory cache]" Oct 09 14:07:04 crc kubenswrapper[4902]: I1009 14:07:04.304792 4902 generic.go:334] "Generic (PLEG): container finished" podID="cf5b0e71-d4ef-4d63-b6ee-15ebdcb82812" containerID="4d770277fa0754de0f4f9239e672a17537f94165530ef07bbc2e40a706ddf375" exitCode=137 Oct 09 14:07:04 crc kubenswrapper[4902]: I1009 14:07:04.305119 4902 generic.go:334] "Generic (PLEG): container finished" podID="cf5b0e71-d4ef-4d63-b6ee-15ebdcb82812" containerID="706c5e6c2db4698ef619ee48cbd03331704890ccac523b93eae571f762390181" exitCode=137 Oct 09 14:07:04 crc kubenswrapper[4902]: I1009 14:07:04.304882 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6f8f55c9df-4zdc6" event={"ID":"cf5b0e71-d4ef-4d63-b6ee-15ebdcb82812","Type":"ContainerDied","Data":"4d770277fa0754de0f4f9239e672a17537f94165530ef07bbc2e40a706ddf375"} Oct 09 14:07:04 crc kubenswrapper[4902]: I1009 14:07:04.305209 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6f8f55c9df-4zdc6" event={"ID":"cf5b0e71-d4ef-4d63-b6ee-15ebdcb82812","Type":"ContainerDied","Data":"706c5e6c2db4698ef619ee48cbd03331704890ccac523b93eae571f762390181"} Oct 09 14:07:04 crc kubenswrapper[4902]: I1009 14:07:04.308850 4902 generic.go:334] "Generic (PLEG): container finished" podID="2bfb704d-cf57-4181-b6d1-c5884492984a" containerID="112d547199273642f0137b5c314d1495df860c7a46c5089daf5caaba0a916bdd" exitCode=137 Oct 09 14:07:04 crc kubenswrapper[4902]: I1009 14:07:04.308909 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-659b68cf89-l9cwg" event={"ID":"2bfb704d-cf57-4181-b6d1-c5884492984a","Type":"ContainerDied","Data":"112d547199273642f0137b5c314d1495df860c7a46c5089daf5caaba0a916bdd"} Oct 09 14:07:04 crc 
kubenswrapper[4902]: I1009 14:07:04.308931 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-659b68cf89-l9cwg" event={"ID":"2bfb704d-cf57-4181-b6d1-c5884492984a","Type":"ContainerDied","Data":"5cc832bee442dad9f03a0d0214aa5ea3e38a37a3ee8ffbe87b5358f68ea95e7b"} Oct 09 14:07:04 crc kubenswrapper[4902]: I1009 14:07:04.308941 4902 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5cc832bee442dad9f03a0d0214aa5ea3e38a37a3ee8ffbe87b5358f68ea95e7b" Oct 09 14:07:04 crc kubenswrapper[4902]: I1009 14:07:04.315423 4902 generic.go:334] "Generic (PLEG): container finished" podID="de054665-327f-4e4f-b23c-d2f2ebc3bd04" containerID="85ec85e5a9c2d8fc19be2078040e1e8b57cda146fb710e97c4707c25a486e6c3" exitCode=137 Oct 09 14:07:04 crc kubenswrapper[4902]: I1009 14:07:04.315460 4902 generic.go:334] "Generic (PLEG): container finished" podID="de054665-327f-4e4f-b23c-d2f2ebc3bd04" containerID="d32f94923cc95a2e0f28f7ecd26583aaa7c93c89987d48381807c1c01d46f019" exitCode=137 Oct 09 14:07:04 crc kubenswrapper[4902]: I1009 14:07:04.315483 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6c8d64bf69-wprv5" event={"ID":"de054665-327f-4e4f-b23c-d2f2ebc3bd04","Type":"ContainerDied","Data":"85ec85e5a9c2d8fc19be2078040e1e8b57cda146fb710e97c4707c25a486e6c3"} Oct 09 14:07:04 crc kubenswrapper[4902]: I1009 14:07:04.315511 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6c8d64bf69-wprv5" event={"ID":"de054665-327f-4e4f-b23c-d2f2ebc3bd04","Type":"ContainerDied","Data":"d32f94923cc95a2e0f28f7ecd26583aaa7c93c89987d48381807c1c01d46f019"} Oct 09 14:07:04 crc kubenswrapper[4902]: I1009 14:07:04.376725 4902 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-659b68cf89-l9cwg" Oct 09 14:07:04 crc kubenswrapper[4902]: I1009 14:07:04.378386 4902 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6c8d64bf69-wprv5" Oct 09 14:07:04 crc kubenswrapper[4902]: I1009 14:07:04.423774 4902 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-6f8f55c9df-4zdc6" Oct 09 14:07:04 crc kubenswrapper[4902]: I1009 14:07:04.426202 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2bfb704d-cf57-4181-b6d1-c5884492984a-config-data\") pod \"2bfb704d-cf57-4181-b6d1-c5884492984a\" (UID: \"2bfb704d-cf57-4181-b6d1-c5884492984a\") " Oct 09 14:07:04 crc kubenswrapper[4902]: I1009 14:07:04.426320 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/de054665-327f-4e4f-b23c-d2f2ebc3bd04-logs\") pod \"de054665-327f-4e4f-b23c-d2f2ebc3bd04\" (UID: \"de054665-327f-4e4f-b23c-d2f2ebc3bd04\") " Oct 09 14:07:04 crc kubenswrapper[4902]: I1009 14:07:04.426401 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/2bfb704d-cf57-4181-b6d1-c5884492984a-horizon-secret-key\") pod \"2bfb704d-cf57-4181-b6d1-c5884492984a\" (UID: \"2bfb704d-cf57-4181-b6d1-c5884492984a\") " Oct 09 14:07:04 crc kubenswrapper[4902]: I1009 14:07:04.426456 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/de054665-327f-4e4f-b23c-d2f2ebc3bd04-horizon-secret-key\") pod \"de054665-327f-4e4f-b23c-d2f2ebc3bd04\" (UID: \"de054665-327f-4e4f-b23c-d2f2ebc3bd04\") " Oct 09 14:07:04 crc kubenswrapper[4902]: I1009 14:07:04.426515 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/de054665-327f-4e4f-b23c-d2f2ebc3bd04-scripts\") pod \"de054665-327f-4e4f-b23c-d2f2ebc3bd04\" (UID: \"de054665-327f-4e4f-b23c-d2f2ebc3bd04\") " Oct 09 14:07:04 crc kubenswrapper[4902]: I1009 14:07:04.426569 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qqlvt\" (UniqueName: \"kubernetes.io/projected/2bfb704d-cf57-4181-b6d1-c5884492984a-kube-api-access-qqlvt\") pod \"2bfb704d-cf57-4181-b6d1-c5884492984a\" (UID: \"2bfb704d-cf57-4181-b6d1-c5884492984a\") " Oct 09 14:07:04 crc kubenswrapper[4902]: I1009 14:07:04.426618 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2bfb704d-cf57-4181-b6d1-c5884492984a-logs\") pod \"2bfb704d-cf57-4181-b6d1-c5884492984a\" (UID: \"2bfb704d-cf57-4181-b6d1-c5884492984a\") " Oct 09 14:07:04 crc kubenswrapper[4902]: I1009 14:07:04.426656 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/de054665-327f-4e4f-b23c-d2f2ebc3bd04-config-data\") pod \"de054665-327f-4e4f-b23c-d2f2ebc3bd04\" (UID: \"de054665-327f-4e4f-b23c-d2f2ebc3bd04\") " Oct 09 14:07:04 crc kubenswrapper[4902]: I1009 14:07:04.426690 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2bfb704d-cf57-4181-b6d1-c5884492984a-scripts\") pod \"2bfb704d-cf57-4181-b6d1-c5884492984a\" (UID: \"2bfb704d-cf57-4181-b6d1-c5884492984a\") " Oct 09 14:07:04 crc kubenswrapper[4902]: I1009 14:07:04.426783 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m8j87\" (UniqueName: \"kubernetes.io/projected/de054665-327f-4e4f-b23c-d2f2ebc3bd04-kube-api-access-m8j87\") pod \"de054665-327f-4e4f-b23c-d2f2ebc3bd04\" (UID: 
\"de054665-327f-4e4f-b23c-d2f2ebc3bd04\") " Oct 09 14:07:04 crc kubenswrapper[4902]: I1009 14:07:04.427086 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de054665-327f-4e4f-b23c-d2f2ebc3bd04-logs" (OuterVolumeSpecName: "logs") pod "de054665-327f-4e4f-b23c-d2f2ebc3bd04" (UID: "de054665-327f-4e4f-b23c-d2f2ebc3bd04"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 14:07:04 crc kubenswrapper[4902]: I1009 14:07:04.427754 4902 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/de054665-327f-4e4f-b23c-d2f2ebc3bd04-logs\") on node \"crc\" DevicePath \"\"" Oct 09 14:07:04 crc kubenswrapper[4902]: I1009 14:07:04.431928 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2bfb704d-cf57-4181-b6d1-c5884492984a-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "2bfb704d-cf57-4181-b6d1-c5884492984a" (UID: "2bfb704d-cf57-4181-b6d1-c5884492984a"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:07:04 crc kubenswrapper[4902]: I1009 14:07:04.432033 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2bfb704d-cf57-4181-b6d1-c5884492984a-kube-api-access-qqlvt" (OuterVolumeSpecName: "kube-api-access-qqlvt") pod "2bfb704d-cf57-4181-b6d1-c5884492984a" (UID: "2bfb704d-cf57-4181-b6d1-c5884492984a"). InnerVolumeSpecName "kube-api-access-qqlvt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 14:07:04 crc kubenswrapper[4902]: I1009 14:07:04.432248 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2bfb704d-cf57-4181-b6d1-c5884492984a-logs" (OuterVolumeSpecName: "logs") pod "2bfb704d-cf57-4181-b6d1-c5884492984a" (UID: "2bfb704d-cf57-4181-b6d1-c5884492984a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 14:07:04 crc kubenswrapper[4902]: I1009 14:07:04.436655 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de054665-327f-4e4f-b23c-d2f2ebc3bd04-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "de054665-327f-4e4f-b23c-d2f2ebc3bd04" (UID: "de054665-327f-4e4f-b23c-d2f2ebc3bd04"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:07:04 crc kubenswrapper[4902]: I1009 14:07:04.438605 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de054665-327f-4e4f-b23c-d2f2ebc3bd04-kube-api-access-m8j87" (OuterVolumeSpecName: "kube-api-access-m8j87") pod "de054665-327f-4e4f-b23c-d2f2ebc3bd04" (UID: "de054665-327f-4e4f-b23c-d2f2ebc3bd04"). InnerVolumeSpecName "kube-api-access-m8j87". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 14:07:04 crc kubenswrapper[4902]: I1009 14:07:04.455902 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2bfb704d-cf57-4181-b6d1-c5884492984a-config-data" (OuterVolumeSpecName: "config-data") pod "2bfb704d-cf57-4181-b6d1-c5884492984a" (UID: "2bfb704d-cf57-4181-b6d1-c5884492984a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 14:07:04 crc kubenswrapper[4902]: I1009 14:07:04.466403 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2bfb704d-cf57-4181-b6d1-c5884492984a-scripts" (OuterVolumeSpecName: "scripts") pod "2bfb704d-cf57-4181-b6d1-c5884492984a" (UID: "2bfb704d-cf57-4181-b6d1-c5884492984a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 14:07:04 crc kubenswrapper[4902]: I1009 14:07:04.488062 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/de054665-327f-4e4f-b23c-d2f2ebc3bd04-scripts" (OuterVolumeSpecName: "scripts") pod "de054665-327f-4e4f-b23c-d2f2ebc3bd04" (UID: "de054665-327f-4e4f-b23c-d2f2ebc3bd04"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 14:07:04 crc kubenswrapper[4902]: I1009 14:07:04.491329 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/de054665-327f-4e4f-b23c-d2f2ebc3bd04-config-data" (OuterVolumeSpecName: "config-data") pod "de054665-327f-4e4f-b23c-d2f2ebc3bd04" (UID: "de054665-327f-4e4f-b23c-d2f2ebc3bd04"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 14:07:04 crc kubenswrapper[4902]: I1009 14:07:04.529386 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cf5b0e71-d4ef-4d63-b6ee-15ebdcb82812-logs\") pod \"cf5b0e71-d4ef-4d63-b6ee-15ebdcb82812\" (UID: \"cf5b0e71-d4ef-4d63-b6ee-15ebdcb82812\") " Oct 09 14:07:04 crc kubenswrapper[4902]: I1009 14:07:04.529811 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-knqgw\" (UniqueName: \"kubernetes.io/projected/cf5b0e71-d4ef-4d63-b6ee-15ebdcb82812-kube-api-access-knqgw\") pod \"cf5b0e71-d4ef-4d63-b6ee-15ebdcb82812\" (UID: \"cf5b0e71-d4ef-4d63-b6ee-15ebdcb82812\") " Oct 09 14:07:04 crc kubenswrapper[4902]: I1009 14:07:04.529806 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cf5b0e71-d4ef-4d63-b6ee-15ebdcb82812-logs" (OuterVolumeSpecName: "logs") pod "cf5b0e71-d4ef-4d63-b6ee-15ebdcb82812" (UID: "cf5b0e71-d4ef-4d63-b6ee-15ebdcb82812"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 14:07:04 crc kubenswrapper[4902]: I1009 14:07:04.529878 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cf5b0e71-d4ef-4d63-b6ee-15ebdcb82812-scripts\") pod \"cf5b0e71-d4ef-4d63-b6ee-15ebdcb82812\" (UID: \"cf5b0e71-d4ef-4d63-b6ee-15ebdcb82812\") " Oct 09 14:07:04 crc kubenswrapper[4902]: I1009 14:07:04.529923 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cf5b0e71-d4ef-4d63-b6ee-15ebdcb82812-config-data\") pod \"cf5b0e71-d4ef-4d63-b6ee-15ebdcb82812\" (UID: \"cf5b0e71-d4ef-4d63-b6ee-15ebdcb82812\") " Oct 09 14:07:04 crc kubenswrapper[4902]: I1009 14:07:04.529980 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/cf5b0e71-d4ef-4d63-b6ee-15ebdcb82812-horizon-secret-key\") pod \"cf5b0e71-d4ef-4d63-b6ee-15ebdcb82812\" (UID: \"cf5b0e71-d4ef-4d63-b6ee-15ebdcb82812\") " Oct 09 14:07:04 crc kubenswrapper[4902]: I1009 14:07:04.530326 4902 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/2bfb704d-cf57-4181-b6d1-c5884492984a-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Oct 09 14:07:04 crc kubenswrapper[4902]: I1009 14:07:04.530346 4902 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/de054665-327f-4e4f-b23c-d2f2ebc3bd04-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Oct 09 14:07:04 crc kubenswrapper[4902]: I1009 14:07:04.530357 4902 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/de054665-327f-4e4f-b23c-d2f2ebc3bd04-scripts\") on node \"crc\" DevicePath \"\"" Oct 09 14:07:04 crc kubenswrapper[4902]: I1009 14:07:04.530366 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qqlvt\" (UniqueName: \"kubernetes.io/projected/2bfb704d-cf57-4181-b6d1-c5884492984a-kube-api-access-qqlvt\") on node \"crc\" DevicePath \"\"" Oct 09 14:07:04 crc kubenswrapper[4902]: I1009 14:07:04.530376 4902 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2bfb704d-cf57-4181-b6d1-c5884492984a-logs\") on node \"crc\" DevicePath \"\"" Oct 09 14:07:04 crc kubenswrapper[4902]: I1009 14:07:04.530384 4902 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/de054665-327f-4e4f-b23c-d2f2ebc3bd04-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 14:07:04 crc kubenswrapper[4902]: I1009 14:07:04.530393 4902 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2bfb704d-cf57-4181-b6d1-c5884492984a-scripts\") on node \"crc\" DevicePath \"\"" Oct 09 14:07:04 crc kubenswrapper[4902]: I1009 14:07:04.530404 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m8j87\" (UniqueName: \"kubernetes.io/projected/de054665-327f-4e4f-b23c-d2f2ebc3bd04-kube-api-access-m8j87\") on node \"crc\" DevicePath \"\"" Oct 09 14:07:04 crc kubenswrapper[4902]: I1009 14:07:04.530428 4902 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2bfb704d-cf57-4181-b6d1-c5884492984a-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 14:07:04 crc kubenswrapper[4902]: I1009 
14:07:04.530436 4902 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cf5b0e71-d4ef-4d63-b6ee-15ebdcb82812-logs\") on node \"crc\" DevicePath \"\"" Oct 09 14:07:04 crc kubenswrapper[4902]: I1009 14:07:04.534125 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf5b0e71-d4ef-4d63-b6ee-15ebdcb82812-kube-api-access-knqgw" (OuterVolumeSpecName: "kube-api-access-knqgw") pod "cf5b0e71-d4ef-4d63-b6ee-15ebdcb82812" (UID: "cf5b0e71-d4ef-4d63-b6ee-15ebdcb82812"). InnerVolumeSpecName "kube-api-access-knqgw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 14:07:04 crc kubenswrapper[4902]: I1009 14:07:04.535655 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf5b0e71-d4ef-4d63-b6ee-15ebdcb82812-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "cf5b0e71-d4ef-4d63-b6ee-15ebdcb82812" (UID: "cf5b0e71-d4ef-4d63-b6ee-15ebdcb82812"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:07:04 crc kubenswrapper[4902]: I1009 14:07:04.552392 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf5b0e71-d4ef-4d63-b6ee-15ebdcb82812-config-data" (OuterVolumeSpecName: "config-data") pod "cf5b0e71-d4ef-4d63-b6ee-15ebdcb82812" (UID: "cf5b0e71-d4ef-4d63-b6ee-15ebdcb82812"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 14:07:04 crc kubenswrapper[4902]: I1009 14:07:04.553379 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf5b0e71-d4ef-4d63-b6ee-15ebdcb82812-scripts" (OuterVolumeSpecName: "scripts") pod "cf5b0e71-d4ef-4d63-b6ee-15ebdcb82812" (UID: "cf5b0e71-d4ef-4d63-b6ee-15ebdcb82812"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 14:07:04 crc kubenswrapper[4902]: I1009 14:07:04.633556 4902 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/cf5b0e71-d4ef-4d63-b6ee-15ebdcb82812-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Oct 09 14:07:04 crc kubenswrapper[4902]: I1009 14:07:04.633588 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-knqgw\" (UniqueName: \"kubernetes.io/projected/cf5b0e71-d4ef-4d63-b6ee-15ebdcb82812-kube-api-access-knqgw\") on node \"crc\" DevicePath \"\"" Oct 09 14:07:04 crc kubenswrapper[4902]: I1009 14:07:04.633600 4902 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cf5b0e71-d4ef-4d63-b6ee-15ebdcb82812-scripts\") on node \"crc\" DevicePath \"\"" Oct 09 14:07:04 crc kubenswrapper[4902]: I1009 14:07:04.633610 4902 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cf5b0e71-d4ef-4d63-b6ee-15ebdcb82812-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 14:07:05 crc kubenswrapper[4902]: I1009 14:07:05.327518 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6f8f55c9df-4zdc6" event={"ID":"cf5b0e71-d4ef-4d63-b6ee-15ebdcb82812","Type":"ContainerDied","Data":"71c782eac08995ce249f78bbd12627a2151f96311460f3c0f3cabb572478715b"} Oct 09 14:07:05 crc kubenswrapper[4902]: I1009 14:07:05.327569 4902 scope.go:117] "RemoveContainer" containerID="4d770277fa0754de0f4f9239e672a17537f94165530ef07bbc2e40a706ddf375" Oct 09 14:07:05 crc kubenswrapper[4902]: I1009 14:07:05.327566 4902 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6f8f55c9df-4zdc6" Oct 09 14:07:05 crc kubenswrapper[4902]: I1009 14:07:05.329302 4902 generic.go:334] "Generic (PLEG): container finished" podID="d3545e73-7c0f-4e1a-b012-da9e7f35a0b5" containerID="3c7bb102af997ffe7fa43482ff49863637e171d1dfcf85b433bb21af2e67ed44" exitCode=0 Oct 09 14:07:05 crc kubenswrapper[4902]: I1009 14:07:05.329359 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-jqrks" event={"ID":"d3545e73-7c0f-4e1a-b012-da9e7f35a0b5","Type":"ContainerDied","Data":"3c7bb102af997ffe7fa43482ff49863637e171d1dfcf85b433bb21af2e67ed44"} Oct 09 14:07:05 crc kubenswrapper[4902]: I1009 14:07:05.333019 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6c8d64bf69-wprv5" event={"ID":"de054665-327f-4e4f-b23c-d2f2ebc3bd04","Type":"ContainerDied","Data":"fe722b5fc3df156fdfc35c34c1bc441eb5672590cba5abe482f0898bd6a682de"} Oct 09 14:07:05 crc kubenswrapper[4902]: I1009 14:07:05.333203 4902 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6c8d64bf69-wprv5" Oct 09 14:07:05 crc kubenswrapper[4902]: I1009 14:07:05.334891 4902 generic.go:334] "Generic (PLEG): container finished" podID="0b9c3435-39c9-4af3-bbf9-70faafc22a3e" containerID="3699212da17a1fe693ce60b6829ec80bec60e69618797d80f1b8a3a680dab960" exitCode=0 Oct 09 14:07:05 crc kubenswrapper[4902]: I1009 14:07:05.334987 4902 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-659b68cf89-l9cwg" Oct 09 14:07:05 crc kubenswrapper[4902]: I1009 14:07:05.338692 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-r8v96" event={"ID":"0b9c3435-39c9-4af3-bbf9-70faafc22a3e","Type":"ContainerDied","Data":"3699212da17a1fe693ce60b6829ec80bec60e69618797d80f1b8a3a680dab960"} Oct 09 14:07:05 crc kubenswrapper[4902]: I1009 14:07:05.377296 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6f8f55c9df-4zdc6"] Oct 09 14:07:05 crc kubenswrapper[4902]: I1009 14:07:05.387615 4902 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-6f8f55c9df-4zdc6"] Oct 09 14:07:05 crc kubenswrapper[4902]: I1009 14:07:05.452980 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-659b68cf89-l9cwg"] Oct 09 14:07:05 crc kubenswrapper[4902]: I1009 14:07:05.463967 4902 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-659b68cf89-l9cwg"] Oct 09 14:07:05 crc kubenswrapper[4902]: I1009 14:07:05.471297 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6c8d64bf69-wprv5"] Oct 09 14:07:05 crc kubenswrapper[4902]: I1009 14:07:05.478058 4902 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-6c8d64bf69-wprv5"] Oct 09 14:07:05 crc kubenswrapper[4902]: I1009 14:07:05.528620 4902 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2bfb704d-cf57-4181-b6d1-c5884492984a" path="/var/lib/kubelet/pods/2bfb704d-cf57-4181-b6d1-c5884492984a/volumes" Oct 09 14:07:05 crc kubenswrapper[4902]: I1009 14:07:05.529329 4902 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf5b0e71-d4ef-4d63-b6ee-15ebdcb82812" path="/var/lib/kubelet/pods/cf5b0e71-d4ef-4d63-b6ee-15ebdcb82812/volumes" Oct 09 14:07:05 crc kubenswrapper[4902]: I1009 14:07:05.530054 4902 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de054665-327f-4e4f-b23c-d2f2ebc3bd04" path="/var/lib/kubelet/pods/de054665-327f-4e4f-b23c-d2f2ebc3bd04/volumes" Oct 09 14:07:05 crc kubenswrapper[4902]: I1009 14:07:05.555265 4902 scope.go:117] "RemoveContainer" containerID="706c5e6c2db4698ef619ee48cbd03331704890ccac523b93eae571f762390181" Oct 09 14:07:05 crc kubenswrapper[4902]: I1009 14:07:05.577139 4902 scope.go:117] "RemoveContainer" containerID="85ec85e5a9c2d8fc19be2078040e1e8b57cda146fb710e97c4707c25a486e6c3" Oct 09 14:07:05 crc kubenswrapper[4902]: I1009 14:07:05.762678 4902 scope.go:117] "RemoveContainer" containerID="d32f94923cc95a2e0f28f7ecd26583aaa7c93c89987d48381807c1c01d46f019" Oct 09 14:07:06 crc kubenswrapper[4902]: I1009 14:07:06.349472 4902 generic.go:334] "Generic (PLEG): container finished" podID="ba9ae197-8325-4a07-a174-31f7f2e29978" containerID="26401dfab58bf8d5306f89cdd551a4189699bc71c0c67bffa6d8975b03e98b74" exitCode=0 Oct 09 14:07:06 crc kubenswrapper[4902]: I1009 14:07:06.349690 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5b9b5fc8fb-nlcpz" event={"ID":"ba9ae197-8325-4a07-a174-31f7f2e29978","Type":"ContainerDied","Data":"26401dfab58bf8d5306f89cdd551a4189699bc71c0c67bffa6d8975b03e98b74"} Oct 09 14:07:06 crc kubenswrapper[4902]: I1009 14:07:06.719359 4902 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-r8v96" Oct 09 14:07:06 crc kubenswrapper[4902]: I1009 14:07:06.724075 4902 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-jqrks" Oct 09 14:07:06 crc kubenswrapper[4902]: I1009 14:07:06.877738 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b9c3435-39c9-4af3-bbf9-70faafc22a3e-combined-ca-bundle\") pod \"0b9c3435-39c9-4af3-bbf9-70faafc22a3e\" (UID: \"0b9c3435-39c9-4af3-bbf9-70faafc22a3e\") " Oct 09 14:07:06 crc kubenswrapper[4902]: I1009 14:07:06.877802 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d3545e73-7c0f-4e1a-b012-da9e7f35a0b5-logs\") pod \"d3545e73-7c0f-4e1a-b012-da9e7f35a0b5\" (UID: \"d3545e73-7c0f-4e1a-b012-da9e7f35a0b5\") " Oct 09 14:07:06 crc kubenswrapper[4902]: I1009 14:07:06.877837 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3545e73-7c0f-4e1a-b012-da9e7f35a0b5-combined-ca-bundle\") pod \"d3545e73-7c0f-4e1a-b012-da9e7f35a0b5\" (UID: \"d3545e73-7c0f-4e1a-b012-da9e7f35a0b5\") " Oct 09 14:07:06 crc kubenswrapper[4902]: I1009 14:07:06.877936 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-87l49\" (UniqueName: \"kubernetes.io/projected/d3545e73-7c0f-4e1a-b012-da9e7f35a0b5-kube-api-access-87l49\") pod \"d3545e73-7c0f-4e1a-b012-da9e7f35a0b5\" (UID: \"d3545e73-7c0f-4e1a-b012-da9e7f35a0b5\") " Oct 09 14:07:06 crc kubenswrapper[4902]: I1009 14:07:06.877992 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d3545e73-7c0f-4e1a-b012-da9e7f35a0b5-scripts\") pod \"d3545e73-7c0f-4e1a-b012-da9e7f35a0b5\" (UID: \"d3545e73-7c0f-4e1a-b012-da9e7f35a0b5\") " Oct 09 14:07:06 crc kubenswrapper[4902]: I1009 14:07:06.878080 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/0b9c3435-39c9-4af3-bbf9-70faafc22a3e-config\") pod \"0b9c3435-39c9-4af3-bbf9-70faafc22a3e\" (UID: \"0b9c3435-39c9-4af3-bbf9-70faafc22a3e\") " Oct 09 14:07:06 crc kubenswrapper[4902]: I1009 14:07:06.878156 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nmt6p\" (UniqueName: \"kubernetes.io/projected/0b9c3435-39c9-4af3-bbf9-70faafc22a3e-kube-api-access-nmt6p\") pod \"0b9c3435-39c9-4af3-bbf9-70faafc22a3e\" (UID: \"0b9c3435-39c9-4af3-bbf9-70faafc22a3e\") " Oct 09 14:07:06 crc kubenswrapper[4902]: I1009 14:07:06.878207 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3545e73-7c0f-4e1a-b012-da9e7f35a0b5-config-data\") pod \"d3545e73-7c0f-4e1a-b012-da9e7f35a0b5\" (UID: \"d3545e73-7c0f-4e1a-b012-da9e7f35a0b5\") " Oct 09 14:07:06 crc kubenswrapper[4902]: I1009 14:07:06.878251 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d3545e73-7c0f-4e1a-b012-da9e7f35a0b5-logs" (OuterVolumeSpecName: "logs") pod "d3545e73-7c0f-4e1a-b012-da9e7f35a0b5" (UID: "d3545e73-7c0f-4e1a-b012-da9e7f35a0b5"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 14:07:06 crc kubenswrapper[4902]: I1009 14:07:06.878691 4902 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d3545e73-7c0f-4e1a-b012-da9e7f35a0b5-logs\") on node \"crc\" DevicePath \"\"" Oct 09 14:07:06 crc kubenswrapper[4902]: I1009 14:07:06.884445 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3545e73-7c0f-4e1a-b012-da9e7f35a0b5-scripts" (OuterVolumeSpecName: "scripts") pod "d3545e73-7c0f-4e1a-b012-da9e7f35a0b5" (UID: "d3545e73-7c0f-4e1a-b012-da9e7f35a0b5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:07:06 crc kubenswrapper[4902]: I1009 14:07:06.884621 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3545e73-7c0f-4e1a-b012-da9e7f35a0b5-kube-api-access-87l49" (OuterVolumeSpecName: "kube-api-access-87l49") pod "d3545e73-7c0f-4e1a-b012-da9e7f35a0b5" (UID: "d3545e73-7c0f-4e1a-b012-da9e7f35a0b5"). InnerVolumeSpecName "kube-api-access-87l49". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 14:07:06 crc kubenswrapper[4902]: I1009 14:07:06.896877 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b9c3435-39c9-4af3-bbf9-70faafc22a3e-kube-api-access-nmt6p" (OuterVolumeSpecName: "kube-api-access-nmt6p") pod "0b9c3435-39c9-4af3-bbf9-70faafc22a3e" (UID: "0b9c3435-39c9-4af3-bbf9-70faafc22a3e"). InnerVolumeSpecName "kube-api-access-nmt6p". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 14:07:06 crc kubenswrapper[4902]: I1009 14:07:06.905234 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b9c3435-39c9-4af3-bbf9-70faafc22a3e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0b9c3435-39c9-4af3-bbf9-70faafc22a3e" (UID: "0b9c3435-39c9-4af3-bbf9-70faafc22a3e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:07:06 crc kubenswrapper[4902]: I1009 14:07:06.909296 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3545e73-7c0f-4e1a-b012-da9e7f35a0b5-config-data" (OuterVolumeSpecName: "config-data") pod "d3545e73-7c0f-4e1a-b012-da9e7f35a0b5" (UID: "d3545e73-7c0f-4e1a-b012-da9e7f35a0b5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:07:06 crc kubenswrapper[4902]: I1009 14:07:06.909388 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3545e73-7c0f-4e1a-b012-da9e7f35a0b5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d3545e73-7c0f-4e1a-b012-da9e7f35a0b5" (UID: "d3545e73-7c0f-4e1a-b012-da9e7f35a0b5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:07:06 crc kubenswrapper[4902]: I1009 14:07:06.917573 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b9c3435-39c9-4af3-bbf9-70faafc22a3e-config" (OuterVolumeSpecName: "config") pod "0b9c3435-39c9-4af3-bbf9-70faafc22a3e" (UID: "0b9c3435-39c9-4af3-bbf9-70faafc22a3e"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:07:06 crc kubenswrapper[4902]: I1009 14:07:06.980120 4902 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/0b9c3435-39c9-4af3-bbf9-70faafc22a3e-config\") on node \"crc\" DevicePath \"\"" Oct 09 14:07:06 crc kubenswrapper[4902]: I1009 14:07:06.980156 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nmt6p\" (UniqueName: \"kubernetes.io/projected/0b9c3435-39c9-4af3-bbf9-70faafc22a3e-kube-api-access-nmt6p\") on node \"crc\" DevicePath \"\"" Oct 09 14:07:06 crc kubenswrapper[4902]: I1009 14:07:06.980173 4902 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3545e73-7c0f-4e1a-b012-da9e7f35a0b5-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 14:07:06 crc kubenswrapper[4902]: I1009 14:07:06.980186 4902 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b9c3435-39c9-4af3-bbf9-70faafc22a3e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 14:07:06 crc kubenswrapper[4902]: I1009 14:07:06.980198 4902 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3545e73-7c0f-4e1a-b012-da9e7f35a0b5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 14:07:06 crc kubenswrapper[4902]: I1009 14:07:06.980209 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-87l49\" (UniqueName: \"kubernetes.io/projected/d3545e73-7c0f-4e1a-b012-da9e7f35a0b5-kube-api-access-87l49\") on node \"crc\" DevicePath \"\"" Oct 09 14:07:06 crc kubenswrapper[4902]: I1009 14:07:06.980219 4902 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d3545e73-7c0f-4e1a-b012-da9e7f35a0b5-scripts\") on node \"crc\" DevicePath \"\"" Oct 09 14:07:07 crc kubenswrapper[4902]: I1009 14:07:07.365992 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-r8v96" event={"ID":"0b9c3435-39c9-4af3-bbf9-70faafc22a3e","Type":"ContainerDied","Data":"d26c3ed389cd65f67b558798772b042e90479c50331d6672350bf197e1110fc8"} Oct 09 14:07:07 crc kubenswrapper[4902]: I1009 14:07:07.366326 4902 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d26c3ed389cd65f67b558798772b042e90479c50331d6672350bf197e1110fc8" Oct 09 14:07:07 crc kubenswrapper[4902]: I1009 14:07:07.366060 4902 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-r8v96" Oct 09 14:07:07 crc kubenswrapper[4902]: I1009 14:07:07.368331 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-jqrks" event={"ID":"d3545e73-7c0f-4e1a-b012-da9e7f35a0b5","Type":"ContainerDied","Data":"601aea65acd2c438cace694800601499864f32344289002a385a8e94950e0189"} Oct 09 14:07:07 crc kubenswrapper[4902]: I1009 14:07:07.368359 4902 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="601aea65acd2c438cace694800601499864f32344289002a385a8e94950e0189" Oct 09 14:07:07 crc kubenswrapper[4902]: I1009 14:07:07.368431 4902 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-jqrks" Oct 09 14:07:07 crc kubenswrapper[4902]: I1009 14:07:07.437589 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-864577bbd8-z8v7t"] Oct 09 14:07:07 crc kubenswrapper[4902]: E1009 14:07:07.437929 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf5b0e71-d4ef-4d63-b6ee-15ebdcb82812" containerName="horizon-log" Oct 09 14:07:07 crc kubenswrapper[4902]: I1009 14:07:07.437944 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf5b0e71-d4ef-4d63-b6ee-15ebdcb82812" containerName="horizon-log" Oct 09 14:07:07 crc kubenswrapper[4902]: E1009 14:07:07.437954 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b9c3435-39c9-4af3-bbf9-70faafc22a3e" containerName="neutron-db-sync" Oct 09 14:07:07 crc kubenswrapper[4902]: I1009 14:07:07.437961 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b9c3435-39c9-4af3-bbf9-70faafc22a3e" containerName="neutron-db-sync" Oct 09 14:07:07 crc kubenswrapper[4902]: E1009 14:07:07.437974 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf5b0e71-d4ef-4d63-b6ee-15ebdcb82812" containerName="horizon" Oct 09 14:07:07 crc kubenswrapper[4902]: I1009 14:07:07.437980 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf5b0e71-d4ef-4d63-b6ee-15ebdcb82812" containerName="horizon" Oct 09 14:07:07 crc kubenswrapper[4902]: E1009 14:07:07.437994 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de054665-327f-4e4f-b23c-d2f2ebc3bd04" containerName="horizon-log" Oct 09 14:07:07 crc kubenswrapper[4902]: I1009 14:07:07.438000 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="de054665-327f-4e4f-b23c-d2f2ebc3bd04" containerName="horizon-log" Oct 09 14:07:07 crc kubenswrapper[4902]: E1009 14:07:07.438017 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3545e73-7c0f-4e1a-b012-da9e7f35a0b5" containerName="placement-db-sync" Oct 09 14:07:07 crc kubenswrapper[4902]: I1009 14:07:07.438022 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3545e73-7c0f-4e1a-b012-da9e7f35a0b5" containerName="placement-db-sync" Oct 09 14:07:07 crc kubenswrapper[4902]: E1009 14:07:07.438033 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de054665-327f-4e4f-b23c-d2f2ebc3bd04" containerName="horizon" Oct 09 14:07:07 crc kubenswrapper[4902]: I1009 14:07:07.438039 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="de054665-327f-4e4f-b23c-d2f2ebc3bd04" containerName="horizon" Oct 09 14:07:07 crc kubenswrapper[4902]: E1009 14:07:07.438055 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2bfb704d-cf57-4181-b6d1-c5884492984a" containerName="horizon" Oct 09 14:07:07 crc kubenswrapper[4902]: I1009 14:07:07.438060 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="2bfb704d-cf57-4181-b6d1-c5884492984a" containerName="horizon" Oct 09 14:07:07 crc kubenswrapper[4902]: I1009 14:07:07.438238 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf5b0e71-d4ef-4d63-b6ee-15ebdcb82812" containerName="horizon" Oct 09 14:07:07 crc kubenswrapper[4902]: I1009 14:07:07.438249 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="2bfb704d-cf57-4181-b6d1-c5884492984a" containerName="horizon" Oct 09 14:07:07 crc kubenswrapper[4902]: I1009 14:07:07.438258 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b9c3435-39c9-4af3-bbf9-70faafc22a3e" containerName="neutron-db-sync" Oct 09 14:07:07 
crc kubenswrapper[4902]: I1009 14:07:07.438270 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf5b0e71-d4ef-4d63-b6ee-15ebdcb82812" containerName="horizon-log" Oct 09 14:07:07 crc kubenswrapper[4902]: I1009 14:07:07.438282 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3545e73-7c0f-4e1a-b012-da9e7f35a0b5" containerName="placement-db-sync" Oct 09 14:07:07 crc kubenswrapper[4902]: I1009 14:07:07.438293 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="de054665-327f-4e4f-b23c-d2f2ebc3bd04" containerName="horizon" Oct 09 14:07:07 crc kubenswrapper[4902]: I1009 14:07:07.438301 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="de054665-327f-4e4f-b23c-d2f2ebc3bd04" containerName="horizon-log" Oct 09 14:07:07 crc kubenswrapper[4902]: I1009 14:07:07.439162 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-864577bbd8-z8v7t" Oct 09 14:07:07 crc kubenswrapper[4902]: I1009 14:07:07.443960 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Oct 09 14:07:07 crc kubenswrapper[4902]: I1009 14:07:07.444190 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-xphx7" Oct 09 14:07:07 crc kubenswrapper[4902]: I1009 14:07:07.448592 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Oct 09 14:07:07 crc kubenswrapper[4902]: I1009 14:07:07.448821 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Oct 09 14:07:07 crc kubenswrapper[4902]: I1009 14:07:07.448981 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Oct 09 14:07:07 crc kubenswrapper[4902]: I1009 14:07:07.453555 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-864577bbd8-z8v7t"] Oct 09 14:07:07 crc kubenswrapper[4902]: I1009 14:07:07.589317 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-84b966f6c9-d6xgh"] Oct 09 14:07:07 crc kubenswrapper[4902]: I1009 14:07:07.590211 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e8ddce2-e9c0-4dc6-8bcd-d228188630dc-config-data\") pod \"placement-864577bbd8-z8v7t\" (UID: \"9e8ddce2-e9c0-4dc6-8bcd-d228188630dc\") " pod="openstack/placement-864577bbd8-z8v7t" Oct 09 14:07:07 crc kubenswrapper[4902]: I1009 14:07:07.590322 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9e8ddce2-e9c0-4dc6-8bcd-d228188630dc-internal-tls-certs\") pod \"placement-864577bbd8-z8v7t\" (UID: \"9e8ddce2-e9c0-4dc6-8bcd-d228188630dc\") " pod="openstack/placement-864577bbd8-z8v7t" Oct 09 14:07:07 crc kubenswrapper[4902]: I1009 14:07:07.590358 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9e8ddce2-e9c0-4dc6-8bcd-d228188630dc-logs\") pod \"placement-864577bbd8-z8v7t\" (UID: \"9e8ddce2-e9c0-4dc6-8bcd-d228188630dc\") " pod="openstack/placement-864577bbd8-z8v7t" Oct 09 14:07:07 crc kubenswrapper[4902]: I1009 14:07:07.590468 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/9e8ddce2-e9c0-4dc6-8bcd-d228188630dc-scripts\") pod \"placement-864577bbd8-z8v7t\" (UID: \"9e8ddce2-e9c0-4dc6-8bcd-d228188630dc\") " pod="openstack/placement-864577bbd8-z8v7t" Oct 09 14:07:07 crc kubenswrapper[4902]: I1009 14:07:07.590516 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9e8ddce2-e9c0-4dc6-8bcd-d228188630dc-public-tls-certs\") pod \"placement-864577bbd8-z8v7t\" (UID: \"9e8ddce2-e9c0-4dc6-8bcd-d228188630dc\") " pod="openstack/placement-864577bbd8-z8v7t" Oct 09 14:07:07 crc kubenswrapper[4902]: I1009 14:07:07.590563 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e8ddce2-e9c0-4dc6-8bcd-d228188630dc-combined-ca-bundle\") pod \"placement-864577bbd8-z8v7t\" (UID: \"9e8ddce2-e9c0-4dc6-8bcd-d228188630dc\") " pod="openstack/placement-864577bbd8-z8v7t" Oct 09 14:07:07 crc kubenswrapper[4902]: I1009 14:07:07.590602 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pcxpl\" (UniqueName: \"kubernetes.io/projected/9e8ddce2-e9c0-4dc6-8bcd-d228188630dc-kube-api-access-pcxpl\") pod \"placement-864577bbd8-z8v7t\" (UID: \"9e8ddce2-e9c0-4dc6-8bcd-d228188630dc\") " pod="openstack/placement-864577bbd8-z8v7t" Oct 09 14:07:07 crc kubenswrapper[4902]: I1009 14:07:07.593401 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-84b966f6c9-d6xgh" Oct 09 14:07:07 crc kubenswrapper[4902]: I1009 14:07:07.620131 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-84b966f6c9-d6xgh"] Oct 09 14:07:07 crc kubenswrapper[4902]: I1009 14:07:07.695330 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f101e38c-9520-43d9-b911-8fc2fdc4459c-dns-swift-storage-0\") pod \"dnsmasq-dns-84b966f6c9-d6xgh\" (UID: \"f101e38c-9520-43d9-b911-8fc2fdc4459c\") " pod="openstack/dnsmasq-dns-84b966f6c9-d6xgh" Oct 09 14:07:07 crc kubenswrapper[4902]: I1009 14:07:07.695386 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9e8ddce2-e9c0-4dc6-8bcd-d228188630dc-scripts\") pod \"placement-864577bbd8-z8v7t\" (UID: \"9e8ddce2-e9c0-4dc6-8bcd-d228188630dc\") " pod="openstack/placement-864577bbd8-z8v7t" Oct 09 14:07:07 crc kubenswrapper[4902]: I1009 14:07:07.695430 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f101e38c-9520-43d9-b911-8fc2fdc4459c-dns-svc\") pod \"dnsmasq-dns-84b966f6c9-d6xgh\" (UID: \"f101e38c-9520-43d9-b911-8fc2fdc4459c\") " pod="openstack/dnsmasq-dns-84b966f6c9-d6xgh" Oct 09 14:07:07 crc kubenswrapper[4902]: I1009 14:07:07.695460 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f101e38c-9520-43d9-b911-8fc2fdc4459c-ovsdbserver-sb\") pod \"dnsmasq-dns-84b966f6c9-d6xgh\" (UID: \"f101e38c-9520-43d9-b911-8fc2fdc4459c\") " pod="openstack/dnsmasq-dns-84b966f6c9-d6xgh" Oct 09 14:07:07 crc kubenswrapper[4902]: I1009 14:07:07.695487 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/9e8ddce2-e9c0-4dc6-8bcd-d228188630dc-public-tls-certs\") pod \"placement-864577bbd8-z8v7t\" (UID: \"9e8ddce2-e9c0-4dc6-8bcd-d228188630dc\") " pod="openstack/placement-864577bbd8-z8v7t" Oct 09 14:07:07 crc kubenswrapper[4902]: I1009 14:07:07.695514 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f101e38c-9520-43d9-b911-8fc2fdc4459c-config\") pod \"dnsmasq-dns-84b966f6c9-d6xgh\" (UID: \"f101e38c-9520-43d9-b911-8fc2fdc4459c\") " pod="openstack/dnsmasq-dns-84b966f6c9-d6xgh" Oct 09 14:07:07 crc kubenswrapper[4902]: I1009 14:07:07.695542 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e8ddce2-e9c0-4dc6-8bcd-d228188630dc-combined-ca-bundle\") pod \"placement-864577bbd8-z8v7t\" (UID: \"9e8ddce2-e9c0-4dc6-8bcd-d228188630dc\") " pod="openstack/placement-864577bbd8-z8v7t" Oct 09 14:07:07 crc kubenswrapper[4902]: I1009 14:07:07.695582 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pcxpl\" (UniqueName: \"kubernetes.io/projected/9e8ddce2-e9c0-4dc6-8bcd-d228188630dc-kube-api-access-pcxpl\") pod \"placement-864577bbd8-z8v7t\" (UID: \"9e8ddce2-e9c0-4dc6-8bcd-d228188630dc\") " pod="openstack/placement-864577bbd8-z8v7t" Oct 09 14:07:07 crc kubenswrapper[4902]: I1009 14:07:07.695654 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f101e38c-9520-43d9-b911-8fc2fdc4459c-ovsdbserver-nb\") pod \"dnsmasq-dns-84b966f6c9-d6xgh\" (UID: \"f101e38c-9520-43d9-b911-8fc2fdc4459c\") " pod="openstack/dnsmasq-dns-84b966f6c9-d6xgh" Oct 09 14:07:07 crc kubenswrapper[4902]: I1009 14:07:07.695679 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l9xm2\" (UniqueName: \"kubernetes.io/projected/f101e38c-9520-43d9-b911-8fc2fdc4459c-kube-api-access-l9xm2\") pod \"dnsmasq-dns-84b966f6c9-d6xgh\" (UID: \"f101e38c-9520-43d9-b911-8fc2fdc4459c\") " pod="openstack/dnsmasq-dns-84b966f6c9-d6xgh" Oct 09 14:07:07 crc kubenswrapper[4902]: I1009 14:07:07.695709 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e8ddce2-e9c0-4dc6-8bcd-d228188630dc-config-data\") pod \"placement-864577bbd8-z8v7t\" (UID: \"9e8ddce2-e9c0-4dc6-8bcd-d228188630dc\") " pod="openstack/placement-864577bbd8-z8v7t" Oct 09 14:07:07 crc kubenswrapper[4902]: I1009 14:07:07.695762 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9e8ddce2-e9c0-4dc6-8bcd-d228188630dc-internal-tls-certs\") pod \"placement-864577bbd8-z8v7t\" (UID: \"9e8ddce2-e9c0-4dc6-8bcd-d228188630dc\") " pod="openstack/placement-864577bbd8-z8v7t" Oct 09 14:07:07 crc kubenswrapper[4902]: I1009 14:07:07.695787 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9e8ddce2-e9c0-4dc6-8bcd-d228188630dc-logs\") pod \"placement-864577bbd8-z8v7t\" (UID: \"9e8ddce2-e9c0-4dc6-8bcd-d228188630dc\") " pod="openstack/placement-864577bbd8-z8v7t" Oct 09 14:07:07 crc kubenswrapper[4902]: I1009 14:07:07.697271 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/9e8ddce2-e9c0-4dc6-8bcd-d228188630dc-logs\") pod \"placement-864577bbd8-z8v7t\" (UID: \"9e8ddce2-e9c0-4dc6-8bcd-d228188630dc\") " pod="openstack/placement-864577bbd8-z8v7t" Oct 09 14:07:07 crc kubenswrapper[4902]: I1009 14:07:07.703572 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9e8ddce2-e9c0-4dc6-8bcd-d228188630dc-internal-tls-certs\") pod \"placement-864577bbd8-z8v7t\" (UID: \"9e8ddce2-e9c0-4dc6-8bcd-d228188630dc\") " pod="openstack/placement-864577bbd8-z8v7t" Oct 09 14:07:07 crc kubenswrapper[4902]: I1009 14:07:07.703630 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e8ddce2-e9c0-4dc6-8bcd-d228188630dc-config-data\") pod \"placement-864577bbd8-z8v7t\" (UID: \"9e8ddce2-e9c0-4dc6-8bcd-d228188630dc\") " pod="openstack/placement-864577bbd8-z8v7t" Oct 09 14:07:07 crc kubenswrapper[4902]: I1009 14:07:07.703596 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e8ddce2-e9c0-4dc6-8bcd-d228188630dc-combined-ca-bundle\") pod \"placement-864577bbd8-z8v7t\" (UID: \"9e8ddce2-e9c0-4dc6-8bcd-d228188630dc\") " pod="openstack/placement-864577bbd8-z8v7t" Oct 09 14:07:07 crc kubenswrapper[4902]: I1009 14:07:07.706355 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-5cf9b8867d-9zfjl"] Oct 09 14:07:07 crc kubenswrapper[4902]: I1009 14:07:07.706604 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9e8ddce2-e9c0-4dc6-8bcd-d228188630dc-public-tls-certs\") pod \"placement-864577bbd8-z8v7t\" (UID: \"9e8ddce2-e9c0-4dc6-8bcd-d228188630dc\") " pod="openstack/placement-864577bbd8-z8v7t" Oct 09 14:07:07 crc kubenswrapper[4902]: I1009 14:07:07.707877 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-5cf9b8867d-9zfjl" Oct 09 14:07:07 crc kubenswrapper[4902]: I1009 14:07:07.712584 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-zr62m" Oct 09 14:07:07 crc kubenswrapper[4902]: I1009 14:07:07.714243 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Oct 09 14:07:07 crc kubenswrapper[4902]: I1009 14:07:07.714506 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Oct 09 14:07:07 crc kubenswrapper[4902]: I1009 14:07:07.715769 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Oct 09 14:07:07 crc kubenswrapper[4902]: I1009 14:07:07.716189 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pcxpl\" (UniqueName: \"kubernetes.io/projected/9e8ddce2-e9c0-4dc6-8bcd-d228188630dc-kube-api-access-pcxpl\") pod \"placement-864577bbd8-z8v7t\" (UID: \"9e8ddce2-e9c0-4dc6-8bcd-d228188630dc\") " pod="openstack/placement-864577bbd8-z8v7t" Oct 09 14:07:07 crc kubenswrapper[4902]: I1009 14:07:07.720358 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9e8ddce2-e9c0-4dc6-8bcd-d228188630dc-scripts\") pod \"placement-864577bbd8-z8v7t\" (UID: \"9e8ddce2-e9c0-4dc6-8bcd-d228188630dc\") " pod="openstack/placement-864577bbd8-z8v7t" Oct 09 14:07:07 crc kubenswrapper[4902]: I1009 14:07:07.746815 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5cf9b8867d-9zfjl"] Oct 09 14:07:07 crc kubenswrapper[4902]: I1009 14:07:07.782833 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-864577bbd8-z8v7t" Oct 09 14:07:07 crc kubenswrapper[4902]: I1009 14:07:07.798007 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sgdlc\" (UniqueName: \"kubernetes.io/projected/f5eb5ddd-7b3d-4392-9555-44eaf6e54c51-kube-api-access-sgdlc\") pod \"neutron-5cf9b8867d-9zfjl\" (UID: \"f5eb5ddd-7b3d-4392-9555-44eaf6e54c51\") " pod="openstack/neutron-5cf9b8867d-9zfjl" Oct 09 14:07:07 crc kubenswrapper[4902]: I1009 14:07:07.798070 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f101e38c-9520-43d9-b911-8fc2fdc4459c-dns-swift-storage-0\") pod \"dnsmasq-dns-84b966f6c9-d6xgh\" (UID: \"f101e38c-9520-43d9-b911-8fc2fdc4459c\") " pod="openstack/dnsmasq-dns-84b966f6c9-d6xgh" Oct 09 14:07:07 crc kubenswrapper[4902]: I1009 14:07:07.798128 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f5eb5ddd-7b3d-4392-9555-44eaf6e54c51-config\") pod \"neutron-5cf9b8867d-9zfjl\" (UID: \"f5eb5ddd-7b3d-4392-9555-44eaf6e54c51\") " pod="openstack/neutron-5cf9b8867d-9zfjl" Oct 09 14:07:07 crc kubenswrapper[4902]: I1009 14:07:07.798150 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f101e38c-9520-43d9-b911-8fc2fdc4459c-dns-svc\") pod \"dnsmasq-dns-84b966f6c9-d6xgh\" (UID: \"f101e38c-9520-43d9-b911-8fc2fdc4459c\") " pod="openstack/dnsmasq-dns-84b966f6c9-d6xgh" Oct 09 14:07:07 crc kubenswrapper[4902]: I1009 14:07:07.798174 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" 
(UniqueName: \"kubernetes.io/configmap/f101e38c-9520-43d9-b911-8fc2fdc4459c-ovsdbserver-sb\") pod \"dnsmasq-dns-84b966f6c9-d6xgh\" (UID: \"f101e38c-9520-43d9-b911-8fc2fdc4459c\") " pod="openstack/dnsmasq-dns-84b966f6c9-d6xgh" Oct 09 14:07:07 crc kubenswrapper[4902]: I1009 14:07:07.798227 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f101e38c-9520-43d9-b911-8fc2fdc4459c-config\") pod \"dnsmasq-dns-84b966f6c9-d6xgh\" (UID: \"f101e38c-9520-43d9-b911-8fc2fdc4459c\") " pod="openstack/dnsmasq-dns-84b966f6c9-d6xgh" Oct 09 14:07:07 crc kubenswrapper[4902]: I1009 14:07:07.798281 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f5eb5ddd-7b3d-4392-9555-44eaf6e54c51-httpd-config\") pod \"neutron-5cf9b8867d-9zfjl\" (UID: \"f5eb5ddd-7b3d-4392-9555-44eaf6e54c51\") " pod="openstack/neutron-5cf9b8867d-9zfjl" Oct 09 14:07:07 crc kubenswrapper[4902]: I1009 14:07:07.798332 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5eb5ddd-7b3d-4392-9555-44eaf6e54c51-combined-ca-bundle\") pod \"neutron-5cf9b8867d-9zfjl\" (UID: \"f5eb5ddd-7b3d-4392-9555-44eaf6e54c51\") " pod="openstack/neutron-5cf9b8867d-9zfjl" Oct 09 14:07:07 crc kubenswrapper[4902]: I1009 14:07:07.799597 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f101e38c-9520-43d9-b911-8fc2fdc4459c-dns-swift-storage-0\") pod \"dnsmasq-dns-84b966f6c9-d6xgh\" (UID: \"f101e38c-9520-43d9-b911-8fc2fdc4459c\") " pod="openstack/dnsmasq-dns-84b966f6c9-d6xgh" Oct 09 14:07:07 crc kubenswrapper[4902]: I1009 14:07:07.799667 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f5eb5ddd-7b3d-4392-9555-44eaf6e54c51-ovndb-tls-certs\") pod \"neutron-5cf9b8867d-9zfjl\" (UID: \"f5eb5ddd-7b3d-4392-9555-44eaf6e54c51\") " pod="openstack/neutron-5cf9b8867d-9zfjl" Oct 09 14:07:07 crc kubenswrapper[4902]: I1009 14:07:07.799742 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f101e38c-9520-43d9-b911-8fc2fdc4459c-ovsdbserver-nb\") pod \"dnsmasq-dns-84b966f6c9-d6xgh\" (UID: \"f101e38c-9520-43d9-b911-8fc2fdc4459c\") " pod="openstack/dnsmasq-dns-84b966f6c9-d6xgh" Oct 09 14:07:07 crc kubenswrapper[4902]: I1009 14:07:07.800397 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f101e38c-9520-43d9-b911-8fc2fdc4459c-dns-svc\") pod \"dnsmasq-dns-84b966f6c9-d6xgh\" (UID: \"f101e38c-9520-43d9-b911-8fc2fdc4459c\") " pod="openstack/dnsmasq-dns-84b966f6c9-d6xgh" Oct 09 14:07:07 crc kubenswrapper[4902]: I1009 14:07:07.800474 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f101e38c-9520-43d9-b911-8fc2fdc4459c-ovsdbserver-sb\") pod \"dnsmasq-dns-84b966f6c9-d6xgh\" (UID: \"f101e38c-9520-43d9-b911-8fc2fdc4459c\") " pod="openstack/dnsmasq-dns-84b966f6c9-d6xgh" Oct 09 14:07:07 crc kubenswrapper[4902]: I1009 14:07:07.800496 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l9xm2\" (UniqueName: 
\"kubernetes.io/projected/f101e38c-9520-43d9-b911-8fc2fdc4459c-kube-api-access-l9xm2\") pod \"dnsmasq-dns-84b966f6c9-d6xgh\" (UID: \"f101e38c-9520-43d9-b911-8fc2fdc4459c\") " pod="openstack/dnsmasq-dns-84b966f6c9-d6xgh" Oct 09 14:07:07 crc kubenswrapper[4902]: I1009 14:07:07.800612 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f101e38c-9520-43d9-b911-8fc2fdc4459c-config\") pod \"dnsmasq-dns-84b966f6c9-d6xgh\" (UID: \"f101e38c-9520-43d9-b911-8fc2fdc4459c\") " pod="openstack/dnsmasq-dns-84b966f6c9-d6xgh" Oct 09 14:07:07 crc kubenswrapper[4902]: I1009 14:07:07.801177 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f101e38c-9520-43d9-b911-8fc2fdc4459c-ovsdbserver-nb\") pod \"dnsmasq-dns-84b966f6c9-d6xgh\" (UID: \"f101e38c-9520-43d9-b911-8fc2fdc4459c\") " pod="openstack/dnsmasq-dns-84b966f6c9-d6xgh" Oct 09 14:07:07 crc kubenswrapper[4902]: I1009 14:07:07.825312 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l9xm2\" (UniqueName: \"kubernetes.io/projected/f101e38c-9520-43d9-b911-8fc2fdc4459c-kube-api-access-l9xm2\") pod \"dnsmasq-dns-84b966f6c9-d6xgh\" (UID: \"f101e38c-9520-43d9-b911-8fc2fdc4459c\") " pod="openstack/dnsmasq-dns-84b966f6c9-d6xgh" Oct 09 14:07:07 crc kubenswrapper[4902]: I1009 14:07:07.901697 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sgdlc\" (UniqueName: \"kubernetes.io/projected/f5eb5ddd-7b3d-4392-9555-44eaf6e54c51-kube-api-access-sgdlc\") pod \"neutron-5cf9b8867d-9zfjl\" (UID: \"f5eb5ddd-7b3d-4392-9555-44eaf6e54c51\") " pod="openstack/neutron-5cf9b8867d-9zfjl" Oct 09 14:07:07 crc kubenswrapper[4902]: I1009 14:07:07.901758 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f5eb5ddd-7b3d-4392-9555-44eaf6e54c51-config\") pod \"neutron-5cf9b8867d-9zfjl\" (UID: \"f5eb5ddd-7b3d-4392-9555-44eaf6e54c51\") " pod="openstack/neutron-5cf9b8867d-9zfjl" Oct 09 14:07:07 crc kubenswrapper[4902]: I1009 14:07:07.901806 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f5eb5ddd-7b3d-4392-9555-44eaf6e54c51-httpd-config\") pod \"neutron-5cf9b8867d-9zfjl\" (UID: \"f5eb5ddd-7b3d-4392-9555-44eaf6e54c51\") " pod="openstack/neutron-5cf9b8867d-9zfjl" Oct 09 14:07:07 crc kubenswrapper[4902]: I1009 14:07:07.901854 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5eb5ddd-7b3d-4392-9555-44eaf6e54c51-combined-ca-bundle\") pod \"neutron-5cf9b8867d-9zfjl\" (UID: \"f5eb5ddd-7b3d-4392-9555-44eaf6e54c51\") " pod="openstack/neutron-5cf9b8867d-9zfjl" Oct 09 14:07:07 crc kubenswrapper[4902]: I1009 14:07:07.901875 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f5eb5ddd-7b3d-4392-9555-44eaf6e54c51-ovndb-tls-certs\") pod \"neutron-5cf9b8867d-9zfjl\" (UID: \"f5eb5ddd-7b3d-4392-9555-44eaf6e54c51\") " pod="openstack/neutron-5cf9b8867d-9zfjl" Oct 09 14:07:07 crc kubenswrapper[4902]: I1009 14:07:07.906223 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f5eb5ddd-7b3d-4392-9555-44eaf6e54c51-httpd-config\") pod \"neutron-5cf9b8867d-9zfjl\" (UID: 
\"f5eb5ddd-7b3d-4392-9555-44eaf6e54c51\") " pod="openstack/neutron-5cf9b8867d-9zfjl" Oct 09 14:07:07 crc kubenswrapper[4902]: I1009 14:07:07.908077 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f5eb5ddd-7b3d-4392-9555-44eaf6e54c51-ovndb-tls-certs\") pod \"neutron-5cf9b8867d-9zfjl\" (UID: \"f5eb5ddd-7b3d-4392-9555-44eaf6e54c51\") " pod="openstack/neutron-5cf9b8867d-9zfjl" Oct 09 14:07:07 crc kubenswrapper[4902]: I1009 14:07:07.918257 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/f5eb5ddd-7b3d-4392-9555-44eaf6e54c51-config\") pod \"neutron-5cf9b8867d-9zfjl\" (UID: \"f5eb5ddd-7b3d-4392-9555-44eaf6e54c51\") " pod="openstack/neutron-5cf9b8867d-9zfjl" Oct 09 14:07:07 crc kubenswrapper[4902]: I1009 14:07:07.923181 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5eb5ddd-7b3d-4392-9555-44eaf6e54c51-combined-ca-bundle\") pod \"neutron-5cf9b8867d-9zfjl\" (UID: \"f5eb5ddd-7b3d-4392-9555-44eaf6e54c51\") " pod="openstack/neutron-5cf9b8867d-9zfjl" Oct 09 14:07:07 crc kubenswrapper[4902]: I1009 14:07:07.924958 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sgdlc\" (UniqueName: \"kubernetes.io/projected/f5eb5ddd-7b3d-4392-9555-44eaf6e54c51-kube-api-access-sgdlc\") pod \"neutron-5cf9b8867d-9zfjl\" (UID: \"f5eb5ddd-7b3d-4392-9555-44eaf6e54c51\") " pod="openstack/neutron-5cf9b8867d-9zfjl" Oct 09 14:07:07 crc kubenswrapper[4902]: I1009 14:07:07.927466 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-84b966f6c9-d6xgh" Oct 09 14:07:08 crc kubenswrapper[4902]: I1009 14:07:08.102270 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5cf9b8867d-9zfjl" Oct 09 14:07:08 crc kubenswrapper[4902]: I1009 14:07:08.480096 4902 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-5b9b5fc8fb-nlcpz" podUID="ba9ae197-8325-4a07-a174-31f7f2e29978" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.146:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.146:8443: connect: connection refused" Oct 09 14:07:09 crc kubenswrapper[4902]: I1009 14:07:09.451061 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-65878bc9b7-hv97v"] Oct 09 14:07:09 crc kubenswrapper[4902]: I1009 14:07:09.452832 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-65878bc9b7-hv97v" Oct 09 14:07:09 crc kubenswrapper[4902]: I1009 14:07:09.456647 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Oct 09 14:07:09 crc kubenswrapper[4902]: I1009 14:07:09.456865 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Oct 09 14:07:09 crc kubenswrapper[4902]: I1009 14:07:09.474534 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-65878bc9b7-hv97v"] Oct 09 14:07:09 crc kubenswrapper[4902]: I1009 14:07:09.541005 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b17f63fc-0163-416e-a3ee-179a1a071560-internal-tls-certs\") pod \"neutron-65878bc9b7-hv97v\" (UID: \"b17f63fc-0163-416e-a3ee-179a1a071560\") " pod="openstack/neutron-65878bc9b7-hv97v" Oct 09 14:07:09 crc kubenswrapper[4902]: I1009 14:07:09.541133 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b17f63fc-0163-416e-a3ee-179a1a071560-ovndb-tls-certs\") pod \"neutron-65878bc9b7-hv97v\" (UID: \"b17f63fc-0163-416e-a3ee-179a1a071560\") " pod="openstack/neutron-65878bc9b7-hv97v" Oct 09 14:07:09 crc kubenswrapper[4902]: I1009 14:07:09.541161 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/b17f63fc-0163-416e-a3ee-179a1a071560-httpd-config\") pod \"neutron-65878bc9b7-hv97v\" (UID: \"b17f63fc-0163-416e-a3ee-179a1a071560\") " pod="openstack/neutron-65878bc9b7-hv97v" Oct 09 14:07:09 crc kubenswrapper[4902]: I1009 14:07:09.541176 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-698f2\" (UniqueName: \"kubernetes.io/projected/b17f63fc-0163-416e-a3ee-179a1a071560-kube-api-access-698f2\") pod \"neutron-65878bc9b7-hv97v\" (UID: \"b17f63fc-0163-416e-a3ee-179a1a071560\") " pod="openstack/neutron-65878bc9b7-hv97v" Oct 09 14:07:09 crc kubenswrapper[4902]: I1009 14:07:09.541217 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b17f63fc-0163-416e-a3ee-179a1a071560-public-tls-certs\") pod \"neutron-65878bc9b7-hv97v\" (UID: \"b17f63fc-0163-416e-a3ee-179a1a071560\") " pod="openstack/neutron-65878bc9b7-hv97v" Oct 09 14:07:09 crc kubenswrapper[4902]: I1009 14:07:09.541268 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b17f63fc-0163-416e-a3ee-179a1a071560-config\") pod \"neutron-65878bc9b7-hv97v\" (UID: \"b17f63fc-0163-416e-a3ee-179a1a071560\") " pod="openstack/neutron-65878bc9b7-hv97v" Oct 09 14:07:09 crc kubenswrapper[4902]: I1009 14:07:09.541293 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b17f63fc-0163-416e-a3ee-179a1a071560-combined-ca-bundle\") pod \"neutron-65878bc9b7-hv97v\" (UID: \"b17f63fc-0163-416e-a3ee-179a1a071560\") " pod="openstack/neutron-65878bc9b7-hv97v" Oct 09 14:07:09 crc kubenswrapper[4902]: I1009 14:07:09.642616 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/b17f63fc-0163-416e-a3ee-179a1a071560-internal-tls-certs\") pod \"neutron-65878bc9b7-hv97v\" (UID: \"b17f63fc-0163-416e-a3ee-179a1a071560\") " pod="openstack/neutron-65878bc9b7-hv97v" Oct 09 14:07:09 crc kubenswrapper[4902]: I1009 14:07:09.643528 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b17f63fc-0163-416e-a3ee-179a1a071560-ovndb-tls-certs\") pod \"neutron-65878bc9b7-hv97v\" (UID: \"b17f63fc-0163-416e-a3ee-179a1a071560\") " pod="openstack/neutron-65878bc9b7-hv97v" Oct 09 14:07:09 crc kubenswrapper[4902]: I1009 14:07:09.643585 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/b17f63fc-0163-416e-a3ee-179a1a071560-httpd-config\") pod \"neutron-65878bc9b7-hv97v\" (UID: \"b17f63fc-0163-416e-a3ee-179a1a071560\") " pod="openstack/neutron-65878bc9b7-hv97v" Oct 09 14:07:09 crc kubenswrapper[4902]: I1009 14:07:09.643606 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-698f2\" (UniqueName: \"kubernetes.io/projected/b17f63fc-0163-416e-a3ee-179a1a071560-kube-api-access-698f2\") pod \"neutron-65878bc9b7-hv97v\" (UID: \"b17f63fc-0163-416e-a3ee-179a1a071560\") " pod="openstack/neutron-65878bc9b7-hv97v" Oct 09 14:07:09 crc kubenswrapper[4902]: I1009 14:07:09.644041 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b17f63fc-0163-416e-a3ee-179a1a071560-public-tls-certs\") pod \"neutron-65878bc9b7-hv97v\" (UID: \"b17f63fc-0163-416e-a3ee-179a1a071560\") " pod="openstack/neutron-65878bc9b7-hv97v" Oct 09 14:07:09 crc kubenswrapper[4902]: I1009 14:07:09.644854 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b17f63fc-0163-416e-a3ee-179a1a071560-config\") pod \"neutron-65878bc9b7-hv97v\" (UID: \"b17f63fc-0163-416e-a3ee-179a1a071560\") " pod="openstack/neutron-65878bc9b7-hv97v" Oct 09 14:07:09 crc kubenswrapper[4902]: I1009 14:07:09.644943 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b17f63fc-0163-416e-a3ee-179a1a071560-combined-ca-bundle\") pod \"neutron-65878bc9b7-hv97v\" (UID: \"b17f63fc-0163-416e-a3ee-179a1a071560\") " pod="openstack/neutron-65878bc9b7-hv97v" Oct 09 14:07:09 crc kubenswrapper[4902]: I1009 14:07:09.647766 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b17f63fc-0163-416e-a3ee-179a1a071560-internal-tls-certs\") pod \"neutron-65878bc9b7-hv97v\" (UID: \"b17f63fc-0163-416e-a3ee-179a1a071560\") " pod="openstack/neutron-65878bc9b7-hv97v" Oct 09 14:07:09 crc kubenswrapper[4902]: I1009 14:07:09.650052 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/b17f63fc-0163-416e-a3ee-179a1a071560-httpd-config\") pod \"neutron-65878bc9b7-hv97v\" (UID: \"b17f63fc-0163-416e-a3ee-179a1a071560\") " pod="openstack/neutron-65878bc9b7-hv97v" Oct 09 14:07:09 crc kubenswrapper[4902]: I1009 14:07:09.656395 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b17f63fc-0163-416e-a3ee-179a1a071560-ovndb-tls-certs\") pod \"neutron-65878bc9b7-hv97v\" (UID: 
\"b17f63fc-0163-416e-a3ee-179a1a071560\") " pod="openstack/neutron-65878bc9b7-hv97v" Oct 09 14:07:09 crc kubenswrapper[4902]: I1009 14:07:09.660568 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/b17f63fc-0163-416e-a3ee-179a1a071560-config\") pod \"neutron-65878bc9b7-hv97v\" (UID: \"b17f63fc-0163-416e-a3ee-179a1a071560\") " pod="openstack/neutron-65878bc9b7-hv97v" Oct 09 14:07:09 crc kubenswrapper[4902]: I1009 14:07:09.662337 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b17f63fc-0163-416e-a3ee-179a1a071560-combined-ca-bundle\") pod \"neutron-65878bc9b7-hv97v\" (UID: \"b17f63fc-0163-416e-a3ee-179a1a071560\") " pod="openstack/neutron-65878bc9b7-hv97v" Oct 09 14:07:09 crc kubenswrapper[4902]: I1009 14:07:09.663191 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-698f2\" (UniqueName: \"kubernetes.io/projected/b17f63fc-0163-416e-a3ee-179a1a071560-kube-api-access-698f2\") pod \"neutron-65878bc9b7-hv97v\" (UID: \"b17f63fc-0163-416e-a3ee-179a1a071560\") " pod="openstack/neutron-65878bc9b7-hv97v" Oct 09 14:07:09 crc kubenswrapper[4902]: I1009 14:07:09.677152 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b17f63fc-0163-416e-a3ee-179a1a071560-public-tls-certs\") pod \"neutron-65878bc9b7-hv97v\" (UID: \"b17f63fc-0163-416e-a3ee-179a1a071560\") " pod="openstack/neutron-65878bc9b7-hv97v" Oct 09 14:07:09 crc kubenswrapper[4902]: I1009 14:07:09.798270 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-65878bc9b7-hv97v" Oct 09 14:07:13 crc kubenswrapper[4902]: I1009 14:07:13.238968 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-6d55545c5f-lff8v" Oct 09 14:07:14 crc kubenswrapper[4902]: I1009 14:07:14.317983 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Oct 09 14:07:14 crc kubenswrapper[4902]: I1009 14:07:14.319194 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Oct 09 14:07:14 crc kubenswrapper[4902]: I1009 14:07:14.321731 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Oct 09 14:07:14 crc kubenswrapper[4902]: I1009 14:07:14.321879 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Oct 09 14:07:14 crc kubenswrapper[4902]: I1009 14:07:14.324144 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-jnvbj" Oct 09 14:07:14 crc kubenswrapper[4902]: I1009 14:07:14.330092 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Oct 09 14:07:14 crc kubenswrapper[4902]: I1009 14:07:14.439886 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/5effc1bd-82a1-4515-82f5-be488340fa94-openstack-config-secret\") pod \"openstackclient\" (UID: \"5effc1bd-82a1-4515-82f5-be488340fa94\") " pod="openstack/openstackclient" Oct 09 14:07:14 crc kubenswrapper[4902]: I1009 14:07:14.440275 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/5effc1bd-82a1-4515-82f5-be488340fa94-openstack-config\") pod \"openstackclient\" (UID: \"5effc1bd-82a1-4515-82f5-be488340fa94\") " pod="openstack/openstackclient" Oct 09 14:07:14 crc kubenswrapper[4902]: I1009 14:07:14.440324 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sqmbp\" (UniqueName: \"kubernetes.io/projected/5effc1bd-82a1-4515-82f5-be488340fa94-kube-api-access-sqmbp\") pod \"openstackclient\" (UID: \"5effc1bd-82a1-4515-82f5-be488340fa94\") " pod="openstack/openstackclient" Oct 09 14:07:14 crc kubenswrapper[4902]: I1009 14:07:14.440426 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5effc1bd-82a1-4515-82f5-be488340fa94-combined-ca-bundle\") pod \"openstackclient\" (UID: \"5effc1bd-82a1-4515-82f5-be488340fa94\") " pod="openstack/openstackclient" Oct 09 14:07:14 crc kubenswrapper[4902]: I1009 14:07:14.543026 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/5effc1bd-82a1-4515-82f5-be488340fa94-openstack-config-secret\") pod \"openstackclient\" (UID: \"5effc1bd-82a1-4515-82f5-be488340fa94\") " pod="openstack/openstackclient" Oct 09 14:07:14 crc kubenswrapper[4902]: I1009 14:07:14.544399 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/5effc1bd-82a1-4515-82f5-be488340fa94-openstack-config\") pod \"openstackclient\" (UID: \"5effc1bd-82a1-4515-82f5-be488340fa94\") " pod="openstack/openstackclient" Oct 09 14:07:14 crc kubenswrapper[4902]: I1009 14:07:14.544496 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sqmbp\" (UniqueName: \"kubernetes.io/projected/5effc1bd-82a1-4515-82f5-be488340fa94-kube-api-access-sqmbp\") pod \"openstackclient\" (UID: \"5effc1bd-82a1-4515-82f5-be488340fa94\") " pod="openstack/openstackclient" Oct 09 14:07:14 crc kubenswrapper[4902]: I1009 14:07:14.544644 4902 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5effc1bd-82a1-4515-82f5-be488340fa94-combined-ca-bundle\") pod \"openstackclient\" (UID: \"5effc1bd-82a1-4515-82f5-be488340fa94\") " pod="openstack/openstackclient" Oct 09 14:07:14 crc kubenswrapper[4902]: I1009 14:07:14.545357 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/5effc1bd-82a1-4515-82f5-be488340fa94-openstack-config\") pod \"openstackclient\" (UID: \"5effc1bd-82a1-4515-82f5-be488340fa94\") " pod="openstack/openstackclient" Oct 09 14:07:14 crc kubenswrapper[4902]: I1009 14:07:14.547647 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Oct 09 14:07:14 crc kubenswrapper[4902]: E1009 14:07:14.548396 4902 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle kube-api-access-sqmbp openstack-config-secret], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/openstackclient" podUID="5effc1bd-82a1-4515-82f5-be488340fa94" Oct 09 14:07:14 crc kubenswrapper[4902]: I1009 14:07:14.549610 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/5effc1bd-82a1-4515-82f5-be488340fa94-openstack-config-secret\") pod \"openstackclient\" (UID: \"5effc1bd-82a1-4515-82f5-be488340fa94\") " pod="openstack/openstackclient" Oct 09 14:07:14 crc kubenswrapper[4902]: E1009 14:07:14.549898 4902 projected.go:194] Error preparing data for projected volume kube-api-access-sqmbp for pod openstack/openstackclient: failed to fetch token: serviceaccounts "openstackclient-openstackclient" is forbidden: User "system:node:crc" cannot create resource "serviceaccounts/token" in API group "" in the namespace "openstack": no relationship found between node 'crc' and this object Oct 09 14:07:14 crc kubenswrapper[4902]: E1009 14:07:14.549969 4902 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5effc1bd-82a1-4515-82f5-be488340fa94-kube-api-access-sqmbp podName:5effc1bd-82a1-4515-82f5-be488340fa94 nodeName:}" failed. No retries permitted until 2025-10-09 14:07:15.049949557 +0000 UTC m=+982.247808621 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-sqmbp" (UniqueName: "kubernetes.io/projected/5effc1bd-82a1-4515-82f5-be488340fa94-kube-api-access-sqmbp") pod "openstackclient" (UID: "5effc1bd-82a1-4515-82f5-be488340fa94") : failed to fetch token: serviceaccounts "openstackclient-openstackclient" is forbidden: User "system:node:crc" cannot create resource "serviceaccounts/token" in API group "" in the namespace "openstack": no relationship found between node 'crc' and this object Oct 09 14:07:14 crc kubenswrapper[4902]: I1009 14:07:14.552685 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5effc1bd-82a1-4515-82f5-be488340fa94-combined-ca-bundle\") pod \"openstackclient\" (UID: \"5effc1bd-82a1-4515-82f5-be488340fa94\") " pod="openstack/openstackclient" Oct 09 14:07:14 crc kubenswrapper[4902]: I1009 14:07:14.571189 4902 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Oct 09 14:07:14 crc kubenswrapper[4902]: I1009 14:07:14.589305 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Oct 09 14:07:14 crc kubenswrapper[4902]: I1009 14:07:14.590756 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Oct 09 14:07:14 crc kubenswrapper[4902]: I1009 14:07:14.618220 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Oct 09 14:07:14 crc kubenswrapper[4902]: I1009 14:07:14.646330 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/594a9127-c741-4bb1-871f-0295abab43ce-openstack-config\") pod \"openstackclient\" (UID: \"594a9127-c741-4bb1-871f-0295abab43ce\") " pod="openstack/openstackclient" Oct 09 14:07:14 crc kubenswrapper[4902]: I1009 14:07:14.646503 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/594a9127-c741-4bb1-871f-0295abab43ce-combined-ca-bundle\") pod \"openstackclient\" (UID: \"594a9127-c741-4bb1-871f-0295abab43ce\") " pod="openstack/openstackclient" Oct 09 14:07:14 crc kubenswrapper[4902]: I1009 14:07:14.646541 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/594a9127-c741-4bb1-871f-0295abab43ce-openstack-config-secret\") pod \"openstackclient\" (UID: \"594a9127-c741-4bb1-871f-0295abab43ce\") " pod="openstack/openstackclient" Oct 09 14:07:14 crc kubenswrapper[4902]: I1009 14:07:14.646606 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wsm5d\" (UniqueName: \"kubernetes.io/projected/594a9127-c741-4bb1-871f-0295abab43ce-kube-api-access-wsm5d\") pod \"openstackclient\" (UID: \"594a9127-c741-4bb1-871f-0295abab43ce\") " pod="openstack/openstackclient" Oct 09 14:07:14 crc kubenswrapper[4902]: I1009 14:07:14.750946 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/594a9127-c741-4bb1-871f-0295abab43ce-combined-ca-bundle\") pod \"openstackclient\" (UID: \"594a9127-c741-4bb1-871f-0295abab43ce\") " pod="openstack/openstackclient" Oct 09 14:07:14 crc kubenswrapper[4902]: I1009 14:07:14.751001 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/594a9127-c741-4bb1-871f-0295abab43ce-openstack-config-secret\") pod \"openstackclient\" (UID: \"594a9127-c741-4bb1-871f-0295abab43ce\") " pod="openstack/openstackclient" Oct 09 14:07:14 crc kubenswrapper[4902]: I1009 14:07:14.751061 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wsm5d\" (UniqueName: \"kubernetes.io/projected/594a9127-c741-4bb1-871f-0295abab43ce-kube-api-access-wsm5d\") pod \"openstackclient\" (UID: \"594a9127-c741-4bb1-871f-0295abab43ce\") " pod="openstack/openstackclient" Oct 09 14:07:14 crc kubenswrapper[4902]: I1009 14:07:14.751151 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/594a9127-c741-4bb1-871f-0295abab43ce-openstack-config\") pod \"openstackclient\" (UID: \"594a9127-c741-4bb1-871f-0295abab43ce\") " pod="openstack/openstackclient" Oct 09 14:07:14 crc kubenswrapper[4902]: I1009 14:07:14.751913 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/594a9127-c741-4bb1-871f-0295abab43ce-openstack-config\") pod \"openstackclient\" (UID: \"594a9127-c741-4bb1-871f-0295abab43ce\") " pod="openstack/openstackclient" Oct 09 14:07:14 crc kubenswrapper[4902]: I1009 14:07:14.766908 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/594a9127-c741-4bb1-871f-0295abab43ce-combined-ca-bundle\") pod \"openstackclient\" (UID: \"594a9127-c741-4bb1-871f-0295abab43ce\") " pod="openstack/openstackclient" Oct 09 14:07:14 crc kubenswrapper[4902]: I1009 14:07:14.767015 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/594a9127-c741-4bb1-871f-0295abab43ce-openstack-config-secret\") pod \"openstackclient\" (UID: \"594a9127-c741-4bb1-871f-0295abab43ce\") " pod="openstack/openstackclient" Oct 09 14:07:14 crc kubenswrapper[4902]: I1009 14:07:14.789615 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wsm5d\" (UniqueName: \"kubernetes.io/projected/594a9127-c741-4bb1-871f-0295abab43ce-kube-api-access-wsm5d\") pod \"openstackclient\" (UID: \"594a9127-c741-4bb1-871f-0295abab43ce\") " pod="openstack/openstackclient" Oct 09 14:07:14 crc kubenswrapper[4902]: I1009 14:07:14.974351 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Oct 09 14:07:15 crc kubenswrapper[4902]: I1009 14:07:15.057798 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sqmbp\" (UniqueName: \"kubernetes.io/projected/5effc1bd-82a1-4515-82f5-be488340fa94-kube-api-access-sqmbp\") pod \"openstackclient\" (UID: \"5effc1bd-82a1-4515-82f5-be488340fa94\") " pod="openstack/openstackclient" Oct 09 14:07:15 crc kubenswrapper[4902]: E1009 14:07:15.061470 4902 projected.go:194] Error preparing data for projected volume kube-api-access-sqmbp for pod openstack/openstackclient: failed to fetch token: serviceaccounts "openstackclient-openstackclient" is forbidden: the UID in the bound object reference (5effc1bd-82a1-4515-82f5-be488340fa94) does not match the UID in record. 
The object might have been deleted and then recreated Oct 09 14:07:15 crc kubenswrapper[4902]: E1009 14:07:15.061812 4902 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5effc1bd-82a1-4515-82f5-be488340fa94-kube-api-access-sqmbp podName:5effc1bd-82a1-4515-82f5-be488340fa94 nodeName:}" failed. No retries permitted until 2025-10-09 14:07:16.061787721 +0000 UTC m=+983.259646785 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-sqmbp" (UniqueName: "kubernetes.io/projected/5effc1bd-82a1-4515-82f5-be488340fa94-kube-api-access-sqmbp") pod "openstackclient" (UID: "5effc1bd-82a1-4515-82f5-be488340fa94") : failed to fetch token: serviceaccounts "openstackclient-openstackclient" is forbidden: the UID in the bound object reference (5effc1bd-82a1-4515-82f5-be488340fa94) does not match the UID in record. The object might have been deleted and then recreated Oct 09 14:07:15 crc kubenswrapper[4902]: I1009 14:07:15.444778 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Oct 09 14:07:15 crc kubenswrapper[4902]: I1009 14:07:15.459673 4902 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="5effc1bd-82a1-4515-82f5-be488340fa94" podUID="594a9127-c741-4bb1-871f-0295abab43ce" Oct 09 14:07:15 crc kubenswrapper[4902]: I1009 14:07:15.463668 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Oct 09 14:07:15 crc kubenswrapper[4902]: I1009 14:07:15.577390 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5effc1bd-82a1-4515-82f5-be488340fa94-combined-ca-bundle\") pod \"5effc1bd-82a1-4515-82f5-be488340fa94\" (UID: \"5effc1bd-82a1-4515-82f5-be488340fa94\") " Oct 09 14:07:15 crc kubenswrapper[4902]: I1009 14:07:15.577906 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/5effc1bd-82a1-4515-82f5-be488340fa94-openstack-config-secret\") pod \"5effc1bd-82a1-4515-82f5-be488340fa94\" (UID: \"5effc1bd-82a1-4515-82f5-be488340fa94\") " Oct 09 14:07:15 crc kubenswrapper[4902]: I1009 14:07:15.577944 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/5effc1bd-82a1-4515-82f5-be488340fa94-openstack-config\") pod \"5effc1bd-82a1-4515-82f5-be488340fa94\" (UID: \"5effc1bd-82a1-4515-82f5-be488340fa94\") " Oct 09 14:07:15 crc kubenswrapper[4902]: I1009 14:07:15.578515 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sqmbp\" (UniqueName: \"kubernetes.io/projected/5effc1bd-82a1-4515-82f5-be488340fa94-kube-api-access-sqmbp\") on node \"crc\" DevicePath \"\"" Oct 09 14:07:15 crc kubenswrapper[4902]: I1009 14:07:15.587354 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5effc1bd-82a1-4515-82f5-be488340fa94-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "5effc1bd-82a1-4515-82f5-be488340fa94" (UID: "5effc1bd-82a1-4515-82f5-be488340fa94"). InnerVolumeSpecName "openstack-config-secret". 
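At this point the openstackclient pod has been deleted and immediately recreated: the old object (UID 5effc1bd-82a1-4515-82f5-be488340fa94) is being torn down while the new one (UID 594a9127-c741-4bb1-871f-0295abab43ce) is being set up, and the two token errors above follow from that race. The kube-api-access-sqmbp volume is a projected service-account token bound to the old pod's name and UID; once the API object is gone the node authorizer no longer sees a relationship between node crc and that service account ("no relationship found"), and on retry the bound-object UID no longer matches the live pod ("does not match the UID in record"). The kubelet simply backs off (500ms, then 1s) until the stale pod's volumes are cleaned up, while the recreated pod mounts successfully. The Go fragment below is a minimal client-go sketch of the kind of bound TokenRequest the kubelet issues for such volumes; it is not kubelet source, the audience string and in-cluster config are illustrative assumptions, and only the namespace, service account, pod name and UID are taken from the log.

// Minimal client-go sketch (not kubelet source): request a service-account
// token bound to a specific pod, as the kubelet does for projected
// kube-api-access-* volumes. If the pod named here has been deleted and
// recreated, the UID is stale and the API server rejects the request,
// which is the failure mode logged above.
package main

import (
	"context"
	"fmt"

	authenticationv1 "k8s.io/api/authentication/v1"
	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/apimachinery/pkg/types"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/rest"
)

func main() {
	// Assumption: running in-cluster; the kubelet uses its own node credentials instead.
	cfg, err := rest.InClusterConfig()
	if err != nil {
		panic(err)
	}
	client := kubernetes.NewForConfigOrDie(cfg)

	expiry := int64(3600)
	req := &authenticationv1.TokenRequest{
		Spec: authenticationv1.TokenRequestSpec{
			// Audience value is illustrative, not taken from the log.
			Audiences:         []string{"https://kubernetes.default.svc"},
			ExpirationSeconds: &expiry,
			// The token is bound to this exact pod name and UID. After the pod
			// is deleted and recreated, this UID no longer matches the live
			// object and the request is rejected, as in the errors above.
			BoundObjectRef: &authenticationv1.BoundObjectReference{
				Kind:       "Pod",
				APIVersion: "v1",
				Name:       "openstackclient",
				UID:        types.UID("5effc1bd-82a1-4515-82f5-be488340fa94"),
			},
		},
	}

	tok, err := client.CoreV1().ServiceAccounts("openstack").CreateToken(
		context.TODO(), "openstackclient-openstackclient", req, metav1.CreateOptions{})
	if err != nil {
		fmt.Println("token request rejected:", err)
		return
	}
	fmt.Println("token expires:", tok.Status.ExpirationTimestamp)
}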
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:07:15 crc kubenswrapper[4902]: I1009 14:07:15.587528 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5effc1bd-82a1-4515-82f5-be488340fa94-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "5effc1bd-82a1-4515-82f5-be488340fa94" (UID: "5effc1bd-82a1-4515-82f5-be488340fa94"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 14:07:15 crc kubenswrapper[4902]: I1009 14:07:15.589604 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5effc1bd-82a1-4515-82f5-be488340fa94-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5effc1bd-82a1-4515-82f5-be488340fa94" (UID: "5effc1bd-82a1-4515-82f5-be488340fa94"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:07:15 crc kubenswrapper[4902]: I1009 14:07:15.680602 4902 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5effc1bd-82a1-4515-82f5-be488340fa94-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 14:07:15 crc kubenswrapper[4902]: I1009 14:07:15.681073 4902 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/5effc1bd-82a1-4515-82f5-be488340fa94-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Oct 09 14:07:15 crc kubenswrapper[4902]: I1009 14:07:15.681133 4902 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/5effc1bd-82a1-4515-82f5-be488340fa94-openstack-config\") on node \"crc\" DevicePath \"\"" Oct 09 14:07:16 crc kubenswrapper[4902]: I1009 14:07:16.036152 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-84b966f6c9-d6xgh"] Oct 09 14:07:16 crc kubenswrapper[4902]: I1009 14:07:16.248756 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-864577bbd8-z8v7t"] Oct 09 14:07:16 crc kubenswrapper[4902]: I1009 14:07:16.338675 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-65878bc9b7-hv97v"] Oct 09 14:07:16 crc kubenswrapper[4902]: W1009 14:07:16.353705 4902 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb17f63fc_0163_416e_a3ee_179a1a071560.slice/crio-e0f806b604e577f0d2f854151af1742260cb3210fe78a290d91a096170b89a11 WatchSource:0}: Error finding container e0f806b604e577f0d2f854151af1742260cb3210fe78a290d91a096170b89a11: Status 404 returned error can't find the container with id e0f806b604e577f0d2f854151af1742260cb3210fe78a290d91a096170b89a11 Oct 09 14:07:16 crc kubenswrapper[4902]: I1009 14:07:16.370595 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Oct 09 14:07:16 crc kubenswrapper[4902]: I1009 14:07:16.462383 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8fe438bd-0b47-4495-93a8-590e4019a7c6","Type":"ContainerStarted","Data":"a1d7843c226cfdd9c60d2d2e9655a2b097439cd5edf5071eb5a18a2e755b6636"} Oct 09 14:07:16 crc kubenswrapper[4902]: I1009 14:07:16.462546 4902 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8fe438bd-0b47-4495-93a8-590e4019a7c6" containerName="ceilometer-central-agent" 
containerID="cri-o://119944c304e40ad4b30208df944e3587aea6322eb5c80031e9db64894b4ec2b9" gracePeriod=30 Oct 09 14:07:16 crc kubenswrapper[4902]: I1009 14:07:16.462587 4902 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8fe438bd-0b47-4495-93a8-590e4019a7c6" containerName="sg-core" containerID="cri-o://9e1f7058fd69a09f0c3076a99e1c0e6b9dea0b8cb7bdd80dc51a8ad8ffcb3259" gracePeriod=30 Oct 09 14:07:16 crc kubenswrapper[4902]: I1009 14:07:16.462588 4902 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8fe438bd-0b47-4495-93a8-590e4019a7c6" containerName="proxy-httpd" containerID="cri-o://a1d7843c226cfdd9c60d2d2e9655a2b097439cd5edf5071eb5a18a2e755b6636" gracePeriod=30 Oct 09 14:07:16 crc kubenswrapper[4902]: I1009 14:07:16.462657 4902 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8fe438bd-0b47-4495-93a8-590e4019a7c6" containerName="ceilometer-notification-agent" containerID="cri-o://38966405f5eca6e013afe0970100993a4d7351ac70d474393e4e4495484ced29" gracePeriod=30 Oct 09 14:07:16 crc kubenswrapper[4902]: I1009 14:07:16.463158 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 09 14:07:16 crc kubenswrapper[4902]: I1009 14:07:16.466608 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-65878bc9b7-hv97v" event={"ID":"b17f63fc-0163-416e-a3ee-179a1a071560","Type":"ContainerStarted","Data":"e0f806b604e577f0d2f854151af1742260cb3210fe78a290d91a096170b89a11"} Oct 09 14:07:16 crc kubenswrapper[4902]: I1009 14:07:16.471367 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-92hj5" event={"ID":"4015da27-1c19-4eb1-af33-74e182b53aa3","Type":"ContainerStarted","Data":"fed6f04acf067026499dd56f180834119dc95b524b9117a96995cbf1bba9ddb4"} Oct 09 14:07:16 crc kubenswrapper[4902]: I1009 14:07:16.476260 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-864577bbd8-z8v7t" event={"ID":"9e8ddce2-e9c0-4dc6-8bcd-d228188630dc","Type":"ContainerStarted","Data":"77139cc0bcf2f8133fc206c12a47aa5309fe9a7297aa5fc4ac2f3d2cca4db465"} Oct 09 14:07:16 crc kubenswrapper[4902]: I1009 14:07:16.488368 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"594a9127-c741-4bb1-871f-0295abab43ce","Type":"ContainerStarted","Data":"b586da6a54efd8770a35662cec0ab865056bd1fc9b03b7edb2f585577c8d4f30"} Oct 09 14:07:16 crc kubenswrapper[4902]: I1009 14:07:16.494077 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.911668469 podStartE2EDuration="1m2.494058724s" podCreationTimestamp="2025-10-09 14:06:14 +0000 UTC" firstStartedPulling="2025-10-09 14:06:16.048432849 +0000 UTC m=+923.246291913" lastFinishedPulling="2025-10-09 14:07:15.630823104 +0000 UTC m=+982.828682168" observedRunningTime="2025-10-09 14:07:16.485593132 +0000 UTC m=+983.683452216" watchObservedRunningTime="2025-10-09 14:07:16.494058724 +0000 UTC m=+983.691917788" Oct 09 14:07:16 crc kubenswrapper[4902]: I1009 14:07:16.503077 4902 generic.go:334] "Generic (PLEG): container finished" podID="f101e38c-9520-43d9-b911-8fc2fdc4459c" containerID="13f0365bf9d78b914e6a2543e3bddb74578658f11dafdb9513cdc2d283ccc4da" exitCode=0 Oct 09 14:07:16 crc kubenswrapper[4902]: I1009 14:07:16.503194 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Oct 09 14:07:16 crc kubenswrapper[4902]: I1009 14:07:16.503187 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84b966f6c9-d6xgh" event={"ID":"f101e38c-9520-43d9-b911-8fc2fdc4459c","Type":"ContainerDied","Data":"13f0365bf9d78b914e6a2543e3bddb74578658f11dafdb9513cdc2d283ccc4da"} Oct 09 14:07:16 crc kubenswrapper[4902]: I1009 14:07:16.503246 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84b966f6c9-d6xgh" event={"ID":"f101e38c-9520-43d9-b911-8fc2fdc4459c","Type":"ContainerStarted","Data":"01ca2bb7a1e4b549674f1cfd52dc4250d12f4e37373a5dc39dfb7d571e710952"} Oct 09 14:07:16 crc kubenswrapper[4902]: I1009 14:07:16.518688 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-92hj5" podStartSLOduration=2.566576884 podStartE2EDuration="39.518661395s" podCreationTimestamp="2025-10-09 14:06:37 +0000 UTC" firstStartedPulling="2025-10-09 14:06:38.460113465 +0000 UTC m=+945.657972529" lastFinishedPulling="2025-10-09 14:07:15.412197976 +0000 UTC m=+982.610057040" observedRunningTime="2025-10-09 14:07:16.50165949 +0000 UTC m=+983.699518574" watchObservedRunningTime="2025-10-09 14:07:16.518661395 +0000 UTC m=+983.716520469" Oct 09 14:07:16 crc kubenswrapper[4902]: I1009 14:07:16.546095 4902 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="5effc1bd-82a1-4515-82f5-be488340fa94" podUID="594a9127-c741-4bb1-871f-0295abab43ce" Oct 09 14:07:16 crc kubenswrapper[4902]: I1009 14:07:16.926765 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5cf9b8867d-9zfjl"] Oct 09 14:07:16 crc kubenswrapper[4902]: W1009 14:07:16.941980 4902 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf5eb5ddd_7b3d_4392_9555_44eaf6e54c51.slice/crio-2600a6398b27b5ba83ef9bff7c37f3dbd91d24aebd484f9148ebb3d217adecfb WatchSource:0}: Error finding container 2600a6398b27b5ba83ef9bff7c37f3dbd91d24aebd484f9148ebb3d217adecfb: Status 404 returned error can't find the container with id 2600a6398b27b5ba83ef9bff7c37f3dbd91d24aebd484f9148ebb3d217adecfb Oct 09 14:07:17 crc kubenswrapper[4902]: I1009 14:07:17.526198 4902 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5effc1bd-82a1-4515-82f5-be488340fa94" path="/var/lib/kubelet/pods/5effc1bd-82a1-4515-82f5-be488340fa94/volumes" Oct 09 14:07:17 crc kubenswrapper[4902]: I1009 14:07:17.526714 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-4fhrn" event={"ID":"e0ea72ba-e66d-4400-a1d9-8d4cb8a94e87","Type":"ContainerStarted","Data":"589e134088889030f30963da599dce0475033fd7998c03b92d4dd6135c809d8a"} Oct 09 14:07:17 crc kubenswrapper[4902]: I1009 14:07:17.530928 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84b966f6c9-d6xgh" event={"ID":"f101e38c-9520-43d9-b911-8fc2fdc4459c","Type":"ContainerStarted","Data":"8a9b53dbfc92bf8b04978fedbbd6146c3cf26aca74cf46f5b8ff35200d034125"} Oct 09 14:07:17 crc kubenswrapper[4902]: I1009 14:07:17.531267 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-84b966f6c9-d6xgh" Oct 09 14:07:17 crc kubenswrapper[4902]: I1009 14:07:17.538275 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5cf9b8867d-9zfjl" 
event={"ID":"f5eb5ddd-7b3d-4392-9555-44eaf6e54c51","Type":"ContainerStarted","Data":"9e0e6e5110d94fa5b8e19549d272d8ecadaf0019be7ba855fc88fdc188455cd3"} Oct 09 14:07:17 crc kubenswrapper[4902]: I1009 14:07:17.538330 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5cf9b8867d-9zfjl" event={"ID":"f5eb5ddd-7b3d-4392-9555-44eaf6e54c51","Type":"ContainerStarted","Data":"764530cdc309d34f5e4d9f50471c07cfdf1e04fc5e535bcdd92eedabca38c008"} Oct 09 14:07:17 crc kubenswrapper[4902]: I1009 14:07:17.538344 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5cf9b8867d-9zfjl" event={"ID":"f5eb5ddd-7b3d-4392-9555-44eaf6e54c51","Type":"ContainerStarted","Data":"2600a6398b27b5ba83ef9bff7c37f3dbd91d24aebd484f9148ebb3d217adecfb"} Oct 09 14:07:17 crc kubenswrapper[4902]: I1009 14:07:17.540355 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-5cf9b8867d-9zfjl" Oct 09 14:07:17 crc kubenswrapper[4902]: I1009 14:07:17.548930 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-4fhrn" podStartSLOduration=2.630929083 podStartE2EDuration="40.548908699s" podCreationTimestamp="2025-10-09 14:06:37 +0000 UTC" firstStartedPulling="2025-10-09 14:06:38.154160866 +0000 UTC m=+945.352019930" lastFinishedPulling="2025-10-09 14:07:16.072140482 +0000 UTC m=+983.269999546" observedRunningTime="2025-10-09 14:07:17.543987033 +0000 UTC m=+984.741846117" watchObservedRunningTime="2025-10-09 14:07:17.548908699 +0000 UTC m=+984.746767763" Oct 09 14:07:17 crc kubenswrapper[4902]: I1009 14:07:17.553679 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-65878bc9b7-hv97v" event={"ID":"b17f63fc-0163-416e-a3ee-179a1a071560","Type":"ContainerStarted","Data":"3d5953d158259274bdc39dbdb08b5d08b1efbb586fcfc0062404e5692bbeaf32"} Oct 09 14:07:17 crc kubenswrapper[4902]: I1009 14:07:17.553925 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-65878bc9b7-hv97v" event={"ID":"b17f63fc-0163-416e-a3ee-179a1a071560","Type":"ContainerStarted","Data":"da0c7e53282b7aaf5f7839dc612e8957574d143ceb50e4399b888768d246f4d7"} Oct 09 14:07:17 crc kubenswrapper[4902]: I1009 14:07:17.555207 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-65878bc9b7-hv97v" Oct 09 14:07:17 crc kubenswrapper[4902]: I1009 14:07:17.557453 4902 generic.go:334] "Generic (PLEG): container finished" podID="8fe438bd-0b47-4495-93a8-590e4019a7c6" containerID="a1d7843c226cfdd9c60d2d2e9655a2b097439cd5edf5071eb5a18a2e755b6636" exitCode=0 Oct 09 14:07:17 crc kubenswrapper[4902]: I1009 14:07:17.557545 4902 generic.go:334] "Generic (PLEG): container finished" podID="8fe438bd-0b47-4495-93a8-590e4019a7c6" containerID="9e1f7058fd69a09f0c3076a99e1c0e6b9dea0b8cb7bdd80dc51a8ad8ffcb3259" exitCode=2 Oct 09 14:07:17 crc kubenswrapper[4902]: I1009 14:07:17.557599 4902 generic.go:334] "Generic (PLEG): container finished" podID="8fe438bd-0b47-4495-93a8-590e4019a7c6" containerID="119944c304e40ad4b30208df944e3587aea6322eb5c80031e9db64894b4ec2b9" exitCode=0 Oct 09 14:07:17 crc kubenswrapper[4902]: I1009 14:07:17.557679 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8fe438bd-0b47-4495-93a8-590e4019a7c6","Type":"ContainerDied","Data":"a1d7843c226cfdd9c60d2d2e9655a2b097439cd5edf5071eb5a18a2e755b6636"} Oct 09 14:07:17 crc kubenswrapper[4902]: I1009 14:07:17.557745 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"8fe438bd-0b47-4495-93a8-590e4019a7c6","Type":"ContainerDied","Data":"9e1f7058fd69a09f0c3076a99e1c0e6b9dea0b8cb7bdd80dc51a8ad8ffcb3259"} Oct 09 14:07:17 crc kubenswrapper[4902]: I1009 14:07:17.557817 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8fe438bd-0b47-4495-93a8-590e4019a7c6","Type":"ContainerDied","Data":"119944c304e40ad4b30208df944e3587aea6322eb5c80031e9db64894b4ec2b9"} Oct 09 14:07:17 crc kubenswrapper[4902]: I1009 14:07:17.571019 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-864577bbd8-z8v7t" event={"ID":"9e8ddce2-e9c0-4dc6-8bcd-d228188630dc","Type":"ContainerStarted","Data":"9d42a08ebd37c959de25f11deb4971553bd43fe15dd0d0a20c6d089eb8f97b17"} Oct 09 14:07:17 crc kubenswrapper[4902]: I1009 14:07:17.571071 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-864577bbd8-z8v7t" event={"ID":"9e8ddce2-e9c0-4dc6-8bcd-d228188630dc","Type":"ContainerStarted","Data":"834edf35fc6218f6e33d3dc6f820af20602fc036c1e66da8ba797e346de7fb1a"} Oct 09 14:07:17 crc kubenswrapper[4902]: I1009 14:07:17.571993 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-864577bbd8-z8v7t" Oct 09 14:07:17 crc kubenswrapper[4902]: I1009 14:07:17.572020 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-864577bbd8-z8v7t" Oct 09 14:07:17 crc kubenswrapper[4902]: I1009 14:07:17.578381 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-5cf9b8867d-9zfjl" podStartSLOduration=10.578363534 podStartE2EDuration="10.578363534s" podCreationTimestamp="2025-10-09 14:07:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 14:07:17.574550641 +0000 UTC m=+984.772409705" watchObservedRunningTime="2025-10-09 14:07:17.578363534 +0000 UTC m=+984.776222598" Oct 09 14:07:17 crc kubenswrapper[4902]: I1009 14:07:17.604283 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-84b966f6c9-d6xgh" podStartSLOduration=10.604248074000001 podStartE2EDuration="10.604248074s" podCreationTimestamp="2025-10-09 14:07:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 14:07:17.60177785 +0000 UTC m=+984.799636924" watchObservedRunningTime="2025-10-09 14:07:17.604248074 +0000 UTC m=+984.802107138" Oct 09 14:07:17 crc kubenswrapper[4902]: I1009 14:07:17.629226 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-864577bbd8-z8v7t" podStartSLOduration=10.629205476 podStartE2EDuration="10.629205476s" podCreationTimestamp="2025-10-09 14:07:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 14:07:17.623357562 +0000 UTC m=+984.821216636" watchObservedRunningTime="2025-10-09 14:07:17.629205476 +0000 UTC m=+984.827064550" Oct 09 14:07:17 crc kubenswrapper[4902]: I1009 14:07:17.653274 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-65878bc9b7-hv97v" podStartSLOduration=8.65325215 podStartE2EDuration="8.65325215s" podCreationTimestamp="2025-10-09 14:07:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 
UTC" observedRunningTime="2025-10-09 14:07:17.64683529 +0000 UTC m=+984.844694364" watchObservedRunningTime="2025-10-09 14:07:17.65325215 +0000 UTC m=+984.851111214" Oct 09 14:07:18 crc kubenswrapper[4902]: I1009 14:07:18.480912 4902 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-5b9b5fc8fb-nlcpz" podUID="ba9ae197-8325-4a07-a174-31f7f2e29978" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.146:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.146:8443: connect: connection refused" Oct 09 14:07:18 crc kubenswrapper[4902]: I1009 14:07:18.609966 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-5cf9b8867d-9zfjl_f5eb5ddd-7b3d-4392-9555-44eaf6e54c51/neutron-httpd/0.log" Oct 09 14:07:18 crc kubenswrapper[4902]: I1009 14:07:18.611730 4902 generic.go:334] "Generic (PLEG): container finished" podID="f5eb5ddd-7b3d-4392-9555-44eaf6e54c51" containerID="9e0e6e5110d94fa5b8e19549d272d8ecadaf0019be7ba855fc88fdc188455cd3" exitCode=1 Oct 09 14:07:18 crc kubenswrapper[4902]: I1009 14:07:18.611880 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5cf9b8867d-9zfjl" event={"ID":"f5eb5ddd-7b3d-4392-9555-44eaf6e54c51","Type":"ContainerDied","Data":"9e0e6e5110d94fa5b8e19549d272d8ecadaf0019be7ba855fc88fdc188455cd3"} Oct 09 14:07:18 crc kubenswrapper[4902]: I1009 14:07:18.612511 4902 scope.go:117] "RemoveContainer" containerID="9e0e6e5110d94fa5b8e19549d272d8ecadaf0019be7ba855fc88fdc188455cd3" Oct 09 14:07:19 crc kubenswrapper[4902]: I1009 14:07:19.629063 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-5cf9b8867d-9zfjl_f5eb5ddd-7b3d-4392-9555-44eaf6e54c51/neutron-httpd/1.log" Oct 09 14:07:19 crc kubenswrapper[4902]: I1009 14:07:19.630647 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-5cf9b8867d-9zfjl_f5eb5ddd-7b3d-4392-9555-44eaf6e54c51/neutron-httpd/0.log" Oct 09 14:07:19 crc kubenswrapper[4902]: I1009 14:07:19.631358 4902 generic.go:334] "Generic (PLEG): container finished" podID="f5eb5ddd-7b3d-4392-9555-44eaf6e54c51" containerID="15d729eeb5c68a661f7d7d581545552e80334c34cb99e5998b14a191654c5ad7" exitCode=1 Oct 09 14:07:19 crc kubenswrapper[4902]: I1009 14:07:19.631429 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5cf9b8867d-9zfjl" event={"ID":"f5eb5ddd-7b3d-4392-9555-44eaf6e54c51","Type":"ContainerDied","Data":"15d729eeb5c68a661f7d7d581545552e80334c34cb99e5998b14a191654c5ad7"} Oct 09 14:07:19 crc kubenswrapper[4902]: I1009 14:07:19.631494 4902 scope.go:117] "RemoveContainer" containerID="9e0e6e5110d94fa5b8e19549d272d8ecadaf0019be7ba855fc88fdc188455cd3" Oct 09 14:07:19 crc kubenswrapper[4902]: I1009 14:07:19.632220 4902 scope.go:117] "RemoveContainer" containerID="15d729eeb5c68a661f7d7d581545552e80334c34cb99e5998b14a191654c5ad7" Oct 09 14:07:19 crc kubenswrapper[4902]: E1009 14:07:19.632446 4902 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"neutron-httpd\" with CrashLoopBackOff: \"back-off 10s restarting failed container=neutron-httpd pod=neutron-5cf9b8867d-9zfjl_openstack(f5eb5ddd-7b3d-4392-9555-44eaf6e54c51)\"" pod="openstack/neutron-5cf9b8867d-9zfjl" podUID="f5eb5ddd-7b3d-4392-9555-44eaf6e54c51" Oct 09 14:07:20 crc kubenswrapper[4902]: I1009 14:07:20.641745 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-5cf9b8867d-9zfjl_f5eb5ddd-7b3d-4392-9555-44eaf6e54c51/neutron-httpd/1.log" Oct 09 14:07:20 crc 
kubenswrapper[4902]: I1009 14:07:20.642965 4902 scope.go:117] "RemoveContainer" containerID="15d729eeb5c68a661f7d7d581545552e80334c34cb99e5998b14a191654c5ad7" Oct 09 14:07:20 crc kubenswrapper[4902]: E1009 14:07:20.643341 4902 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"neutron-httpd\" with CrashLoopBackOff: \"back-off 10s restarting failed container=neutron-httpd pod=neutron-5cf9b8867d-9zfjl_openstack(f5eb5ddd-7b3d-4392-9555-44eaf6e54c51)\"" pod="openstack/neutron-5cf9b8867d-9zfjl" podUID="f5eb5ddd-7b3d-4392-9555-44eaf6e54c51" Oct 09 14:07:21 crc kubenswrapper[4902]: I1009 14:07:21.660930 4902 generic.go:334] "Generic (PLEG): container finished" podID="8fe438bd-0b47-4495-93a8-590e4019a7c6" containerID="38966405f5eca6e013afe0970100993a4d7351ac70d474393e4e4495484ced29" exitCode=0 Oct 09 14:07:21 crc kubenswrapper[4902]: I1009 14:07:21.660988 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8fe438bd-0b47-4495-93a8-590e4019a7c6","Type":"ContainerDied","Data":"38966405f5eca6e013afe0970100993a4d7351ac70d474393e4e4495484ced29"} Oct 09 14:07:22 crc kubenswrapper[4902]: I1009 14:07:22.395647 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-6568f7cff-cv7qx"] Oct 09 14:07:22 crc kubenswrapper[4902]: I1009 14:07:22.397943 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-6568f7cff-cv7qx" Oct 09 14:07:22 crc kubenswrapper[4902]: I1009 14:07:22.399611 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Oct 09 14:07:22 crc kubenswrapper[4902]: I1009 14:07:22.401326 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Oct 09 14:07:22 crc kubenswrapper[4902]: I1009 14:07:22.401696 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Oct 09 14:07:22 crc kubenswrapper[4902]: I1009 14:07:22.406284 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-6568f7cff-cv7qx"] Oct 09 14:07:22 crc kubenswrapper[4902]: I1009 14:07:22.456804 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/797c027a-6081-4aa8-9643-ddffc4393193-public-tls-certs\") pod \"swift-proxy-6568f7cff-cv7qx\" (UID: \"797c027a-6081-4aa8-9643-ddffc4393193\") " pod="openstack/swift-proxy-6568f7cff-cv7qx" Oct 09 14:07:22 crc kubenswrapper[4902]: I1009 14:07:22.456871 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/797c027a-6081-4aa8-9643-ddffc4393193-config-data\") pod \"swift-proxy-6568f7cff-cv7qx\" (UID: \"797c027a-6081-4aa8-9643-ddffc4393193\") " pod="openstack/swift-proxy-6568f7cff-cv7qx" Oct 09 14:07:22 crc kubenswrapper[4902]: I1009 14:07:22.456942 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/797c027a-6081-4aa8-9643-ddffc4393193-log-httpd\") pod \"swift-proxy-6568f7cff-cv7qx\" (UID: \"797c027a-6081-4aa8-9643-ddffc4393193\") " pod="openstack/swift-proxy-6568f7cff-cv7qx" Oct 09 14:07:22 crc kubenswrapper[4902]: I1009 14:07:22.457003 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-btskb\" (UniqueName: 
\"kubernetes.io/projected/797c027a-6081-4aa8-9643-ddffc4393193-kube-api-access-btskb\") pod \"swift-proxy-6568f7cff-cv7qx\" (UID: \"797c027a-6081-4aa8-9643-ddffc4393193\") " pod="openstack/swift-proxy-6568f7cff-cv7qx" Oct 09 14:07:22 crc kubenswrapper[4902]: I1009 14:07:22.457040 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/797c027a-6081-4aa8-9643-ddffc4393193-etc-swift\") pod \"swift-proxy-6568f7cff-cv7qx\" (UID: \"797c027a-6081-4aa8-9643-ddffc4393193\") " pod="openstack/swift-proxy-6568f7cff-cv7qx" Oct 09 14:07:22 crc kubenswrapper[4902]: I1009 14:07:22.457064 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/797c027a-6081-4aa8-9643-ddffc4393193-internal-tls-certs\") pod \"swift-proxy-6568f7cff-cv7qx\" (UID: \"797c027a-6081-4aa8-9643-ddffc4393193\") " pod="openstack/swift-proxy-6568f7cff-cv7qx" Oct 09 14:07:22 crc kubenswrapper[4902]: I1009 14:07:22.457100 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/797c027a-6081-4aa8-9643-ddffc4393193-run-httpd\") pod \"swift-proxy-6568f7cff-cv7qx\" (UID: \"797c027a-6081-4aa8-9643-ddffc4393193\") " pod="openstack/swift-proxy-6568f7cff-cv7qx" Oct 09 14:07:22 crc kubenswrapper[4902]: I1009 14:07:22.457148 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/797c027a-6081-4aa8-9643-ddffc4393193-combined-ca-bundle\") pod \"swift-proxy-6568f7cff-cv7qx\" (UID: \"797c027a-6081-4aa8-9643-ddffc4393193\") " pod="openstack/swift-proxy-6568f7cff-cv7qx" Oct 09 14:07:22 crc kubenswrapper[4902]: I1009 14:07:22.558536 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/797c027a-6081-4aa8-9643-ddffc4393193-combined-ca-bundle\") pod \"swift-proxy-6568f7cff-cv7qx\" (UID: \"797c027a-6081-4aa8-9643-ddffc4393193\") " pod="openstack/swift-proxy-6568f7cff-cv7qx" Oct 09 14:07:22 crc kubenswrapper[4902]: I1009 14:07:22.558675 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/797c027a-6081-4aa8-9643-ddffc4393193-public-tls-certs\") pod \"swift-proxy-6568f7cff-cv7qx\" (UID: \"797c027a-6081-4aa8-9643-ddffc4393193\") " pod="openstack/swift-proxy-6568f7cff-cv7qx" Oct 09 14:07:22 crc kubenswrapper[4902]: I1009 14:07:22.558712 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/797c027a-6081-4aa8-9643-ddffc4393193-config-data\") pod \"swift-proxy-6568f7cff-cv7qx\" (UID: \"797c027a-6081-4aa8-9643-ddffc4393193\") " pod="openstack/swift-proxy-6568f7cff-cv7qx" Oct 09 14:07:22 crc kubenswrapper[4902]: I1009 14:07:22.558750 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/797c027a-6081-4aa8-9643-ddffc4393193-log-httpd\") pod \"swift-proxy-6568f7cff-cv7qx\" (UID: \"797c027a-6081-4aa8-9643-ddffc4393193\") " pod="openstack/swift-proxy-6568f7cff-cv7qx" Oct 09 14:07:22 crc kubenswrapper[4902]: I1009 14:07:22.558811 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-btskb\" (UniqueName: 
\"kubernetes.io/projected/797c027a-6081-4aa8-9643-ddffc4393193-kube-api-access-btskb\") pod \"swift-proxy-6568f7cff-cv7qx\" (UID: \"797c027a-6081-4aa8-9643-ddffc4393193\") " pod="openstack/swift-proxy-6568f7cff-cv7qx" Oct 09 14:07:22 crc kubenswrapper[4902]: I1009 14:07:22.558849 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/797c027a-6081-4aa8-9643-ddffc4393193-etc-swift\") pod \"swift-proxy-6568f7cff-cv7qx\" (UID: \"797c027a-6081-4aa8-9643-ddffc4393193\") " pod="openstack/swift-proxy-6568f7cff-cv7qx" Oct 09 14:07:22 crc kubenswrapper[4902]: I1009 14:07:22.558883 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/797c027a-6081-4aa8-9643-ddffc4393193-internal-tls-certs\") pod \"swift-proxy-6568f7cff-cv7qx\" (UID: \"797c027a-6081-4aa8-9643-ddffc4393193\") " pod="openstack/swift-proxy-6568f7cff-cv7qx" Oct 09 14:07:22 crc kubenswrapper[4902]: I1009 14:07:22.558931 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/797c027a-6081-4aa8-9643-ddffc4393193-run-httpd\") pod \"swift-proxy-6568f7cff-cv7qx\" (UID: \"797c027a-6081-4aa8-9643-ddffc4393193\") " pod="openstack/swift-proxy-6568f7cff-cv7qx" Oct 09 14:07:22 crc kubenswrapper[4902]: I1009 14:07:22.560051 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/797c027a-6081-4aa8-9643-ddffc4393193-run-httpd\") pod \"swift-proxy-6568f7cff-cv7qx\" (UID: \"797c027a-6081-4aa8-9643-ddffc4393193\") " pod="openstack/swift-proxy-6568f7cff-cv7qx" Oct 09 14:07:22 crc kubenswrapper[4902]: I1009 14:07:22.560261 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/797c027a-6081-4aa8-9643-ddffc4393193-log-httpd\") pod \"swift-proxy-6568f7cff-cv7qx\" (UID: \"797c027a-6081-4aa8-9643-ddffc4393193\") " pod="openstack/swift-proxy-6568f7cff-cv7qx" Oct 09 14:07:22 crc kubenswrapper[4902]: I1009 14:07:22.565546 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/797c027a-6081-4aa8-9643-ddffc4393193-combined-ca-bundle\") pod \"swift-proxy-6568f7cff-cv7qx\" (UID: \"797c027a-6081-4aa8-9643-ddffc4393193\") " pod="openstack/swift-proxy-6568f7cff-cv7qx" Oct 09 14:07:22 crc kubenswrapper[4902]: I1009 14:07:22.566095 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/797c027a-6081-4aa8-9643-ddffc4393193-internal-tls-certs\") pod \"swift-proxy-6568f7cff-cv7qx\" (UID: \"797c027a-6081-4aa8-9643-ddffc4393193\") " pod="openstack/swift-proxy-6568f7cff-cv7qx" Oct 09 14:07:22 crc kubenswrapper[4902]: I1009 14:07:22.566948 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/797c027a-6081-4aa8-9643-ddffc4393193-public-tls-certs\") pod \"swift-proxy-6568f7cff-cv7qx\" (UID: \"797c027a-6081-4aa8-9643-ddffc4393193\") " pod="openstack/swift-proxy-6568f7cff-cv7qx" Oct 09 14:07:22 crc kubenswrapper[4902]: I1009 14:07:22.575218 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/797c027a-6081-4aa8-9643-ddffc4393193-etc-swift\") pod \"swift-proxy-6568f7cff-cv7qx\" (UID: 
\"797c027a-6081-4aa8-9643-ddffc4393193\") " pod="openstack/swift-proxy-6568f7cff-cv7qx" Oct 09 14:07:22 crc kubenswrapper[4902]: I1009 14:07:22.576197 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/797c027a-6081-4aa8-9643-ddffc4393193-config-data\") pod \"swift-proxy-6568f7cff-cv7qx\" (UID: \"797c027a-6081-4aa8-9643-ddffc4393193\") " pod="openstack/swift-proxy-6568f7cff-cv7qx" Oct 09 14:07:22 crc kubenswrapper[4902]: I1009 14:07:22.579997 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-btskb\" (UniqueName: \"kubernetes.io/projected/797c027a-6081-4aa8-9643-ddffc4393193-kube-api-access-btskb\") pod \"swift-proxy-6568f7cff-cv7qx\" (UID: \"797c027a-6081-4aa8-9643-ddffc4393193\") " pod="openstack/swift-proxy-6568f7cff-cv7qx" Oct 09 14:07:22 crc kubenswrapper[4902]: I1009 14:07:22.720683 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-6568f7cff-cv7qx" Oct 09 14:07:22 crc kubenswrapper[4902]: I1009 14:07:22.935549 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-84b966f6c9-d6xgh" Oct 09 14:07:23 crc kubenswrapper[4902]: I1009 14:07:23.014554 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8b5c85b87-pfr9v"] Oct 09 14:07:23 crc kubenswrapper[4902]: I1009 14:07:23.014828 4902 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-8b5c85b87-pfr9v" podUID="1f63d2cd-4740-45c6-a94b-26899b7ffa86" containerName="dnsmasq-dns" containerID="cri-o://22f4e750bfe871e2f676c151db24c9040a256ac1c93d8b9e68e0c7352637ee5c" gracePeriod=10 Oct 09 14:07:23 crc kubenswrapper[4902]: I1009 14:07:23.684950 4902 generic.go:334] "Generic (PLEG): container finished" podID="1f63d2cd-4740-45c6-a94b-26899b7ffa86" containerID="22f4e750bfe871e2f676c151db24c9040a256ac1c93d8b9e68e0c7352637ee5c" exitCode=0 Oct 09 14:07:23 crc kubenswrapper[4902]: I1009 14:07:23.685034 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8b5c85b87-pfr9v" event={"ID":"1f63d2cd-4740-45c6-a94b-26899b7ffa86","Type":"ContainerDied","Data":"22f4e750bfe871e2f676c151db24c9040a256ac1c93d8b9e68e0c7352637ee5c"} Oct 09 14:07:25 crc kubenswrapper[4902]: I1009 14:07:25.377691 4902 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 09 14:07:25 crc kubenswrapper[4902]: I1009 14:07:25.443850 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ssx75\" (UniqueName: \"kubernetes.io/projected/8fe438bd-0b47-4495-93a8-590e4019a7c6-kube-api-access-ssx75\") pod \"8fe438bd-0b47-4495-93a8-590e4019a7c6\" (UID: \"8fe438bd-0b47-4495-93a8-590e4019a7c6\") " Oct 09 14:07:25 crc kubenswrapper[4902]: I1009 14:07:25.444011 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8fe438bd-0b47-4495-93a8-590e4019a7c6-log-httpd\") pod \"8fe438bd-0b47-4495-93a8-590e4019a7c6\" (UID: \"8fe438bd-0b47-4495-93a8-590e4019a7c6\") " Oct 09 14:07:25 crc kubenswrapper[4902]: I1009 14:07:25.444086 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8fe438bd-0b47-4495-93a8-590e4019a7c6-scripts\") pod \"8fe438bd-0b47-4495-93a8-590e4019a7c6\" (UID: \"8fe438bd-0b47-4495-93a8-590e4019a7c6\") " Oct 09 14:07:25 crc kubenswrapper[4902]: I1009 14:07:25.444114 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8fe438bd-0b47-4495-93a8-590e4019a7c6-sg-core-conf-yaml\") pod \"8fe438bd-0b47-4495-93a8-590e4019a7c6\" (UID: \"8fe438bd-0b47-4495-93a8-590e4019a7c6\") " Oct 09 14:07:25 crc kubenswrapper[4902]: I1009 14:07:25.444132 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8fe438bd-0b47-4495-93a8-590e4019a7c6-config-data\") pod \"8fe438bd-0b47-4495-93a8-590e4019a7c6\" (UID: \"8fe438bd-0b47-4495-93a8-590e4019a7c6\") " Oct 09 14:07:25 crc kubenswrapper[4902]: I1009 14:07:25.444180 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8fe438bd-0b47-4495-93a8-590e4019a7c6-run-httpd\") pod \"8fe438bd-0b47-4495-93a8-590e4019a7c6\" (UID: \"8fe438bd-0b47-4495-93a8-590e4019a7c6\") " Oct 09 14:07:25 crc kubenswrapper[4902]: I1009 14:07:25.444207 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fe438bd-0b47-4495-93a8-590e4019a7c6-combined-ca-bundle\") pod \"8fe438bd-0b47-4495-93a8-590e4019a7c6\" (UID: \"8fe438bd-0b47-4495-93a8-590e4019a7c6\") " Oct 09 14:07:25 crc kubenswrapper[4902]: I1009 14:07:25.447961 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8fe438bd-0b47-4495-93a8-590e4019a7c6-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "8fe438bd-0b47-4495-93a8-590e4019a7c6" (UID: "8fe438bd-0b47-4495-93a8-590e4019a7c6"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 14:07:25 crc kubenswrapper[4902]: I1009 14:07:25.448705 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8fe438bd-0b47-4495-93a8-590e4019a7c6-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "8fe438bd-0b47-4495-93a8-590e4019a7c6" (UID: "8fe438bd-0b47-4495-93a8-590e4019a7c6"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 14:07:25 crc kubenswrapper[4902]: I1009 14:07:25.452661 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8fe438bd-0b47-4495-93a8-590e4019a7c6-kube-api-access-ssx75" (OuterVolumeSpecName: "kube-api-access-ssx75") pod "8fe438bd-0b47-4495-93a8-590e4019a7c6" (UID: "8fe438bd-0b47-4495-93a8-590e4019a7c6"). InnerVolumeSpecName "kube-api-access-ssx75". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 14:07:25 crc kubenswrapper[4902]: I1009 14:07:25.454828 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8fe438bd-0b47-4495-93a8-590e4019a7c6-scripts" (OuterVolumeSpecName: "scripts") pod "8fe438bd-0b47-4495-93a8-590e4019a7c6" (UID: "8fe438bd-0b47-4495-93a8-590e4019a7c6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:07:25 crc kubenswrapper[4902]: I1009 14:07:25.468803 4902 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8b5c85b87-pfr9v" Oct 09 14:07:25 crc kubenswrapper[4902]: I1009 14:07:25.503308 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8fe438bd-0b47-4495-93a8-590e4019a7c6-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "8fe438bd-0b47-4495-93a8-590e4019a7c6" (UID: "8fe438bd-0b47-4495-93a8-590e4019a7c6"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:07:25 crc kubenswrapper[4902]: I1009 14:07:25.547026 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f63d2cd-4740-45c6-a94b-26899b7ffa86-config\") pod \"1f63d2cd-4740-45c6-a94b-26899b7ffa86\" (UID: \"1f63d2cd-4740-45c6-a94b-26899b7ffa86\") " Oct 09 14:07:25 crc kubenswrapper[4902]: I1009 14:07:25.547194 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1f63d2cd-4740-45c6-a94b-26899b7ffa86-dns-svc\") pod \"1f63d2cd-4740-45c6-a94b-26899b7ffa86\" (UID: \"1f63d2cd-4740-45c6-a94b-26899b7ffa86\") " Oct 09 14:07:25 crc kubenswrapper[4902]: I1009 14:07:25.547306 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1f63d2cd-4740-45c6-a94b-26899b7ffa86-ovsdbserver-sb\") pod \"1f63d2cd-4740-45c6-a94b-26899b7ffa86\" (UID: \"1f63d2cd-4740-45c6-a94b-26899b7ffa86\") " Oct 09 14:07:25 crc kubenswrapper[4902]: I1009 14:07:25.547352 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-64qp6\" (UniqueName: \"kubernetes.io/projected/1f63d2cd-4740-45c6-a94b-26899b7ffa86-kube-api-access-64qp6\") pod \"1f63d2cd-4740-45c6-a94b-26899b7ffa86\" (UID: \"1f63d2cd-4740-45c6-a94b-26899b7ffa86\") " Oct 09 14:07:25 crc kubenswrapper[4902]: I1009 14:07:25.547440 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1f63d2cd-4740-45c6-a94b-26899b7ffa86-dns-swift-storage-0\") pod \"1f63d2cd-4740-45c6-a94b-26899b7ffa86\" (UID: \"1f63d2cd-4740-45c6-a94b-26899b7ffa86\") " Oct 09 14:07:25 crc kubenswrapper[4902]: I1009 14:07:25.547477 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/1f63d2cd-4740-45c6-a94b-26899b7ffa86-ovsdbserver-nb\") pod \"1f63d2cd-4740-45c6-a94b-26899b7ffa86\" (UID: \"1f63d2cd-4740-45c6-a94b-26899b7ffa86\") " Oct 09 14:07:25 crc kubenswrapper[4902]: I1009 14:07:25.548004 4902 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8fe438bd-0b47-4495-93a8-590e4019a7c6-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 09 14:07:25 crc kubenswrapper[4902]: I1009 14:07:25.548022 4902 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8fe438bd-0b47-4495-93a8-590e4019a7c6-scripts\") on node \"crc\" DevicePath \"\"" Oct 09 14:07:25 crc kubenswrapper[4902]: I1009 14:07:25.548035 4902 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8fe438bd-0b47-4495-93a8-590e4019a7c6-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 09 14:07:25 crc kubenswrapper[4902]: I1009 14:07:25.548046 4902 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8fe438bd-0b47-4495-93a8-590e4019a7c6-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 09 14:07:25 crc kubenswrapper[4902]: I1009 14:07:25.548056 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ssx75\" (UniqueName: \"kubernetes.io/projected/8fe438bd-0b47-4495-93a8-590e4019a7c6-kube-api-access-ssx75\") on node \"crc\" DevicePath \"\"" Oct 09 14:07:25 crc kubenswrapper[4902]: I1009 14:07:25.553916 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f63d2cd-4740-45c6-a94b-26899b7ffa86-kube-api-access-64qp6" (OuterVolumeSpecName: "kube-api-access-64qp6") pod "1f63d2cd-4740-45c6-a94b-26899b7ffa86" (UID: "1f63d2cd-4740-45c6-a94b-26899b7ffa86"). InnerVolumeSpecName "kube-api-access-64qp6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 14:07:25 crc kubenswrapper[4902]: I1009 14:07:25.580499 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8fe438bd-0b47-4495-93a8-590e4019a7c6-config-data" (OuterVolumeSpecName: "config-data") pod "8fe438bd-0b47-4495-93a8-590e4019a7c6" (UID: "8fe438bd-0b47-4495-93a8-590e4019a7c6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:07:25 crc kubenswrapper[4902]: I1009 14:07:25.639330 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1f63d2cd-4740-45c6-a94b-26899b7ffa86-config" (OuterVolumeSpecName: "config") pod "1f63d2cd-4740-45c6-a94b-26899b7ffa86" (UID: "1f63d2cd-4740-45c6-a94b-26899b7ffa86"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 14:07:25 crc kubenswrapper[4902]: I1009 14:07:25.644308 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1f63d2cd-4740-45c6-a94b-26899b7ffa86-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "1f63d2cd-4740-45c6-a94b-26899b7ffa86" (UID: "1f63d2cd-4740-45c6-a94b-26899b7ffa86"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 14:07:25 crc kubenswrapper[4902]: I1009 14:07:25.645929 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8fe438bd-0b47-4495-93a8-590e4019a7c6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8fe438bd-0b47-4495-93a8-590e4019a7c6" (UID: "8fe438bd-0b47-4495-93a8-590e4019a7c6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:07:25 crc kubenswrapper[4902]: I1009 14:07:25.646890 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1f63d2cd-4740-45c6-a94b-26899b7ffa86-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "1f63d2cd-4740-45c6-a94b-26899b7ffa86" (UID: "1f63d2cd-4740-45c6-a94b-26899b7ffa86"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 14:07:25 crc kubenswrapper[4902]: I1009 14:07:25.649955 4902 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8fe438bd-0b47-4495-93a8-590e4019a7c6-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 14:07:25 crc kubenswrapper[4902]: I1009 14:07:25.649998 4902 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f63d2cd-4740-45c6-a94b-26899b7ffa86-config\") on node \"crc\" DevicePath \"\"" Oct 09 14:07:25 crc kubenswrapper[4902]: I1009 14:07:25.650013 4902 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fe438bd-0b47-4495-93a8-590e4019a7c6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 14:07:25 crc kubenswrapper[4902]: I1009 14:07:25.650027 4902 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1f63d2cd-4740-45c6-a94b-26899b7ffa86-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 09 14:07:25 crc kubenswrapper[4902]: I1009 14:07:25.650044 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-64qp6\" (UniqueName: \"kubernetes.io/projected/1f63d2cd-4740-45c6-a94b-26899b7ffa86-kube-api-access-64qp6\") on node \"crc\" DevicePath \"\"" Oct 09 14:07:25 crc kubenswrapper[4902]: I1009 14:07:25.650055 4902 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1f63d2cd-4740-45c6-a94b-26899b7ffa86-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 09 14:07:25 crc kubenswrapper[4902]: I1009 14:07:25.655690 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1f63d2cd-4740-45c6-a94b-26899b7ffa86-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "1f63d2cd-4740-45c6-a94b-26899b7ffa86" (UID: "1f63d2cd-4740-45c6-a94b-26899b7ffa86"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 14:07:25 crc kubenswrapper[4902]: I1009 14:07:25.666583 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1f63d2cd-4740-45c6-a94b-26899b7ffa86-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "1f63d2cd-4740-45c6-a94b-26899b7ffa86" (UID: "1f63d2cd-4740-45c6-a94b-26899b7ffa86"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 14:07:25 crc kubenswrapper[4902]: I1009 14:07:25.714345 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-6568f7cff-cv7qx"] Oct 09 14:07:25 crc kubenswrapper[4902]: I1009 14:07:25.716224 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"594a9127-c741-4bb1-871f-0295abab43ce","Type":"ContainerStarted","Data":"ce603115743118815ccaec082ab0e6b9275ae59c816dfcd721d89872ef8f0336"} Oct 09 14:07:25 crc kubenswrapper[4902]: I1009 14:07:25.728448 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8b5c85b87-pfr9v" event={"ID":"1f63d2cd-4740-45c6-a94b-26899b7ffa86","Type":"ContainerDied","Data":"23c15b61828b97429fbc5a0aa062d802bd857a6dd73e7013d34f8cd558b67ca8"} Oct 09 14:07:25 crc kubenswrapper[4902]: I1009 14:07:25.728837 4902 scope.go:117] "RemoveContainer" containerID="22f4e750bfe871e2f676c151db24c9040a256ac1c93d8b9e68e0c7352637ee5c" Oct 09 14:07:25 crc kubenswrapper[4902]: I1009 14:07:25.729135 4902 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8b5c85b87-pfr9v" Oct 09 14:07:25 crc kubenswrapper[4902]: I1009 14:07:25.748859 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8fe438bd-0b47-4495-93a8-590e4019a7c6","Type":"ContainerDied","Data":"81f9d18967778391a4fc19ac465b4203e280e125d6241fd127507b3c2bcf2775"} Oct 09 14:07:25 crc kubenswrapper[4902]: I1009 14:07:25.748900 4902 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 09 14:07:25 crc kubenswrapper[4902]: I1009 14:07:25.749307 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=3.042116824 podStartE2EDuration="11.749293361s" podCreationTimestamp="2025-10-09 14:07:14 +0000 UTC" firstStartedPulling="2025-10-09 14:07:16.375559731 +0000 UTC m=+983.573418795" lastFinishedPulling="2025-10-09 14:07:25.082736268 +0000 UTC m=+992.280595332" observedRunningTime="2025-10-09 14:07:25.744670933 +0000 UTC m=+992.942530007" watchObservedRunningTime="2025-10-09 14:07:25.749293361 +0000 UTC m=+992.947152425" Oct 09 14:07:25 crc kubenswrapper[4902]: I1009 14:07:25.752857 4902 generic.go:334] "Generic (PLEG): container finished" podID="4015da27-1c19-4eb1-af33-74e182b53aa3" containerID="fed6f04acf067026499dd56f180834119dc95b524b9117a96995cbf1bba9ddb4" exitCode=0 Oct 09 14:07:25 crc kubenswrapper[4902]: I1009 14:07:25.752916 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-92hj5" event={"ID":"4015da27-1c19-4eb1-af33-74e182b53aa3","Type":"ContainerDied","Data":"fed6f04acf067026499dd56f180834119dc95b524b9117a96995cbf1bba9ddb4"} Oct 09 14:07:25 crc kubenswrapper[4902]: I1009 14:07:25.753465 4902 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1f63d2cd-4740-45c6-a94b-26899b7ffa86-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 09 14:07:25 crc kubenswrapper[4902]: I1009 14:07:25.753483 4902 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1f63d2cd-4740-45c6-a94b-26899b7ffa86-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 09 14:07:25 crc kubenswrapper[4902]: I1009 14:07:25.773626 4902 scope.go:117] "RemoveContainer" containerID="6d8db348cc6f0ce0b975e23dd8277864a2c4228157c8902f9bec4ac0016fbc26" 
Oct 09 14:07:25 crc kubenswrapper[4902]: I1009 14:07:25.804304 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8b5c85b87-pfr9v"] Oct 09 14:07:25 crc kubenswrapper[4902]: I1009 14:07:25.814018 4902 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-8b5c85b87-pfr9v"] Oct 09 14:07:25 crc kubenswrapper[4902]: I1009 14:07:25.819445 4902 scope.go:117] "RemoveContainer" containerID="a1d7843c226cfdd9c60d2d2e9655a2b097439cd5edf5071eb5a18a2e755b6636" Oct 09 14:07:25 crc kubenswrapper[4902]: I1009 14:07:25.832567 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 09 14:07:25 crc kubenswrapper[4902]: I1009 14:07:25.844587 4902 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 09 14:07:25 crc kubenswrapper[4902]: I1009 14:07:25.853442 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 09 14:07:25 crc kubenswrapper[4902]: E1009 14:07:25.853987 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8fe438bd-0b47-4495-93a8-590e4019a7c6" containerName="ceilometer-central-agent" Oct 09 14:07:25 crc kubenswrapper[4902]: I1009 14:07:25.854009 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fe438bd-0b47-4495-93a8-590e4019a7c6" containerName="ceilometer-central-agent" Oct 09 14:07:25 crc kubenswrapper[4902]: E1009 14:07:25.854031 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8fe438bd-0b47-4495-93a8-590e4019a7c6" containerName="proxy-httpd" Oct 09 14:07:25 crc kubenswrapper[4902]: I1009 14:07:25.854041 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fe438bd-0b47-4495-93a8-590e4019a7c6" containerName="proxy-httpd" Oct 09 14:07:25 crc kubenswrapper[4902]: E1009 14:07:25.854068 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8fe438bd-0b47-4495-93a8-590e4019a7c6" containerName="sg-core" Oct 09 14:07:25 crc kubenswrapper[4902]: I1009 14:07:25.854078 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fe438bd-0b47-4495-93a8-590e4019a7c6" containerName="sg-core" Oct 09 14:07:25 crc kubenswrapper[4902]: E1009 14:07:25.854106 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f63d2cd-4740-45c6-a94b-26899b7ffa86" containerName="dnsmasq-dns" Oct 09 14:07:25 crc kubenswrapper[4902]: I1009 14:07:25.854115 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f63d2cd-4740-45c6-a94b-26899b7ffa86" containerName="dnsmasq-dns" Oct 09 14:07:25 crc kubenswrapper[4902]: E1009 14:07:25.854139 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f63d2cd-4740-45c6-a94b-26899b7ffa86" containerName="init" Oct 09 14:07:25 crc kubenswrapper[4902]: I1009 14:07:25.854147 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f63d2cd-4740-45c6-a94b-26899b7ffa86" containerName="init" Oct 09 14:07:25 crc kubenswrapper[4902]: E1009 14:07:25.854157 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8fe438bd-0b47-4495-93a8-590e4019a7c6" containerName="ceilometer-notification-agent" Oct 09 14:07:25 crc kubenswrapper[4902]: I1009 14:07:25.854167 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fe438bd-0b47-4495-93a8-590e4019a7c6" containerName="ceilometer-notification-agent" Oct 09 14:07:25 crc kubenswrapper[4902]: I1009 14:07:25.854436 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="8fe438bd-0b47-4495-93a8-590e4019a7c6" containerName="ceilometer-notification-agent" Oct 09 14:07:25 
crc kubenswrapper[4902]: I1009 14:07:25.854455 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="8fe438bd-0b47-4495-93a8-590e4019a7c6" containerName="sg-core" Oct 09 14:07:25 crc kubenswrapper[4902]: I1009 14:07:25.854475 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="8fe438bd-0b47-4495-93a8-590e4019a7c6" containerName="proxy-httpd" Oct 09 14:07:25 crc kubenswrapper[4902]: I1009 14:07:25.854506 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="8fe438bd-0b47-4495-93a8-590e4019a7c6" containerName="ceilometer-central-agent" Oct 09 14:07:25 crc kubenswrapper[4902]: I1009 14:07:25.854525 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f63d2cd-4740-45c6-a94b-26899b7ffa86" containerName="dnsmasq-dns" Oct 09 14:07:25 crc kubenswrapper[4902]: I1009 14:07:25.856885 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 09 14:07:25 crc kubenswrapper[4902]: I1009 14:07:25.859218 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 09 14:07:25 crc kubenswrapper[4902]: I1009 14:07:25.861577 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 09 14:07:25 crc kubenswrapper[4902]: I1009 14:07:25.869167 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 09 14:07:25 crc kubenswrapper[4902]: I1009 14:07:25.873107 4902 scope.go:117] "RemoveContainer" containerID="9e1f7058fd69a09f0c3076a99e1c0e6b9dea0b8cb7bdd80dc51a8ad8ffcb3259" Oct 09 14:07:25 crc kubenswrapper[4902]: I1009 14:07:25.899985 4902 scope.go:117] "RemoveContainer" containerID="38966405f5eca6e013afe0970100993a4d7351ac70d474393e4e4495484ced29" Oct 09 14:07:25 crc kubenswrapper[4902]: I1009 14:07:25.930210 4902 scope.go:117] "RemoveContainer" containerID="119944c304e40ad4b30208df944e3587aea6322eb5c80031e9db64894b4ec2b9" Oct 09 14:07:25 crc kubenswrapper[4902]: I1009 14:07:25.956703 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5dd917a-e7aa-4092-906a-321ba2f744d8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e5dd917a-e7aa-4092-906a-321ba2f744d8\") " pod="openstack/ceilometer-0" Oct 09 14:07:25 crc kubenswrapper[4902]: I1009 14:07:25.956804 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e5dd917a-e7aa-4092-906a-321ba2f744d8-run-httpd\") pod \"ceilometer-0\" (UID: \"e5dd917a-e7aa-4092-906a-321ba2f744d8\") " pod="openstack/ceilometer-0" Oct 09 14:07:25 crc kubenswrapper[4902]: I1009 14:07:25.956883 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e5dd917a-e7aa-4092-906a-321ba2f744d8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e5dd917a-e7aa-4092-906a-321ba2f744d8\") " pod="openstack/ceilometer-0" Oct 09 14:07:25 crc kubenswrapper[4902]: I1009 14:07:25.956942 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wnv2b\" (UniqueName: \"kubernetes.io/projected/e5dd917a-e7aa-4092-906a-321ba2f744d8-kube-api-access-wnv2b\") pod \"ceilometer-0\" (UID: \"e5dd917a-e7aa-4092-906a-321ba2f744d8\") " pod="openstack/ceilometer-0" Oct 09 14:07:25 crc kubenswrapper[4902]: I1009 14:07:25.956991 
4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e5dd917a-e7aa-4092-906a-321ba2f744d8-log-httpd\") pod \"ceilometer-0\" (UID: \"e5dd917a-e7aa-4092-906a-321ba2f744d8\") " pod="openstack/ceilometer-0" Oct 09 14:07:25 crc kubenswrapper[4902]: I1009 14:07:25.957049 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5dd917a-e7aa-4092-906a-321ba2f744d8-config-data\") pod \"ceilometer-0\" (UID: \"e5dd917a-e7aa-4092-906a-321ba2f744d8\") " pod="openstack/ceilometer-0" Oct 09 14:07:25 crc kubenswrapper[4902]: I1009 14:07:25.957094 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e5dd917a-e7aa-4092-906a-321ba2f744d8-scripts\") pod \"ceilometer-0\" (UID: \"e5dd917a-e7aa-4092-906a-321ba2f744d8\") " pod="openstack/ceilometer-0" Oct 09 14:07:26 crc kubenswrapper[4902]: I1009 14:07:26.058787 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e5dd917a-e7aa-4092-906a-321ba2f744d8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e5dd917a-e7aa-4092-906a-321ba2f744d8\") " pod="openstack/ceilometer-0" Oct 09 14:07:26 crc kubenswrapper[4902]: I1009 14:07:26.058858 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wnv2b\" (UniqueName: \"kubernetes.io/projected/e5dd917a-e7aa-4092-906a-321ba2f744d8-kube-api-access-wnv2b\") pod \"ceilometer-0\" (UID: \"e5dd917a-e7aa-4092-906a-321ba2f744d8\") " pod="openstack/ceilometer-0" Oct 09 14:07:26 crc kubenswrapper[4902]: I1009 14:07:26.058906 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e5dd917a-e7aa-4092-906a-321ba2f744d8-log-httpd\") pod \"ceilometer-0\" (UID: \"e5dd917a-e7aa-4092-906a-321ba2f744d8\") " pod="openstack/ceilometer-0" Oct 09 14:07:26 crc kubenswrapper[4902]: I1009 14:07:26.058956 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5dd917a-e7aa-4092-906a-321ba2f744d8-config-data\") pod \"ceilometer-0\" (UID: \"e5dd917a-e7aa-4092-906a-321ba2f744d8\") " pod="openstack/ceilometer-0" Oct 09 14:07:26 crc kubenswrapper[4902]: I1009 14:07:26.058999 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e5dd917a-e7aa-4092-906a-321ba2f744d8-scripts\") pod \"ceilometer-0\" (UID: \"e5dd917a-e7aa-4092-906a-321ba2f744d8\") " pod="openstack/ceilometer-0" Oct 09 14:07:26 crc kubenswrapper[4902]: I1009 14:07:26.059041 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5dd917a-e7aa-4092-906a-321ba2f744d8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e5dd917a-e7aa-4092-906a-321ba2f744d8\") " pod="openstack/ceilometer-0" Oct 09 14:07:26 crc kubenswrapper[4902]: I1009 14:07:26.059088 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e5dd917a-e7aa-4092-906a-321ba2f744d8-run-httpd\") pod \"ceilometer-0\" (UID: \"e5dd917a-e7aa-4092-906a-321ba2f744d8\") " pod="openstack/ceilometer-0" Oct 09 14:07:26 crc kubenswrapper[4902]: I1009 
14:07:26.059572 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e5dd917a-e7aa-4092-906a-321ba2f744d8-run-httpd\") pod \"ceilometer-0\" (UID: \"e5dd917a-e7aa-4092-906a-321ba2f744d8\") " pod="openstack/ceilometer-0" Oct 09 14:07:26 crc kubenswrapper[4902]: I1009 14:07:26.060106 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e5dd917a-e7aa-4092-906a-321ba2f744d8-log-httpd\") pod \"ceilometer-0\" (UID: \"e5dd917a-e7aa-4092-906a-321ba2f744d8\") " pod="openstack/ceilometer-0" Oct 09 14:07:26 crc kubenswrapper[4902]: I1009 14:07:26.063519 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e5dd917a-e7aa-4092-906a-321ba2f744d8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e5dd917a-e7aa-4092-906a-321ba2f744d8\") " pod="openstack/ceilometer-0" Oct 09 14:07:26 crc kubenswrapper[4902]: I1009 14:07:26.064104 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5dd917a-e7aa-4092-906a-321ba2f744d8-config-data\") pod \"ceilometer-0\" (UID: \"e5dd917a-e7aa-4092-906a-321ba2f744d8\") " pod="openstack/ceilometer-0" Oct 09 14:07:26 crc kubenswrapper[4902]: I1009 14:07:26.068254 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5dd917a-e7aa-4092-906a-321ba2f744d8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e5dd917a-e7aa-4092-906a-321ba2f744d8\") " pod="openstack/ceilometer-0" Oct 09 14:07:26 crc kubenswrapper[4902]: I1009 14:07:26.070938 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e5dd917a-e7aa-4092-906a-321ba2f744d8-scripts\") pod \"ceilometer-0\" (UID: \"e5dd917a-e7aa-4092-906a-321ba2f744d8\") " pod="openstack/ceilometer-0" Oct 09 14:07:26 crc kubenswrapper[4902]: I1009 14:07:26.085974 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wnv2b\" (UniqueName: \"kubernetes.io/projected/e5dd917a-e7aa-4092-906a-321ba2f744d8-kube-api-access-wnv2b\") pod \"ceilometer-0\" (UID: \"e5dd917a-e7aa-4092-906a-321ba2f744d8\") " pod="openstack/ceilometer-0" Oct 09 14:07:26 crc kubenswrapper[4902]: I1009 14:07:26.180755 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 09 14:07:26 crc kubenswrapper[4902]: I1009 14:07:26.466613 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 09 14:07:26 crc kubenswrapper[4902]: I1009 14:07:26.467124 4902 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="6c5490ed-b380-45b8-b528-e9cab5c79e62" containerName="glance-log" containerID="cri-o://171b1007fe2c2f721edd193b56bde992a201c79388afa8b60146353e7c33b6c5" gracePeriod=30 Oct 09 14:07:26 crc kubenswrapper[4902]: I1009 14:07:26.467287 4902 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="6c5490ed-b380-45b8-b528-e9cab5c79e62" containerName="glance-httpd" containerID="cri-o://f7475656ccb20336e3b423389b7188ec718c5ae07b185134adb6e5e9966b5384" gracePeriod=30 Oct 09 14:07:26 crc kubenswrapper[4902]: I1009 14:07:26.764768 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-6568f7cff-cv7qx" event={"ID":"797c027a-6081-4aa8-9643-ddffc4393193","Type":"ContainerStarted","Data":"8c36d9422df68f39a42038a705e4ed80681848a4463bf54f7508e59ec37558be"} Oct 09 14:07:26 crc kubenswrapper[4902]: I1009 14:07:26.764821 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-6568f7cff-cv7qx" event={"ID":"797c027a-6081-4aa8-9643-ddffc4393193","Type":"ContainerStarted","Data":"b5d870f635475acb344c799b0b78ece75701c040b0054f0c7f56ce81cc99bfc6"} Oct 09 14:07:26 crc kubenswrapper[4902]: I1009 14:07:26.764839 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-6568f7cff-cv7qx" Oct 09 14:07:26 crc kubenswrapper[4902]: I1009 14:07:26.764852 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-6568f7cff-cv7qx" event={"ID":"797c027a-6081-4aa8-9643-ddffc4393193","Type":"ContainerStarted","Data":"a24a1efedd7a7e03ef51ce08aa34d24e12aef02b74c0d97e8f10e19246b69d85"} Oct 09 14:07:26 crc kubenswrapper[4902]: I1009 14:07:26.764880 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-6568f7cff-cv7qx" Oct 09 14:07:26 crc kubenswrapper[4902]: I1009 14:07:26.778667 4902 generic.go:334] "Generic (PLEG): container finished" podID="6c5490ed-b380-45b8-b528-e9cab5c79e62" containerID="171b1007fe2c2f721edd193b56bde992a201c79388afa8b60146353e7c33b6c5" exitCode=143 Oct 09 14:07:26 crc kubenswrapper[4902]: I1009 14:07:26.778809 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6c5490ed-b380-45b8-b528-e9cab5c79e62","Type":"ContainerDied","Data":"171b1007fe2c2f721edd193b56bde992a201c79388afa8b60146353e7c33b6c5"} Oct 09 14:07:26 crc kubenswrapper[4902]: I1009 14:07:26.789778 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 09 14:07:27 crc kubenswrapper[4902]: I1009 14:07:27.219342 4902 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-92hj5" Oct 09 14:07:27 crc kubenswrapper[4902]: I1009 14:07:27.239533 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-6568f7cff-cv7qx" podStartSLOduration=5.239512216 podStartE2EDuration="5.239512216s" podCreationTimestamp="2025-10-09 14:07:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 14:07:26.801706743 +0000 UTC m=+993.999565807" watchObservedRunningTime="2025-10-09 14:07:27.239512216 +0000 UTC m=+994.437371270" Oct 09 14:07:27 crc kubenswrapper[4902]: I1009 14:07:27.286781 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4kjsx\" (UniqueName: \"kubernetes.io/projected/4015da27-1c19-4eb1-af33-74e182b53aa3-kube-api-access-4kjsx\") pod \"4015da27-1c19-4eb1-af33-74e182b53aa3\" (UID: \"4015da27-1c19-4eb1-af33-74e182b53aa3\") " Oct 09 14:07:27 crc kubenswrapper[4902]: I1009 14:07:27.286966 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4015da27-1c19-4eb1-af33-74e182b53aa3-combined-ca-bundle\") pod \"4015da27-1c19-4eb1-af33-74e182b53aa3\" (UID: \"4015da27-1c19-4eb1-af33-74e182b53aa3\") " Oct 09 14:07:27 crc kubenswrapper[4902]: I1009 14:07:27.287038 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4015da27-1c19-4eb1-af33-74e182b53aa3-db-sync-config-data\") pod \"4015da27-1c19-4eb1-af33-74e182b53aa3\" (UID: \"4015da27-1c19-4eb1-af33-74e182b53aa3\") " Oct 09 14:07:27 crc kubenswrapper[4902]: I1009 14:07:27.295674 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4015da27-1c19-4eb1-af33-74e182b53aa3-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "4015da27-1c19-4eb1-af33-74e182b53aa3" (UID: "4015da27-1c19-4eb1-af33-74e182b53aa3"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:07:27 crc kubenswrapper[4902]: I1009 14:07:27.308475 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4015da27-1c19-4eb1-af33-74e182b53aa3-kube-api-access-4kjsx" (OuterVolumeSpecName: "kube-api-access-4kjsx") pod "4015da27-1c19-4eb1-af33-74e182b53aa3" (UID: "4015da27-1c19-4eb1-af33-74e182b53aa3"). InnerVolumeSpecName "kube-api-access-4kjsx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 14:07:27 crc kubenswrapper[4902]: I1009 14:07:27.328214 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4015da27-1c19-4eb1-af33-74e182b53aa3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4015da27-1c19-4eb1-af33-74e182b53aa3" (UID: "4015da27-1c19-4eb1-af33-74e182b53aa3"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:07:27 crc kubenswrapper[4902]: I1009 14:07:27.389030 4902 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4015da27-1c19-4eb1-af33-74e182b53aa3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 14:07:27 crc kubenswrapper[4902]: I1009 14:07:27.389063 4902 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4015da27-1c19-4eb1-af33-74e182b53aa3-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 14:07:27 crc kubenswrapper[4902]: I1009 14:07:27.389073 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4kjsx\" (UniqueName: \"kubernetes.io/projected/4015da27-1c19-4eb1-af33-74e182b53aa3-kube-api-access-4kjsx\") on node \"crc\" DevicePath \"\"" Oct 09 14:07:27 crc kubenswrapper[4902]: I1009 14:07:27.529825 4902 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f63d2cd-4740-45c6-a94b-26899b7ffa86" path="/var/lib/kubelet/pods/1f63d2cd-4740-45c6-a94b-26899b7ffa86/volumes" Oct 09 14:07:27 crc kubenswrapper[4902]: I1009 14:07:27.530909 4902 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8fe438bd-0b47-4495-93a8-590e4019a7c6" path="/var/lib/kubelet/pods/8fe438bd-0b47-4495-93a8-590e4019a7c6/volumes" Oct 09 14:07:27 crc kubenswrapper[4902]: I1009 14:07:27.551723 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 09 14:07:27 crc kubenswrapper[4902]: I1009 14:07:27.552127 4902 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="4731e204-78a9-4d3c-8763-ace5d7d97cf7" containerName="glance-log" containerID="cri-o://b130d1ec682a3095108f2f648767fe70d179728d61a4c2ba1e20093d4ba09cdf" gracePeriod=30 Oct 09 14:07:27 crc kubenswrapper[4902]: I1009 14:07:27.552717 4902 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="4731e204-78a9-4d3c-8763-ace5d7d97cf7" containerName="glance-httpd" containerID="cri-o://dcc8d525cfd7ea074939c35aeeffd6b00b2a53c9c63c970bd438d28a81ac2e40" gracePeriod=30 Oct 09 14:07:27 crc kubenswrapper[4902]: I1009 14:07:27.788205 4902 generic.go:334] "Generic (PLEG): container finished" podID="e0ea72ba-e66d-4400-a1d9-8d4cb8a94e87" containerID="589e134088889030f30963da599dce0475033fd7998c03b92d4dd6135c809d8a" exitCode=0 Oct 09 14:07:27 crc kubenswrapper[4902]: I1009 14:07:27.788290 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-4fhrn" event={"ID":"e0ea72ba-e66d-4400-a1d9-8d4cb8a94e87","Type":"ContainerDied","Data":"589e134088889030f30963da599dce0475033fd7998c03b92d4dd6135c809d8a"} Oct 09 14:07:27 crc kubenswrapper[4902]: I1009 14:07:27.790524 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e5dd917a-e7aa-4092-906a-321ba2f744d8","Type":"ContainerStarted","Data":"f2f5c2e8f29e84af1b4fb2b3129fa51f289f4e3d68c5af374be43fbcabf6c96f"} Oct 09 14:07:27 crc kubenswrapper[4902]: I1009 14:07:27.790564 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e5dd917a-e7aa-4092-906a-321ba2f744d8","Type":"ContainerStarted","Data":"b856f9246eaeef6d60fc2d32d67f1be48430848cd1d7f41539f998631cd30752"} Oct 09 14:07:27 crc kubenswrapper[4902]: I1009 14:07:27.792181 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/barbican-db-sync-92hj5" event={"ID":"4015da27-1c19-4eb1-af33-74e182b53aa3","Type":"ContainerDied","Data":"29a14620bdf17086809d40ebdbd6e2b116828656e7ab347c4b5b6782b29aab8c"} Oct 09 14:07:27 crc kubenswrapper[4902]: I1009 14:07:27.792220 4902 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="29a14620bdf17086809d40ebdbd6e2b116828656e7ab347c4b5b6782b29aab8c" Oct 09 14:07:27 crc kubenswrapper[4902]: I1009 14:07:27.792300 4902 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-92hj5" Oct 09 14:07:27 crc kubenswrapper[4902]: I1009 14:07:27.796307 4902 generic.go:334] "Generic (PLEG): container finished" podID="4731e204-78a9-4d3c-8763-ace5d7d97cf7" containerID="b130d1ec682a3095108f2f648767fe70d179728d61a4c2ba1e20093d4ba09cdf" exitCode=143 Oct 09 14:07:27 crc kubenswrapper[4902]: I1009 14:07:27.796376 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"4731e204-78a9-4d3c-8763-ace5d7d97cf7","Type":"ContainerDied","Data":"b130d1ec682a3095108f2f648767fe70d179728d61a4c2ba1e20093d4ba09cdf"} Oct 09 14:07:28 crc kubenswrapper[4902]: I1009 14:07:28.182828 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-896cb696f-kkg85"] Oct 09 14:07:28 crc kubenswrapper[4902]: E1009 14:07:28.183352 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4015da27-1c19-4eb1-af33-74e182b53aa3" containerName="barbican-db-sync" Oct 09 14:07:28 crc kubenswrapper[4902]: I1009 14:07:28.183371 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="4015da27-1c19-4eb1-af33-74e182b53aa3" containerName="barbican-db-sync" Oct 09 14:07:28 crc kubenswrapper[4902]: I1009 14:07:28.183675 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="4015da27-1c19-4eb1-af33-74e182b53aa3" containerName="barbican-db-sync" Oct 09 14:07:28 crc kubenswrapper[4902]: I1009 14:07:28.184855 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-896cb696f-kkg85" Oct 09 14:07:28 crc kubenswrapper[4902]: I1009 14:07:28.187696 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Oct 09 14:07:28 crc kubenswrapper[4902]: I1009 14:07:28.187900 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-xcghf" Oct 09 14:07:28 crc kubenswrapper[4902]: I1009 14:07:28.188067 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Oct 09 14:07:28 crc kubenswrapper[4902]: I1009 14:07:28.208913 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe538ee2-2e8c-406f-8e70-bc56325ec408-combined-ca-bundle\") pod \"barbican-worker-896cb696f-kkg85\" (UID: \"fe538ee2-2e8c-406f-8e70-bc56325ec408\") " pod="openstack/barbican-worker-896cb696f-kkg85" Oct 09 14:07:28 crc kubenswrapper[4902]: I1009 14:07:28.209027 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fe538ee2-2e8c-406f-8e70-bc56325ec408-config-data-custom\") pod \"barbican-worker-896cb696f-kkg85\" (UID: \"fe538ee2-2e8c-406f-8e70-bc56325ec408\") " pod="openstack/barbican-worker-896cb696f-kkg85" Oct 09 14:07:28 crc kubenswrapper[4902]: I1009 14:07:28.209096 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m6jrq\" (UniqueName: \"kubernetes.io/projected/fe538ee2-2e8c-406f-8e70-bc56325ec408-kube-api-access-m6jrq\") pod \"barbican-worker-896cb696f-kkg85\" (UID: \"fe538ee2-2e8c-406f-8e70-bc56325ec408\") " pod="openstack/barbican-worker-896cb696f-kkg85" Oct 09 14:07:28 crc kubenswrapper[4902]: I1009 14:07:28.209190 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fe538ee2-2e8c-406f-8e70-bc56325ec408-logs\") pod \"barbican-worker-896cb696f-kkg85\" (UID: \"fe538ee2-2e8c-406f-8e70-bc56325ec408\") " pod="openstack/barbican-worker-896cb696f-kkg85" Oct 09 14:07:28 crc kubenswrapper[4902]: I1009 14:07:28.209245 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe538ee2-2e8c-406f-8e70-bc56325ec408-config-data\") pod \"barbican-worker-896cb696f-kkg85\" (UID: \"fe538ee2-2e8c-406f-8e70-bc56325ec408\") " pod="openstack/barbican-worker-896cb696f-kkg85" Oct 09 14:07:28 crc kubenswrapper[4902]: I1009 14:07:28.231609 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-7dbd9b6574-b5dht"] Oct 09 14:07:28 crc kubenswrapper[4902]: I1009 14:07:28.233779 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-7dbd9b6574-b5dht" Oct 09 14:07:28 crc kubenswrapper[4902]: I1009 14:07:28.240712 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Oct 09 14:07:28 crc kubenswrapper[4902]: I1009 14:07:28.270834 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-896cb696f-kkg85"] Oct 09 14:07:28 crc kubenswrapper[4902]: I1009 14:07:28.315182 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe538ee2-2e8c-406f-8e70-bc56325ec408-combined-ca-bundle\") pod \"barbican-worker-896cb696f-kkg85\" (UID: \"fe538ee2-2e8c-406f-8e70-bc56325ec408\") " pod="openstack/barbican-worker-896cb696f-kkg85" Oct 09 14:07:28 crc kubenswrapper[4902]: I1009 14:07:28.322672 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fe538ee2-2e8c-406f-8e70-bc56325ec408-config-data-custom\") pod \"barbican-worker-896cb696f-kkg85\" (UID: \"fe538ee2-2e8c-406f-8e70-bc56325ec408\") " pod="openstack/barbican-worker-896cb696f-kkg85" Oct 09 14:07:28 crc kubenswrapper[4902]: I1009 14:07:28.322805 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m6jrq\" (UniqueName: \"kubernetes.io/projected/fe538ee2-2e8c-406f-8e70-bc56325ec408-kube-api-access-m6jrq\") pod \"barbican-worker-896cb696f-kkg85\" (UID: \"fe538ee2-2e8c-406f-8e70-bc56325ec408\") " pod="openstack/barbican-worker-896cb696f-kkg85" Oct 09 14:07:28 crc kubenswrapper[4902]: I1009 14:07:28.322862 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/11b3f7c7-66a5-485c-922e-b5568e2f9f1c-config-data\") pod \"barbican-keystone-listener-7dbd9b6574-b5dht\" (UID: \"11b3f7c7-66a5-485c-922e-b5568e2f9f1c\") " pod="openstack/barbican-keystone-listener-7dbd9b6574-b5dht" Oct 09 14:07:28 crc kubenswrapper[4902]: I1009 14:07:28.322942 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11b3f7c7-66a5-485c-922e-b5568e2f9f1c-combined-ca-bundle\") pod \"barbican-keystone-listener-7dbd9b6574-b5dht\" (UID: \"11b3f7c7-66a5-485c-922e-b5568e2f9f1c\") " pod="openstack/barbican-keystone-listener-7dbd9b6574-b5dht" Oct 09 14:07:28 crc kubenswrapper[4902]: I1009 14:07:28.323009 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fe538ee2-2e8c-406f-8e70-bc56325ec408-logs\") pod \"barbican-worker-896cb696f-kkg85\" (UID: \"fe538ee2-2e8c-406f-8e70-bc56325ec408\") " pod="openstack/barbican-worker-896cb696f-kkg85" Oct 09 14:07:28 crc kubenswrapper[4902]: I1009 14:07:28.323080 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/11b3f7c7-66a5-485c-922e-b5568e2f9f1c-config-data-custom\") pod \"barbican-keystone-listener-7dbd9b6574-b5dht\" (UID: \"11b3f7c7-66a5-485c-922e-b5568e2f9f1c\") " pod="openstack/barbican-keystone-listener-7dbd9b6574-b5dht" Oct 09 14:07:28 crc kubenswrapper[4902]: I1009 14:07:28.323104 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/fe538ee2-2e8c-406f-8e70-bc56325ec408-config-data\") pod \"barbican-worker-896cb696f-kkg85\" (UID: \"fe538ee2-2e8c-406f-8e70-bc56325ec408\") " pod="openstack/barbican-worker-896cb696f-kkg85" Oct 09 14:07:28 crc kubenswrapper[4902]: I1009 14:07:28.323124 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zjffk\" (UniqueName: \"kubernetes.io/projected/11b3f7c7-66a5-485c-922e-b5568e2f9f1c-kube-api-access-zjffk\") pod \"barbican-keystone-listener-7dbd9b6574-b5dht\" (UID: \"11b3f7c7-66a5-485c-922e-b5568e2f9f1c\") " pod="openstack/barbican-keystone-listener-7dbd9b6574-b5dht" Oct 09 14:07:28 crc kubenswrapper[4902]: I1009 14:07:28.323158 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/11b3f7c7-66a5-485c-922e-b5568e2f9f1c-logs\") pod \"barbican-keystone-listener-7dbd9b6574-b5dht\" (UID: \"11b3f7c7-66a5-485c-922e-b5568e2f9f1c\") " pod="openstack/barbican-keystone-listener-7dbd9b6574-b5dht" Oct 09 14:07:28 crc kubenswrapper[4902]: I1009 14:07:28.323192 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe538ee2-2e8c-406f-8e70-bc56325ec408-combined-ca-bundle\") pod \"barbican-worker-896cb696f-kkg85\" (UID: \"fe538ee2-2e8c-406f-8e70-bc56325ec408\") " pod="openstack/barbican-worker-896cb696f-kkg85" Oct 09 14:07:28 crc kubenswrapper[4902]: I1009 14:07:28.323582 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fe538ee2-2e8c-406f-8e70-bc56325ec408-logs\") pod \"barbican-worker-896cb696f-kkg85\" (UID: \"fe538ee2-2e8c-406f-8e70-bc56325ec408\") " pod="openstack/barbican-worker-896cb696f-kkg85" Oct 09 14:07:28 crc kubenswrapper[4902]: I1009 14:07:28.353655 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-7dbd9b6574-b5dht"] Oct 09 14:07:28 crc kubenswrapper[4902]: I1009 14:07:28.375823 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe538ee2-2e8c-406f-8e70-bc56325ec408-config-data\") pod \"barbican-worker-896cb696f-kkg85\" (UID: \"fe538ee2-2e8c-406f-8e70-bc56325ec408\") " pod="openstack/barbican-worker-896cb696f-kkg85" Oct 09 14:07:28 crc kubenswrapper[4902]: I1009 14:07:28.381061 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fe538ee2-2e8c-406f-8e70-bc56325ec408-config-data-custom\") pod \"barbican-worker-896cb696f-kkg85\" (UID: \"fe538ee2-2e8c-406f-8e70-bc56325ec408\") " pod="openstack/barbican-worker-896cb696f-kkg85" Oct 09 14:07:28 crc kubenswrapper[4902]: I1009 14:07:28.392668 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-75c8ddd69c-gjtkj"] Oct 09 14:07:28 crc kubenswrapper[4902]: I1009 14:07:28.394702 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-75c8ddd69c-gjtkj" Oct 09 14:07:28 crc kubenswrapper[4902]: I1009 14:07:28.418201 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m6jrq\" (UniqueName: \"kubernetes.io/projected/fe538ee2-2e8c-406f-8e70-bc56325ec408-kube-api-access-m6jrq\") pod \"barbican-worker-896cb696f-kkg85\" (UID: \"fe538ee2-2e8c-406f-8e70-bc56325ec408\") " pod="openstack/barbican-worker-896cb696f-kkg85" Oct 09 14:07:28 crc kubenswrapper[4902]: I1009 14:07:28.424718 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/11b3f7c7-66a5-485c-922e-b5568e2f9f1c-config-data\") pod \"barbican-keystone-listener-7dbd9b6574-b5dht\" (UID: \"11b3f7c7-66a5-485c-922e-b5568e2f9f1c\") " pod="openstack/barbican-keystone-listener-7dbd9b6574-b5dht" Oct 09 14:07:28 crc kubenswrapper[4902]: I1009 14:07:28.424925 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/13a3ac18-a408-4c88-8dfc-d04b509941d1-dns-svc\") pod \"dnsmasq-dns-75c8ddd69c-gjtkj\" (UID: \"13a3ac18-a408-4c88-8dfc-d04b509941d1\") " pod="openstack/dnsmasq-dns-75c8ddd69c-gjtkj" Oct 09 14:07:28 crc kubenswrapper[4902]: I1009 14:07:28.425011 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wc6ng\" (UniqueName: \"kubernetes.io/projected/13a3ac18-a408-4c88-8dfc-d04b509941d1-kube-api-access-wc6ng\") pod \"dnsmasq-dns-75c8ddd69c-gjtkj\" (UID: \"13a3ac18-a408-4c88-8dfc-d04b509941d1\") " pod="openstack/dnsmasq-dns-75c8ddd69c-gjtkj" Oct 09 14:07:28 crc kubenswrapper[4902]: I1009 14:07:28.425096 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11b3f7c7-66a5-485c-922e-b5568e2f9f1c-combined-ca-bundle\") pod \"barbican-keystone-listener-7dbd9b6574-b5dht\" (UID: \"11b3f7c7-66a5-485c-922e-b5568e2f9f1c\") " pod="openstack/barbican-keystone-listener-7dbd9b6574-b5dht" Oct 09 14:07:28 crc kubenswrapper[4902]: I1009 14:07:28.425439 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/11b3f7c7-66a5-485c-922e-b5568e2f9f1c-config-data-custom\") pod \"barbican-keystone-listener-7dbd9b6574-b5dht\" (UID: \"11b3f7c7-66a5-485c-922e-b5568e2f9f1c\") " pod="openstack/barbican-keystone-listener-7dbd9b6574-b5dht" Oct 09 14:07:28 crc kubenswrapper[4902]: I1009 14:07:28.425511 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zjffk\" (UniqueName: \"kubernetes.io/projected/11b3f7c7-66a5-485c-922e-b5568e2f9f1c-kube-api-access-zjffk\") pod \"barbican-keystone-listener-7dbd9b6574-b5dht\" (UID: \"11b3f7c7-66a5-485c-922e-b5568e2f9f1c\") " pod="openstack/barbican-keystone-listener-7dbd9b6574-b5dht" Oct 09 14:07:28 crc kubenswrapper[4902]: I1009 14:07:28.425543 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/13a3ac18-a408-4c88-8dfc-d04b509941d1-dns-swift-storage-0\") pod \"dnsmasq-dns-75c8ddd69c-gjtkj\" (UID: \"13a3ac18-a408-4c88-8dfc-d04b509941d1\") " pod="openstack/dnsmasq-dns-75c8ddd69c-gjtkj" Oct 09 14:07:28 crc kubenswrapper[4902]: I1009 14:07:28.425565 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/11b3f7c7-66a5-485c-922e-b5568e2f9f1c-logs\") pod \"barbican-keystone-listener-7dbd9b6574-b5dht\" (UID: \"11b3f7c7-66a5-485c-922e-b5568e2f9f1c\") " pod="openstack/barbican-keystone-listener-7dbd9b6574-b5dht" Oct 09 14:07:28 crc kubenswrapper[4902]: I1009 14:07:28.425625 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/13a3ac18-a408-4c88-8dfc-d04b509941d1-ovsdbserver-nb\") pod \"dnsmasq-dns-75c8ddd69c-gjtkj\" (UID: \"13a3ac18-a408-4c88-8dfc-d04b509941d1\") " pod="openstack/dnsmasq-dns-75c8ddd69c-gjtkj" Oct 09 14:07:28 crc kubenswrapper[4902]: I1009 14:07:28.425684 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/13a3ac18-a408-4c88-8dfc-d04b509941d1-config\") pod \"dnsmasq-dns-75c8ddd69c-gjtkj\" (UID: \"13a3ac18-a408-4c88-8dfc-d04b509941d1\") " pod="openstack/dnsmasq-dns-75c8ddd69c-gjtkj" Oct 09 14:07:28 crc kubenswrapper[4902]: I1009 14:07:28.425784 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/13a3ac18-a408-4c88-8dfc-d04b509941d1-ovsdbserver-sb\") pod \"dnsmasq-dns-75c8ddd69c-gjtkj\" (UID: \"13a3ac18-a408-4c88-8dfc-d04b509941d1\") " pod="openstack/dnsmasq-dns-75c8ddd69c-gjtkj" Oct 09 14:07:28 crc kubenswrapper[4902]: I1009 14:07:28.426239 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/11b3f7c7-66a5-485c-922e-b5568e2f9f1c-logs\") pod \"barbican-keystone-listener-7dbd9b6574-b5dht\" (UID: \"11b3f7c7-66a5-485c-922e-b5568e2f9f1c\") " pod="openstack/barbican-keystone-listener-7dbd9b6574-b5dht" Oct 09 14:07:28 crc kubenswrapper[4902]: I1009 14:07:28.434834 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/11b3f7c7-66a5-485c-922e-b5568e2f9f1c-config-data\") pod \"barbican-keystone-listener-7dbd9b6574-b5dht\" (UID: \"11b3f7c7-66a5-485c-922e-b5568e2f9f1c\") " pod="openstack/barbican-keystone-listener-7dbd9b6574-b5dht" Oct 09 14:07:28 crc kubenswrapper[4902]: I1009 14:07:28.435462 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/11b3f7c7-66a5-485c-922e-b5568e2f9f1c-config-data-custom\") pod \"barbican-keystone-listener-7dbd9b6574-b5dht\" (UID: \"11b3f7c7-66a5-485c-922e-b5568e2f9f1c\") " pod="openstack/barbican-keystone-listener-7dbd9b6574-b5dht" Oct 09 14:07:28 crc kubenswrapper[4902]: I1009 14:07:28.437272 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11b3f7c7-66a5-485c-922e-b5568e2f9f1c-combined-ca-bundle\") pod \"barbican-keystone-listener-7dbd9b6574-b5dht\" (UID: \"11b3f7c7-66a5-485c-922e-b5568e2f9f1c\") " pod="openstack/barbican-keystone-listener-7dbd9b6574-b5dht" Oct 09 14:07:28 crc kubenswrapper[4902]: I1009 14:07:28.445800 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zjffk\" (UniqueName: \"kubernetes.io/projected/11b3f7c7-66a5-485c-922e-b5568e2f9f1c-kube-api-access-zjffk\") pod \"barbican-keystone-listener-7dbd9b6574-b5dht\" (UID: \"11b3f7c7-66a5-485c-922e-b5568e2f9f1c\") " pod="openstack/barbican-keystone-listener-7dbd9b6574-b5dht" Oct 09 14:07:28 crc kubenswrapper[4902]: I1009 
14:07:28.450059 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-75c8ddd69c-gjtkj"] Oct 09 14:07:28 crc kubenswrapper[4902]: I1009 14:07:28.463504 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-7479d98b6d-qm6k2"] Oct 09 14:07:28 crc kubenswrapper[4902]: I1009 14:07:28.466662 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7479d98b6d-qm6k2" Oct 09 14:07:28 crc kubenswrapper[4902]: I1009 14:07:28.468632 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Oct 09 14:07:28 crc kubenswrapper[4902]: I1009 14:07:28.479343 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7479d98b6d-qm6k2"] Oct 09 14:07:28 crc kubenswrapper[4902]: I1009 14:07:28.481419 4902 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-5b9b5fc8fb-nlcpz" podUID="ba9ae197-8325-4a07-a174-31f7f2e29978" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.146:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.146:8443: connect: connection refused" Oct 09 14:07:28 crc kubenswrapper[4902]: I1009 14:07:28.481583 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-5b9b5fc8fb-nlcpz" Oct 09 14:07:28 crc kubenswrapper[4902]: I1009 14:07:28.514395 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-896cb696f-kkg85" Oct 09 14:07:28 crc kubenswrapper[4902]: I1009 14:07:28.527379 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/13a3ac18-a408-4c88-8dfc-d04b509941d1-config\") pod \"dnsmasq-dns-75c8ddd69c-gjtkj\" (UID: \"13a3ac18-a408-4c88-8dfc-d04b509941d1\") " pod="openstack/dnsmasq-dns-75c8ddd69c-gjtkj" Oct 09 14:07:28 crc kubenswrapper[4902]: I1009 14:07:28.527540 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61e61af2-6a2e-4fe8-9e42-57d6303411f8-config-data\") pod \"barbican-api-7479d98b6d-qm6k2\" (UID: \"61e61af2-6a2e-4fe8-9e42-57d6303411f8\") " pod="openstack/barbican-api-7479d98b6d-qm6k2" Oct 09 14:07:28 crc kubenswrapper[4902]: I1009 14:07:28.527578 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/13a3ac18-a408-4c88-8dfc-d04b509941d1-ovsdbserver-sb\") pod \"dnsmasq-dns-75c8ddd69c-gjtkj\" (UID: \"13a3ac18-a408-4c88-8dfc-d04b509941d1\") " pod="openstack/dnsmasq-dns-75c8ddd69c-gjtkj" Oct 09 14:07:28 crc kubenswrapper[4902]: I1009 14:07:28.527657 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/13a3ac18-a408-4c88-8dfc-d04b509941d1-dns-svc\") pod \"dnsmasq-dns-75c8ddd69c-gjtkj\" (UID: \"13a3ac18-a408-4c88-8dfc-d04b509941d1\") " pod="openstack/dnsmasq-dns-75c8ddd69c-gjtkj" Oct 09 14:07:28 crc kubenswrapper[4902]: I1009 14:07:28.527693 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wc6ng\" (UniqueName: \"kubernetes.io/projected/13a3ac18-a408-4c88-8dfc-d04b509941d1-kube-api-access-wc6ng\") pod \"dnsmasq-dns-75c8ddd69c-gjtkj\" (UID: \"13a3ac18-a408-4c88-8dfc-d04b509941d1\") " pod="openstack/dnsmasq-dns-75c8ddd69c-gjtkj" Oct 09 14:07:28 crc kubenswrapper[4902]: I1009 14:07:28.527718 
4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/61e61af2-6a2e-4fe8-9e42-57d6303411f8-logs\") pod \"barbican-api-7479d98b6d-qm6k2\" (UID: \"61e61af2-6a2e-4fe8-9e42-57d6303411f8\") " pod="openstack/barbican-api-7479d98b6d-qm6k2" Oct 09 14:07:28 crc kubenswrapper[4902]: I1009 14:07:28.527748 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hw22p\" (UniqueName: \"kubernetes.io/projected/61e61af2-6a2e-4fe8-9e42-57d6303411f8-kube-api-access-hw22p\") pod \"barbican-api-7479d98b6d-qm6k2\" (UID: \"61e61af2-6a2e-4fe8-9e42-57d6303411f8\") " pod="openstack/barbican-api-7479d98b6d-qm6k2" Oct 09 14:07:28 crc kubenswrapper[4902]: I1009 14:07:28.527776 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/61e61af2-6a2e-4fe8-9e42-57d6303411f8-config-data-custom\") pod \"barbican-api-7479d98b6d-qm6k2\" (UID: \"61e61af2-6a2e-4fe8-9e42-57d6303411f8\") " pod="openstack/barbican-api-7479d98b6d-qm6k2" Oct 09 14:07:28 crc kubenswrapper[4902]: I1009 14:07:28.527858 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61e61af2-6a2e-4fe8-9e42-57d6303411f8-combined-ca-bundle\") pod \"barbican-api-7479d98b6d-qm6k2\" (UID: \"61e61af2-6a2e-4fe8-9e42-57d6303411f8\") " pod="openstack/barbican-api-7479d98b6d-qm6k2" Oct 09 14:07:28 crc kubenswrapper[4902]: I1009 14:07:28.527910 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/13a3ac18-a408-4c88-8dfc-d04b509941d1-dns-swift-storage-0\") pod \"dnsmasq-dns-75c8ddd69c-gjtkj\" (UID: \"13a3ac18-a408-4c88-8dfc-d04b509941d1\") " pod="openstack/dnsmasq-dns-75c8ddd69c-gjtkj" Oct 09 14:07:28 crc kubenswrapper[4902]: I1009 14:07:28.527961 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/13a3ac18-a408-4c88-8dfc-d04b509941d1-ovsdbserver-nb\") pod \"dnsmasq-dns-75c8ddd69c-gjtkj\" (UID: \"13a3ac18-a408-4c88-8dfc-d04b509941d1\") " pod="openstack/dnsmasq-dns-75c8ddd69c-gjtkj" Oct 09 14:07:28 crc kubenswrapper[4902]: I1009 14:07:28.529075 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/13a3ac18-a408-4c88-8dfc-d04b509941d1-config\") pod \"dnsmasq-dns-75c8ddd69c-gjtkj\" (UID: \"13a3ac18-a408-4c88-8dfc-d04b509941d1\") " pod="openstack/dnsmasq-dns-75c8ddd69c-gjtkj" Oct 09 14:07:28 crc kubenswrapper[4902]: I1009 14:07:28.529842 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/13a3ac18-a408-4c88-8dfc-d04b509941d1-dns-swift-storage-0\") pod \"dnsmasq-dns-75c8ddd69c-gjtkj\" (UID: \"13a3ac18-a408-4c88-8dfc-d04b509941d1\") " pod="openstack/dnsmasq-dns-75c8ddd69c-gjtkj" Oct 09 14:07:28 crc kubenswrapper[4902]: I1009 14:07:28.531676 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/13a3ac18-a408-4c88-8dfc-d04b509941d1-ovsdbserver-nb\") pod \"dnsmasq-dns-75c8ddd69c-gjtkj\" (UID: \"13a3ac18-a408-4c88-8dfc-d04b509941d1\") " pod="openstack/dnsmasq-dns-75c8ddd69c-gjtkj" Oct 09 14:07:28 crc kubenswrapper[4902]: I1009 
14:07:28.531706 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/13a3ac18-a408-4c88-8dfc-d04b509941d1-dns-svc\") pod \"dnsmasq-dns-75c8ddd69c-gjtkj\" (UID: \"13a3ac18-a408-4c88-8dfc-d04b509941d1\") " pod="openstack/dnsmasq-dns-75c8ddd69c-gjtkj" Oct 09 14:07:28 crc kubenswrapper[4902]: I1009 14:07:28.532625 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/13a3ac18-a408-4c88-8dfc-d04b509941d1-ovsdbserver-sb\") pod \"dnsmasq-dns-75c8ddd69c-gjtkj\" (UID: \"13a3ac18-a408-4c88-8dfc-d04b509941d1\") " pod="openstack/dnsmasq-dns-75c8ddd69c-gjtkj" Oct 09 14:07:28 crc kubenswrapper[4902]: I1009 14:07:28.556899 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wc6ng\" (UniqueName: \"kubernetes.io/projected/13a3ac18-a408-4c88-8dfc-d04b509941d1-kube-api-access-wc6ng\") pod \"dnsmasq-dns-75c8ddd69c-gjtkj\" (UID: \"13a3ac18-a408-4c88-8dfc-d04b509941d1\") " pod="openstack/dnsmasq-dns-75c8ddd69c-gjtkj" Oct 09 14:07:28 crc kubenswrapper[4902]: I1009 14:07:28.570845 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-7dbd9b6574-b5dht" Oct 09 14:07:28 crc kubenswrapper[4902]: I1009 14:07:28.630014 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61e61af2-6a2e-4fe8-9e42-57d6303411f8-config-data\") pod \"barbican-api-7479d98b6d-qm6k2\" (UID: \"61e61af2-6a2e-4fe8-9e42-57d6303411f8\") " pod="openstack/barbican-api-7479d98b6d-qm6k2" Oct 09 14:07:28 crc kubenswrapper[4902]: I1009 14:07:28.630229 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/61e61af2-6a2e-4fe8-9e42-57d6303411f8-logs\") pod \"barbican-api-7479d98b6d-qm6k2\" (UID: \"61e61af2-6a2e-4fe8-9e42-57d6303411f8\") " pod="openstack/barbican-api-7479d98b6d-qm6k2" Oct 09 14:07:28 crc kubenswrapper[4902]: I1009 14:07:28.630264 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hw22p\" (UniqueName: \"kubernetes.io/projected/61e61af2-6a2e-4fe8-9e42-57d6303411f8-kube-api-access-hw22p\") pod \"barbican-api-7479d98b6d-qm6k2\" (UID: \"61e61af2-6a2e-4fe8-9e42-57d6303411f8\") " pod="openstack/barbican-api-7479d98b6d-qm6k2" Oct 09 14:07:28 crc kubenswrapper[4902]: I1009 14:07:28.630313 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/61e61af2-6a2e-4fe8-9e42-57d6303411f8-config-data-custom\") pod \"barbican-api-7479d98b6d-qm6k2\" (UID: \"61e61af2-6a2e-4fe8-9e42-57d6303411f8\") " pod="openstack/barbican-api-7479d98b6d-qm6k2" Oct 09 14:07:28 crc kubenswrapper[4902]: I1009 14:07:28.630426 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61e61af2-6a2e-4fe8-9e42-57d6303411f8-combined-ca-bundle\") pod \"barbican-api-7479d98b6d-qm6k2\" (UID: \"61e61af2-6a2e-4fe8-9e42-57d6303411f8\") " pod="openstack/barbican-api-7479d98b6d-qm6k2" Oct 09 14:07:28 crc kubenswrapper[4902]: I1009 14:07:28.631878 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/61e61af2-6a2e-4fe8-9e42-57d6303411f8-logs\") pod \"barbican-api-7479d98b6d-qm6k2\" (UID: 
\"61e61af2-6a2e-4fe8-9e42-57d6303411f8\") " pod="openstack/barbican-api-7479d98b6d-qm6k2" Oct 09 14:07:28 crc kubenswrapper[4902]: I1009 14:07:28.638961 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61e61af2-6a2e-4fe8-9e42-57d6303411f8-combined-ca-bundle\") pod \"barbican-api-7479d98b6d-qm6k2\" (UID: \"61e61af2-6a2e-4fe8-9e42-57d6303411f8\") " pod="openstack/barbican-api-7479d98b6d-qm6k2" Oct 09 14:07:28 crc kubenswrapper[4902]: I1009 14:07:28.639807 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/61e61af2-6a2e-4fe8-9e42-57d6303411f8-config-data-custom\") pod \"barbican-api-7479d98b6d-qm6k2\" (UID: \"61e61af2-6a2e-4fe8-9e42-57d6303411f8\") " pod="openstack/barbican-api-7479d98b6d-qm6k2" Oct 09 14:07:28 crc kubenswrapper[4902]: I1009 14:07:28.640677 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61e61af2-6a2e-4fe8-9e42-57d6303411f8-config-data\") pod \"barbican-api-7479d98b6d-qm6k2\" (UID: \"61e61af2-6a2e-4fe8-9e42-57d6303411f8\") " pod="openstack/barbican-api-7479d98b6d-qm6k2" Oct 09 14:07:28 crc kubenswrapper[4902]: I1009 14:07:28.654579 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hw22p\" (UniqueName: \"kubernetes.io/projected/61e61af2-6a2e-4fe8-9e42-57d6303411f8-kube-api-access-hw22p\") pod \"barbican-api-7479d98b6d-qm6k2\" (UID: \"61e61af2-6a2e-4fe8-9e42-57d6303411f8\") " pod="openstack/barbican-api-7479d98b6d-qm6k2" Oct 09 14:07:28 crc kubenswrapper[4902]: I1009 14:07:28.810823 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75c8ddd69c-gjtkj" Oct 09 14:07:28 crc kubenswrapper[4902]: I1009 14:07:28.853565 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7479d98b6d-qm6k2" Oct 09 14:07:28 crc kubenswrapper[4902]: I1009 14:07:28.944134 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-896cb696f-kkg85"] Oct 09 14:07:29 crc kubenswrapper[4902]: I1009 14:07:29.268139 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-7dbd9b6574-b5dht"] Oct 09 14:07:29 crc kubenswrapper[4902]: I1009 14:07:29.474130 4902 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-4fhrn" Oct 09 14:07:29 crc kubenswrapper[4902]: W1009 14:07:29.533563 4902 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod61e61af2_6a2e_4fe8_9e42_57d6303411f8.slice/crio-d8c5690a0269bfabdbb1ea9608d3446374b0314787e97a1624041ba062f84084 WatchSource:0}: Error finding container d8c5690a0269bfabdbb1ea9608d3446374b0314787e97a1624041ba062f84084: Status 404 returned error can't find the container with id d8c5690a0269bfabdbb1ea9608d3446374b0314787e97a1624041ba062f84084 Oct 09 14:07:29 crc kubenswrapper[4902]: I1009 14:07:29.558127 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e0ea72ba-e66d-4400-a1d9-8d4cb8a94e87-db-sync-config-data\") pod \"e0ea72ba-e66d-4400-a1d9-8d4cb8a94e87\" (UID: \"e0ea72ba-e66d-4400-a1d9-8d4cb8a94e87\") " Oct 09 14:07:29 crc kubenswrapper[4902]: I1009 14:07:29.558168 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0ea72ba-e66d-4400-a1d9-8d4cb8a94e87-config-data\") pod \"e0ea72ba-e66d-4400-a1d9-8d4cb8a94e87\" (UID: \"e0ea72ba-e66d-4400-a1d9-8d4cb8a94e87\") " Oct 09 14:07:29 crc kubenswrapper[4902]: I1009 14:07:29.558188 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e0ea72ba-e66d-4400-a1d9-8d4cb8a94e87-etc-machine-id\") pod \"e0ea72ba-e66d-4400-a1d9-8d4cb8a94e87\" (UID: \"e0ea72ba-e66d-4400-a1d9-8d4cb8a94e87\") " Oct 09 14:07:29 crc kubenswrapper[4902]: I1009 14:07:29.558242 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e0ea72ba-e66d-4400-a1d9-8d4cb8a94e87-scripts\") pod \"e0ea72ba-e66d-4400-a1d9-8d4cb8a94e87\" (UID: \"e0ea72ba-e66d-4400-a1d9-8d4cb8a94e87\") " Oct 09 14:07:29 crc kubenswrapper[4902]: I1009 14:07:29.558326 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2hpn2\" (UniqueName: \"kubernetes.io/projected/e0ea72ba-e66d-4400-a1d9-8d4cb8a94e87-kube-api-access-2hpn2\") pod \"e0ea72ba-e66d-4400-a1d9-8d4cb8a94e87\" (UID: \"e0ea72ba-e66d-4400-a1d9-8d4cb8a94e87\") " Oct 09 14:07:29 crc kubenswrapper[4902]: I1009 14:07:29.558461 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0ea72ba-e66d-4400-a1d9-8d4cb8a94e87-combined-ca-bundle\") pod \"e0ea72ba-e66d-4400-a1d9-8d4cb8a94e87\" (UID: \"e0ea72ba-e66d-4400-a1d9-8d4cb8a94e87\") " Oct 09 14:07:29 crc kubenswrapper[4902]: I1009 14:07:29.559387 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e0ea72ba-e66d-4400-a1d9-8d4cb8a94e87-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "e0ea72ba-e66d-4400-a1d9-8d4cb8a94e87" (UID: "e0ea72ba-e66d-4400-a1d9-8d4cb8a94e87"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 09 14:07:29 crc kubenswrapper[4902]: I1009 14:07:29.568589 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0ea72ba-e66d-4400-a1d9-8d4cb8a94e87-scripts" (OuterVolumeSpecName: "scripts") pod "e0ea72ba-e66d-4400-a1d9-8d4cb8a94e87" (UID: "e0ea72ba-e66d-4400-a1d9-8d4cb8a94e87"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:07:29 crc kubenswrapper[4902]: I1009 14:07:29.570523 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7479d98b6d-qm6k2"] Oct 09 14:07:29 crc kubenswrapper[4902]: I1009 14:07:29.570576 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0ea72ba-e66d-4400-a1d9-8d4cb8a94e87-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "e0ea72ba-e66d-4400-a1d9-8d4cb8a94e87" (UID: "e0ea72ba-e66d-4400-a1d9-8d4cb8a94e87"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:07:29 crc kubenswrapper[4902]: I1009 14:07:29.571132 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0ea72ba-e66d-4400-a1d9-8d4cb8a94e87-kube-api-access-2hpn2" (OuterVolumeSpecName: "kube-api-access-2hpn2") pod "e0ea72ba-e66d-4400-a1d9-8d4cb8a94e87" (UID: "e0ea72ba-e66d-4400-a1d9-8d4cb8a94e87"). InnerVolumeSpecName "kube-api-access-2hpn2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 14:07:29 crc kubenswrapper[4902]: I1009 14:07:29.594588 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0ea72ba-e66d-4400-a1d9-8d4cb8a94e87-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e0ea72ba-e66d-4400-a1d9-8d4cb8a94e87" (UID: "e0ea72ba-e66d-4400-a1d9-8d4cb8a94e87"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:07:29 crc kubenswrapper[4902]: I1009 14:07:29.638133 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 09 14:07:29 crc kubenswrapper[4902]: I1009 14:07:29.658730 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0ea72ba-e66d-4400-a1d9-8d4cb8a94e87-config-data" (OuterVolumeSpecName: "config-data") pod "e0ea72ba-e66d-4400-a1d9-8d4cb8a94e87" (UID: "e0ea72ba-e66d-4400-a1d9-8d4cb8a94e87"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:07:29 crc kubenswrapper[4902]: I1009 14:07:29.664952 4902 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0ea72ba-e66d-4400-a1d9-8d4cb8a94e87-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 14:07:29 crc kubenswrapper[4902]: I1009 14:07:29.664979 4902 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e0ea72ba-e66d-4400-a1d9-8d4cb8a94e87-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 14:07:29 crc kubenswrapper[4902]: I1009 14:07:29.664992 4902 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0ea72ba-e66d-4400-a1d9-8d4cb8a94e87-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 14:07:29 crc kubenswrapper[4902]: I1009 14:07:29.665001 4902 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e0ea72ba-e66d-4400-a1d9-8d4cb8a94e87-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 09 14:07:29 crc kubenswrapper[4902]: I1009 14:07:29.665010 4902 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e0ea72ba-e66d-4400-a1d9-8d4cb8a94e87-scripts\") on node \"crc\" DevicePath \"\"" Oct 09 14:07:29 crc kubenswrapper[4902]: I1009 14:07:29.665019 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2hpn2\" (UniqueName: \"kubernetes.io/projected/e0ea72ba-e66d-4400-a1d9-8d4cb8a94e87-kube-api-access-2hpn2\") on node \"crc\" DevicePath \"\"" Oct 09 14:07:29 crc kubenswrapper[4902]: I1009 14:07:29.700127 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-75c8ddd69c-gjtkj"] Oct 09 14:07:29 crc kubenswrapper[4902]: I1009 14:07:29.843837 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-896cb696f-kkg85" event={"ID":"fe538ee2-2e8c-406f-8e70-bc56325ec408","Type":"ContainerStarted","Data":"5419c1f37caf344f58c98c8eac51449270810018d6bffeb2ad3462c5dfdf8f41"} Oct 09 14:07:29 crc kubenswrapper[4902]: I1009 14:07:29.859682 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75c8ddd69c-gjtkj" event={"ID":"13a3ac18-a408-4c88-8dfc-d04b509941d1","Type":"ContainerStarted","Data":"af432a2e114c80ad87bd5d146ee2170f1addf34d7892b09269395f921e449604"} Oct 09 14:07:29 crc kubenswrapper[4902]: I1009 14:07:29.885007 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7479d98b6d-qm6k2" event={"ID":"61e61af2-6a2e-4fe8-9e42-57d6303411f8","Type":"ContainerStarted","Data":"d8c5690a0269bfabdbb1ea9608d3446374b0314787e97a1624041ba062f84084"} Oct 09 14:07:29 crc kubenswrapper[4902]: I1009 14:07:29.911851 4902 generic.go:334] "Generic (PLEG): container finished" podID="6c5490ed-b380-45b8-b528-e9cab5c79e62" containerID="f7475656ccb20336e3b423389b7188ec718c5ae07b185134adb6e5e9966b5384" exitCode=0 Oct 09 14:07:29 crc kubenswrapper[4902]: I1009 14:07:29.912111 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6c5490ed-b380-45b8-b528-e9cab5c79e62","Type":"ContainerDied","Data":"f7475656ccb20336e3b423389b7188ec718c5ae07b185134adb6e5e9966b5384"} Oct 09 14:07:29 crc kubenswrapper[4902]: I1009 14:07:29.926097 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-jn679"] Oct 09 14:07:29 crc kubenswrapper[4902]: E1009 
14:07:29.926631 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0ea72ba-e66d-4400-a1d9-8d4cb8a94e87" containerName="cinder-db-sync" Oct 09 14:07:29 crc kubenswrapper[4902]: I1009 14:07:29.943494 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0ea72ba-e66d-4400-a1d9-8d4cb8a94e87" containerName="cinder-db-sync" Oct 09 14:07:29 crc kubenswrapper[4902]: I1009 14:07:29.943994 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0ea72ba-e66d-4400-a1d9-8d4cb8a94e87" containerName="cinder-db-sync" Oct 09 14:07:29 crc kubenswrapper[4902]: I1009 14:07:29.944798 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-jn679"] Oct 09 14:07:29 crc kubenswrapper[4902]: I1009 14:07:29.944929 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-jn679" Oct 09 14:07:29 crc kubenswrapper[4902]: I1009 14:07:29.949693 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7dbd9b6574-b5dht" event={"ID":"11b3f7c7-66a5-485c-922e-b5568e2f9f1c","Type":"ContainerStarted","Data":"85222ca30308f79bf137b8271419914f8ebe781f98bb8873e69d0b90e3925ed9"} Oct 09 14:07:29 crc kubenswrapper[4902]: I1009 14:07:29.968112 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-4fhrn" event={"ID":"e0ea72ba-e66d-4400-a1d9-8d4cb8a94e87","Type":"ContainerDied","Data":"270b18a11addca0ddd7926e5b50aed580dfaa03d49aef744edd3cd7c516e33ab"} Oct 09 14:07:29 crc kubenswrapper[4902]: I1009 14:07:29.968157 4902 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="270b18a11addca0ddd7926e5b50aed580dfaa03d49aef744edd3cd7c516e33ab" Oct 09 14:07:29 crc kubenswrapper[4902]: I1009 14:07:29.968265 4902 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-4fhrn" Oct 09 14:07:29 crc kubenswrapper[4902]: I1009 14:07:29.979720 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e5dd917a-e7aa-4092-906a-321ba2f744d8","Type":"ContainerStarted","Data":"e7f24a4e051a495560a0dd23e83b7fb1fd844bfdb6b9c51f6dde90c1ae50d2e4"} Oct 09 14:07:29 crc kubenswrapper[4902]: I1009 14:07:29.984223 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rwf2t\" (UniqueName: \"kubernetes.io/projected/80011324-6959-416f-a2e0-1ea80e7b32fd-kube-api-access-rwf2t\") pod \"nova-api-db-create-jn679\" (UID: \"80011324-6959-416f-a2e0-1ea80e7b32fd\") " pod="openstack/nova-api-db-create-jn679" Oct 09 14:07:30 crc kubenswrapper[4902]: I1009 14:07:30.032599 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-nr94d"] Oct 09 14:07:30 crc kubenswrapper[4902]: I1009 14:07:30.038957 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-nr94d" Oct 09 14:07:30 crc kubenswrapper[4902]: I1009 14:07:30.092605 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rwf2t\" (UniqueName: \"kubernetes.io/projected/80011324-6959-416f-a2e0-1ea80e7b32fd-kube-api-access-rwf2t\") pod \"nova-api-db-create-jn679\" (UID: \"80011324-6959-416f-a2e0-1ea80e7b32fd\") " pod="openstack/nova-api-db-create-jn679" Oct 09 14:07:30 crc kubenswrapper[4902]: I1009 14:07:30.092771 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mvvmm\" (UniqueName: \"kubernetes.io/projected/ace7c5ce-f1fc-4600-9764-0e357b80f849-kube-api-access-mvvmm\") pod \"nova-cell0-db-create-nr94d\" (UID: \"ace7c5ce-f1fc-4600-9764-0e357b80f849\") " pod="openstack/nova-cell0-db-create-nr94d" Oct 09 14:07:30 crc kubenswrapper[4902]: I1009 14:07:30.127102 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-nr94d"] Oct 09 14:07:30 crc kubenswrapper[4902]: I1009 14:07:30.158796 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rwf2t\" (UniqueName: \"kubernetes.io/projected/80011324-6959-416f-a2e0-1ea80e7b32fd-kube-api-access-rwf2t\") pod \"nova-api-db-create-jn679\" (UID: \"80011324-6959-416f-a2e0-1ea80e7b32fd\") " pod="openstack/nova-api-db-create-jn679" Oct 09 14:07:30 crc kubenswrapper[4902]: I1009 14:07:30.191219 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-zjfzf"] Oct 09 14:07:30 crc kubenswrapper[4902]: I1009 14:07:30.203228 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-zjfzf" Oct 09 14:07:30 crc kubenswrapper[4902]: I1009 14:07:30.203644 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mvvmm\" (UniqueName: \"kubernetes.io/projected/ace7c5ce-f1fc-4600-9764-0e357b80f849-kube-api-access-mvvmm\") pod \"nova-cell0-db-create-nr94d\" (UID: \"ace7c5ce-f1fc-4600-9764-0e357b80f849\") " pod="openstack/nova-cell0-db-create-nr94d" Oct 09 14:07:30 crc kubenswrapper[4902]: I1009 14:07:30.225674 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-zjfzf"] Oct 09 14:07:30 crc kubenswrapper[4902]: I1009 14:07:30.239168 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Oct 09 14:07:30 crc kubenswrapper[4902]: I1009 14:07:30.241368 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 09 14:07:30 crc kubenswrapper[4902]: I1009 14:07:30.254844 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 09 14:07:30 crc kubenswrapper[4902]: I1009 14:07:30.276819 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-jn679" Oct 09 14:07:30 crc kubenswrapper[4902]: I1009 14:07:30.300592 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mvvmm\" (UniqueName: \"kubernetes.io/projected/ace7c5ce-f1fc-4600-9764-0e357b80f849-kube-api-access-mvvmm\") pod \"nova-cell0-db-create-nr94d\" (UID: \"ace7c5ce-f1fc-4600-9764-0e357b80f849\") " pod="openstack/nova-cell0-db-create-nr94d" Oct 09 14:07:30 crc kubenswrapper[4902]: I1009 14:07:30.300684 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Oct 09 14:07:30 crc kubenswrapper[4902]: I1009 14:07:30.300961 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-5rqgj" Oct 09 14:07:30 crc kubenswrapper[4902]: I1009 14:07:30.301133 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Oct 09 14:07:30 crc kubenswrapper[4902]: I1009 14:07:30.301881 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Oct 09 14:07:30 crc kubenswrapper[4902]: I1009 14:07:30.306008 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f5566f62-4d11-4503-ab45-7f2d727bc397-config-data\") pod \"cinder-scheduler-0\" (UID: \"f5566f62-4d11-4503-ab45-7f2d727bc397\") " pod="openstack/cinder-scheduler-0" Oct 09 14:07:30 crc kubenswrapper[4902]: I1009 14:07:30.306056 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f5566f62-4d11-4503-ab45-7f2d727bc397-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"f5566f62-4d11-4503-ab45-7f2d727bc397\") " pod="openstack/cinder-scheduler-0" Oct 09 14:07:30 crc kubenswrapper[4902]: I1009 14:07:30.306077 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8wls5\" (UniqueName: \"kubernetes.io/projected/9bdf36f9-3d35-4ccd-9d32-86c5ab08b65a-kube-api-access-8wls5\") pod \"nova-cell1-db-create-zjfzf\" (UID: \"9bdf36f9-3d35-4ccd-9d32-86c5ab08b65a\") " pod="openstack/nova-cell1-db-create-zjfzf" Oct 09 14:07:30 crc kubenswrapper[4902]: I1009 14:07:30.306119 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fvskq\" (UniqueName: \"kubernetes.io/projected/f5566f62-4d11-4503-ab45-7f2d727bc397-kube-api-access-fvskq\") pod \"cinder-scheduler-0\" (UID: \"f5566f62-4d11-4503-ab45-7f2d727bc397\") " pod="openstack/cinder-scheduler-0" Oct 09 14:07:30 crc kubenswrapper[4902]: I1009 14:07:30.306138 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f5566f62-4d11-4503-ab45-7f2d727bc397-scripts\") pod \"cinder-scheduler-0\" (UID: \"f5566f62-4d11-4503-ab45-7f2d727bc397\") " pod="openstack/cinder-scheduler-0" Oct 09 14:07:30 crc kubenswrapper[4902]: I1009 14:07:30.306159 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f5566f62-4d11-4503-ab45-7f2d727bc397-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"f5566f62-4d11-4503-ab45-7f2d727bc397\") " pod="openstack/cinder-scheduler-0" Oct 09 14:07:30 crc kubenswrapper[4902]: I1009 14:07:30.306199 4902 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5566f62-4d11-4503-ab45-7f2d727bc397-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"f5566f62-4d11-4503-ab45-7f2d727bc397\") " pod="openstack/cinder-scheduler-0" Oct 09 14:07:30 crc kubenswrapper[4902]: I1009 14:07:30.347561 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-75c8ddd69c-gjtkj"] Oct 09 14:07:30 crc kubenswrapper[4902]: I1009 14:07:30.363059 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-s2w6w"] Oct 09 14:07:30 crc kubenswrapper[4902]: I1009 14:07:30.364601 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5784cf869f-s2w6w" Oct 09 14:07:30 crc kubenswrapper[4902]: I1009 14:07:30.376132 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-s2w6w"] Oct 09 14:07:30 crc kubenswrapper[4902]: I1009 14:07:30.417383 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fvskq\" (UniqueName: \"kubernetes.io/projected/f5566f62-4d11-4503-ab45-7f2d727bc397-kube-api-access-fvskq\") pod \"cinder-scheduler-0\" (UID: \"f5566f62-4d11-4503-ab45-7f2d727bc397\") " pod="openstack/cinder-scheduler-0" Oct 09 14:07:30 crc kubenswrapper[4902]: I1009 14:07:30.417463 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f5566f62-4d11-4503-ab45-7f2d727bc397-scripts\") pod \"cinder-scheduler-0\" (UID: \"f5566f62-4d11-4503-ab45-7f2d727bc397\") " pod="openstack/cinder-scheduler-0" Oct 09 14:07:30 crc kubenswrapper[4902]: I1009 14:07:30.417490 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f5566f62-4d11-4503-ab45-7f2d727bc397-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"f5566f62-4d11-4503-ab45-7f2d727bc397\") " pod="openstack/cinder-scheduler-0" Oct 09 14:07:30 crc kubenswrapper[4902]: I1009 14:07:30.417535 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5566f62-4d11-4503-ab45-7f2d727bc397-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"f5566f62-4d11-4503-ab45-7f2d727bc397\") " pod="openstack/cinder-scheduler-0" Oct 09 14:07:30 crc kubenswrapper[4902]: I1009 14:07:30.417646 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/abbbf9c3-be8f-4199-9f68-626d18441730-dns-swift-storage-0\") pod \"dnsmasq-dns-5784cf869f-s2w6w\" (UID: \"abbbf9c3-be8f-4199-9f68-626d18441730\") " pod="openstack/dnsmasq-dns-5784cf869f-s2w6w" Oct 09 14:07:30 crc kubenswrapper[4902]: I1009 14:07:30.417685 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wqg42\" (UniqueName: \"kubernetes.io/projected/abbbf9c3-be8f-4199-9f68-626d18441730-kube-api-access-wqg42\") pod \"dnsmasq-dns-5784cf869f-s2w6w\" (UID: \"abbbf9c3-be8f-4199-9f68-626d18441730\") " pod="openstack/dnsmasq-dns-5784cf869f-s2w6w" Oct 09 14:07:30 crc kubenswrapper[4902]: I1009 14:07:30.417720 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/abbbf9c3-be8f-4199-9f68-626d18441730-ovsdbserver-sb\") pod \"dnsmasq-dns-5784cf869f-s2w6w\" (UID: \"abbbf9c3-be8f-4199-9f68-626d18441730\") " pod="openstack/dnsmasq-dns-5784cf869f-s2w6w" Oct 09 14:07:30 crc kubenswrapper[4902]: I1009 14:07:30.417746 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/abbbf9c3-be8f-4199-9f68-626d18441730-dns-svc\") pod \"dnsmasq-dns-5784cf869f-s2w6w\" (UID: \"abbbf9c3-be8f-4199-9f68-626d18441730\") " pod="openstack/dnsmasq-dns-5784cf869f-s2w6w" Oct 09 14:07:30 crc kubenswrapper[4902]: I1009 14:07:30.417788 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f5566f62-4d11-4503-ab45-7f2d727bc397-config-data\") pod \"cinder-scheduler-0\" (UID: \"f5566f62-4d11-4503-ab45-7f2d727bc397\") " pod="openstack/cinder-scheduler-0" Oct 09 14:07:30 crc kubenswrapper[4902]: I1009 14:07:30.417811 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/abbbf9c3-be8f-4199-9f68-626d18441730-ovsdbserver-nb\") pod \"dnsmasq-dns-5784cf869f-s2w6w\" (UID: \"abbbf9c3-be8f-4199-9f68-626d18441730\") " pod="openstack/dnsmasq-dns-5784cf869f-s2w6w" Oct 09 14:07:30 crc kubenswrapper[4902]: I1009 14:07:30.417831 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f5566f62-4d11-4503-ab45-7f2d727bc397-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"f5566f62-4d11-4503-ab45-7f2d727bc397\") " pod="openstack/cinder-scheduler-0" Oct 09 14:07:30 crc kubenswrapper[4902]: I1009 14:07:30.417847 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8wls5\" (UniqueName: \"kubernetes.io/projected/9bdf36f9-3d35-4ccd-9d32-86c5ab08b65a-kube-api-access-8wls5\") pod \"nova-cell1-db-create-zjfzf\" (UID: \"9bdf36f9-3d35-4ccd-9d32-86c5ab08b65a\") " pod="openstack/nova-cell1-db-create-zjfzf" Oct 09 14:07:30 crc kubenswrapper[4902]: I1009 14:07:30.417878 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/abbbf9c3-be8f-4199-9f68-626d18441730-config\") pod \"dnsmasq-dns-5784cf869f-s2w6w\" (UID: \"abbbf9c3-be8f-4199-9f68-626d18441730\") " pod="openstack/dnsmasq-dns-5784cf869f-s2w6w" Oct 09 14:07:30 crc kubenswrapper[4902]: I1009 14:07:30.421018 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f5566f62-4d11-4503-ab45-7f2d727bc397-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"f5566f62-4d11-4503-ab45-7f2d727bc397\") " pod="openstack/cinder-scheduler-0" Oct 09 14:07:30 crc kubenswrapper[4902]: I1009 14:07:30.426784 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f5566f62-4d11-4503-ab45-7f2d727bc397-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"f5566f62-4d11-4503-ab45-7f2d727bc397\") " pod="openstack/cinder-scheduler-0" Oct 09 14:07:30 crc kubenswrapper[4902]: I1009 14:07:30.426865 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Oct 09 14:07:30 crc kubenswrapper[4902]: I1009 14:07:30.428988 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 09 14:07:30 crc kubenswrapper[4902]: I1009 14:07:30.431332 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Oct 09 14:07:30 crc kubenswrapper[4902]: I1009 14:07:30.435706 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5566f62-4d11-4503-ab45-7f2d727bc397-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"f5566f62-4d11-4503-ab45-7f2d727bc397\") " pod="openstack/cinder-scheduler-0" Oct 09 14:07:30 crc kubenswrapper[4902]: I1009 14:07:30.447961 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f5566f62-4d11-4503-ab45-7f2d727bc397-config-data\") pod \"cinder-scheduler-0\" (UID: \"f5566f62-4d11-4503-ab45-7f2d727bc397\") " pod="openstack/cinder-scheduler-0" Oct 09 14:07:30 crc kubenswrapper[4902]: I1009 14:07:30.448463 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f5566f62-4d11-4503-ab45-7f2d727bc397-scripts\") pod \"cinder-scheduler-0\" (UID: \"f5566f62-4d11-4503-ab45-7f2d727bc397\") " pod="openstack/cinder-scheduler-0" Oct 09 14:07:30 crc kubenswrapper[4902]: I1009 14:07:30.450149 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fvskq\" (UniqueName: \"kubernetes.io/projected/f5566f62-4d11-4503-ab45-7f2d727bc397-kube-api-access-fvskq\") pod \"cinder-scheduler-0\" (UID: \"f5566f62-4d11-4503-ab45-7f2d727bc397\") " pod="openstack/cinder-scheduler-0" Oct 09 14:07:30 crc kubenswrapper[4902]: I1009 14:07:30.463132 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 09 14:07:30 crc kubenswrapper[4902]: I1009 14:07:30.469827 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8wls5\" (UniqueName: \"kubernetes.io/projected/9bdf36f9-3d35-4ccd-9d32-86c5ab08b65a-kube-api-access-8wls5\") pod \"nova-cell1-db-create-zjfzf\" (UID: \"9bdf36f9-3d35-4ccd-9d32-86c5ab08b65a\") " pod="openstack/nova-cell1-db-create-zjfzf" Oct 09 14:07:30 crc kubenswrapper[4902]: I1009 14:07:30.520438 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e4af9da4-ae7e-4c75-b670-fa5246095383-config-data-custom\") pod \"cinder-api-0\" (UID: \"e4af9da4-ae7e-4c75-b670-fa5246095383\") " pod="openstack/cinder-api-0" Oct 09 14:07:30 crc kubenswrapper[4902]: I1009 14:07:30.520525 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4af9da4-ae7e-4c75-b670-fa5246095383-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"e4af9da4-ae7e-4c75-b670-fa5246095383\") " pod="openstack/cinder-api-0" Oct 09 14:07:30 crc kubenswrapper[4902]: I1009 14:07:30.520549 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e4af9da4-ae7e-4c75-b670-fa5246095383-scripts\") pod \"cinder-api-0\" (UID: \"e4af9da4-ae7e-4c75-b670-fa5246095383\") " pod="openstack/cinder-api-0" Oct 09 14:07:30 crc kubenswrapper[4902]: I1009 14:07:30.521436 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/e4af9da4-ae7e-4c75-b670-fa5246095383-etc-machine-id\") pod \"cinder-api-0\" (UID: \"e4af9da4-ae7e-4c75-b670-fa5246095383\") " pod="openstack/cinder-api-0" Oct 09 14:07:30 crc kubenswrapper[4902]: I1009 14:07:30.521469 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4af9da4-ae7e-4c75-b670-fa5246095383-config-data\") pod \"cinder-api-0\" (UID: \"e4af9da4-ae7e-4c75-b670-fa5246095383\") " pod="openstack/cinder-api-0" Oct 09 14:07:30 crc kubenswrapper[4902]: I1009 14:07:30.521531 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/abbbf9c3-be8f-4199-9f68-626d18441730-dns-swift-storage-0\") pod \"dnsmasq-dns-5784cf869f-s2w6w\" (UID: \"abbbf9c3-be8f-4199-9f68-626d18441730\") " pod="openstack/dnsmasq-dns-5784cf869f-s2w6w" Oct 09 14:07:30 crc kubenswrapper[4902]: I1009 14:07:30.521579 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wqg42\" (UniqueName: \"kubernetes.io/projected/abbbf9c3-be8f-4199-9f68-626d18441730-kube-api-access-wqg42\") pod \"dnsmasq-dns-5784cf869f-s2w6w\" (UID: \"abbbf9c3-be8f-4199-9f68-626d18441730\") " pod="openstack/dnsmasq-dns-5784cf869f-s2w6w" Oct 09 14:07:30 crc kubenswrapper[4902]: I1009 14:07:30.521626 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/abbbf9c3-be8f-4199-9f68-626d18441730-ovsdbserver-sb\") pod \"dnsmasq-dns-5784cf869f-s2w6w\" (UID: \"abbbf9c3-be8f-4199-9f68-626d18441730\") " pod="openstack/dnsmasq-dns-5784cf869f-s2w6w" Oct 09 14:07:30 crc kubenswrapper[4902]: I1009 14:07:30.521666 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/abbbf9c3-be8f-4199-9f68-626d18441730-dns-svc\") pod \"dnsmasq-dns-5784cf869f-s2w6w\" (UID: \"abbbf9c3-be8f-4199-9f68-626d18441730\") " pod="openstack/dnsmasq-dns-5784cf869f-s2w6w" Oct 09 14:07:30 crc kubenswrapper[4902]: I1009 14:07:30.521705 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-czstb\" (UniqueName: \"kubernetes.io/projected/e4af9da4-ae7e-4c75-b670-fa5246095383-kube-api-access-czstb\") pod \"cinder-api-0\" (UID: \"e4af9da4-ae7e-4c75-b670-fa5246095383\") " pod="openstack/cinder-api-0" Oct 09 14:07:30 crc kubenswrapper[4902]: I1009 14:07:30.521757 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/abbbf9c3-be8f-4199-9f68-626d18441730-ovsdbserver-nb\") pod \"dnsmasq-dns-5784cf869f-s2w6w\" (UID: \"abbbf9c3-be8f-4199-9f68-626d18441730\") " pod="openstack/dnsmasq-dns-5784cf869f-s2w6w" Oct 09 14:07:30 crc kubenswrapper[4902]: I1009 14:07:30.521799 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e4af9da4-ae7e-4c75-b670-fa5246095383-logs\") pod \"cinder-api-0\" (UID: \"e4af9da4-ae7e-4c75-b670-fa5246095383\") " pod="openstack/cinder-api-0" Oct 09 14:07:30 crc kubenswrapper[4902]: I1009 14:07:30.521824 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/abbbf9c3-be8f-4199-9f68-626d18441730-config\") pod \"dnsmasq-dns-5784cf869f-s2w6w\" (UID: 
\"abbbf9c3-be8f-4199-9f68-626d18441730\") " pod="openstack/dnsmasq-dns-5784cf869f-s2w6w" Oct 09 14:07:30 crc kubenswrapper[4902]: I1009 14:07:30.522602 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/abbbf9c3-be8f-4199-9f68-626d18441730-dns-swift-storage-0\") pod \"dnsmasq-dns-5784cf869f-s2w6w\" (UID: \"abbbf9c3-be8f-4199-9f68-626d18441730\") " pod="openstack/dnsmasq-dns-5784cf869f-s2w6w" Oct 09 14:07:30 crc kubenswrapper[4902]: I1009 14:07:30.523293 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/abbbf9c3-be8f-4199-9f68-626d18441730-ovsdbserver-nb\") pod \"dnsmasq-dns-5784cf869f-s2w6w\" (UID: \"abbbf9c3-be8f-4199-9f68-626d18441730\") " pod="openstack/dnsmasq-dns-5784cf869f-s2w6w" Oct 09 14:07:30 crc kubenswrapper[4902]: I1009 14:07:30.525530 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/abbbf9c3-be8f-4199-9f68-626d18441730-ovsdbserver-sb\") pod \"dnsmasq-dns-5784cf869f-s2w6w\" (UID: \"abbbf9c3-be8f-4199-9f68-626d18441730\") " pod="openstack/dnsmasq-dns-5784cf869f-s2w6w" Oct 09 14:07:30 crc kubenswrapper[4902]: I1009 14:07:30.525894 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/abbbf9c3-be8f-4199-9f68-626d18441730-config\") pod \"dnsmasq-dns-5784cf869f-s2w6w\" (UID: \"abbbf9c3-be8f-4199-9f68-626d18441730\") " pod="openstack/dnsmasq-dns-5784cf869f-s2w6w" Oct 09 14:07:30 crc kubenswrapper[4902]: I1009 14:07:30.526038 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/abbbf9c3-be8f-4199-9f68-626d18441730-dns-svc\") pod \"dnsmasq-dns-5784cf869f-s2w6w\" (UID: \"abbbf9c3-be8f-4199-9f68-626d18441730\") " pod="openstack/dnsmasq-dns-5784cf869f-s2w6w" Oct 09 14:07:30 crc kubenswrapper[4902]: I1009 14:07:30.544634 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-nr94d" Oct 09 14:07:30 crc kubenswrapper[4902]: I1009 14:07:30.545673 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wqg42\" (UniqueName: \"kubernetes.io/projected/abbbf9c3-be8f-4199-9f68-626d18441730-kube-api-access-wqg42\") pod \"dnsmasq-dns-5784cf869f-s2w6w\" (UID: \"abbbf9c3-be8f-4199-9f68-626d18441730\") " pod="openstack/dnsmasq-dns-5784cf869f-s2w6w" Oct 09 14:07:30 crc kubenswrapper[4902]: I1009 14:07:30.613272 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-zjfzf" Oct 09 14:07:30 crc kubenswrapper[4902]: I1009 14:07:30.630528 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e4af9da4-ae7e-4c75-b670-fa5246095383-config-data-custom\") pod \"cinder-api-0\" (UID: \"e4af9da4-ae7e-4c75-b670-fa5246095383\") " pod="openstack/cinder-api-0" Oct 09 14:07:30 crc kubenswrapper[4902]: I1009 14:07:30.630600 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4af9da4-ae7e-4c75-b670-fa5246095383-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"e4af9da4-ae7e-4c75-b670-fa5246095383\") " pod="openstack/cinder-api-0" Oct 09 14:07:30 crc kubenswrapper[4902]: I1009 14:07:30.630624 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e4af9da4-ae7e-4c75-b670-fa5246095383-scripts\") pod \"cinder-api-0\" (UID: \"e4af9da4-ae7e-4c75-b670-fa5246095383\") " pod="openstack/cinder-api-0" Oct 09 14:07:30 crc kubenswrapper[4902]: I1009 14:07:30.630673 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e4af9da4-ae7e-4c75-b670-fa5246095383-etc-machine-id\") pod \"cinder-api-0\" (UID: \"e4af9da4-ae7e-4c75-b670-fa5246095383\") " pod="openstack/cinder-api-0" Oct 09 14:07:30 crc kubenswrapper[4902]: I1009 14:07:30.630693 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4af9da4-ae7e-4c75-b670-fa5246095383-config-data\") pod \"cinder-api-0\" (UID: \"e4af9da4-ae7e-4c75-b670-fa5246095383\") " pod="openstack/cinder-api-0" Oct 09 14:07:30 crc kubenswrapper[4902]: I1009 14:07:30.630759 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-czstb\" (UniqueName: \"kubernetes.io/projected/e4af9da4-ae7e-4c75-b670-fa5246095383-kube-api-access-czstb\") pod \"cinder-api-0\" (UID: \"e4af9da4-ae7e-4c75-b670-fa5246095383\") " pod="openstack/cinder-api-0" Oct 09 14:07:30 crc kubenswrapper[4902]: I1009 14:07:30.630804 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e4af9da4-ae7e-4c75-b670-fa5246095383-logs\") pod \"cinder-api-0\" (UID: \"e4af9da4-ae7e-4c75-b670-fa5246095383\") " pod="openstack/cinder-api-0" Oct 09 14:07:30 crc kubenswrapper[4902]: I1009 14:07:30.632636 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e4af9da4-ae7e-4c75-b670-fa5246095383-logs\") pod \"cinder-api-0\" (UID: \"e4af9da4-ae7e-4c75-b670-fa5246095383\") " pod="openstack/cinder-api-0" Oct 09 14:07:30 crc kubenswrapper[4902]: I1009 14:07:30.635116 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4af9da4-ae7e-4c75-b670-fa5246095383-config-data\") pod \"cinder-api-0\" (UID: \"e4af9da4-ae7e-4c75-b670-fa5246095383\") " pod="openstack/cinder-api-0" Oct 09 14:07:30 crc kubenswrapper[4902]: I1009 14:07:30.635202 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e4af9da4-ae7e-4c75-b670-fa5246095383-etc-machine-id\") pod \"cinder-api-0\" (UID: \"e4af9da4-ae7e-4c75-b670-fa5246095383\") " 
pod="openstack/cinder-api-0" Oct 09 14:07:30 crc kubenswrapper[4902]: I1009 14:07:30.635360 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4af9da4-ae7e-4c75-b670-fa5246095383-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"e4af9da4-ae7e-4c75-b670-fa5246095383\") " pod="openstack/cinder-api-0" Oct 09 14:07:30 crc kubenswrapper[4902]: I1009 14:07:30.646629 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e4af9da4-ae7e-4c75-b670-fa5246095383-scripts\") pod \"cinder-api-0\" (UID: \"e4af9da4-ae7e-4c75-b670-fa5246095383\") " pod="openstack/cinder-api-0" Oct 09 14:07:30 crc kubenswrapper[4902]: I1009 14:07:30.660748 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e4af9da4-ae7e-4c75-b670-fa5246095383-config-data-custom\") pod \"cinder-api-0\" (UID: \"e4af9da4-ae7e-4c75-b670-fa5246095383\") " pod="openstack/cinder-api-0" Oct 09 14:07:30 crc kubenswrapper[4902]: I1009 14:07:30.660949 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-czstb\" (UniqueName: \"kubernetes.io/projected/e4af9da4-ae7e-4c75-b670-fa5246095383-kube-api-access-czstb\") pod \"cinder-api-0\" (UID: \"e4af9da4-ae7e-4c75-b670-fa5246095383\") " pod="openstack/cinder-api-0" Oct 09 14:07:30 crc kubenswrapper[4902]: I1009 14:07:30.698866 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 09 14:07:30 crc kubenswrapper[4902]: I1009 14:07:30.744635 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5784cf869f-s2w6w" Oct 09 14:07:30 crc kubenswrapper[4902]: I1009 14:07:30.811184 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 09 14:07:31 crc kubenswrapper[4902]: I1009 14:07:31.040051 4902 generic.go:334] "Generic (PLEG): container finished" podID="13a3ac18-a408-4c88-8dfc-d04b509941d1" containerID="1cd9447a4402a758fee034a65853c44d4f9f3fb9d70e50bda51738eea15b22e1" exitCode=0 Oct 09 14:07:31 crc kubenswrapper[4902]: I1009 14:07:31.040315 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75c8ddd69c-gjtkj" event={"ID":"13a3ac18-a408-4c88-8dfc-d04b509941d1","Type":"ContainerDied","Data":"1cd9447a4402a758fee034a65853c44d4f9f3fb9d70e50bda51738eea15b22e1"} Oct 09 14:07:31 crc kubenswrapper[4902]: I1009 14:07:31.072800 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7479d98b6d-qm6k2" event={"ID":"61e61af2-6a2e-4fe8-9e42-57d6303411f8","Type":"ContainerStarted","Data":"da56b9079deb3a4c21e8b67e250c135ff222953a8fcab35235d50f2794250b34"} Oct 09 14:07:31 crc kubenswrapper[4902]: I1009 14:07:31.072860 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7479d98b6d-qm6k2" event={"ID":"61e61af2-6a2e-4fe8-9e42-57d6303411f8","Type":"ContainerStarted","Data":"d1b6552b807f74b205c16b45dd0d7be0bea91431596d9fdbc3e58f1ff1d3155e"} Oct 09 14:07:31 crc kubenswrapper[4902]: I1009 14:07:31.073946 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7479d98b6d-qm6k2" Oct 09 14:07:31 crc kubenswrapper[4902]: I1009 14:07:31.073986 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7479d98b6d-qm6k2" Oct 09 14:07:31 crc kubenswrapper[4902]: I1009 14:07:31.117427 4902 generic.go:334] "Generic (PLEG): container finished" podID="4731e204-78a9-4d3c-8763-ace5d7d97cf7" containerID="dcc8d525cfd7ea074939c35aeeffd6b00b2a53c9c63c970bd438d28a81ac2e40" exitCode=0 Oct 09 14:07:31 crc kubenswrapper[4902]: I1009 14:07:31.117816 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"4731e204-78a9-4d3c-8763-ace5d7d97cf7","Type":"ContainerDied","Data":"dcc8d525cfd7ea074939c35aeeffd6b00b2a53c9c63c970bd438d28a81ac2e40"} Oct 09 14:07:31 crc kubenswrapper[4902]: I1009 14:07:31.160289 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e5dd917a-e7aa-4092-906a-321ba2f744d8","Type":"ContainerStarted","Data":"391c2456b125ea67f4ca26def719bb38bd937cdbbdeb68f166ae1932db02db95"} Oct 09 14:07:31 crc kubenswrapper[4902]: I1009 14:07:31.191046 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-7479d98b6d-qm6k2" podStartSLOduration=3.191025573 podStartE2EDuration="3.191025573s" podCreationTimestamp="2025-10-09 14:07:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 14:07:31.122836206 +0000 UTC m=+998.320695270" watchObservedRunningTime="2025-10-09 14:07:31.191025573 +0000 UTC m=+998.388884627" Oct 09 14:07:31 crc kubenswrapper[4902]: I1009 14:07:31.269774 4902 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 09 14:07:31 crc kubenswrapper[4902]: I1009 14:07:31.315031 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-nr94d"] Oct 09 14:07:31 crc kubenswrapper[4902]: I1009 14:07:31.365906 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6c5490ed-b380-45b8-b528-e9cab5c79e62-public-tls-certs\") pod \"6c5490ed-b380-45b8-b528-e9cab5c79e62\" (UID: \"6c5490ed-b380-45b8-b528-e9cab5c79e62\") " Oct 09 14:07:31 crc kubenswrapper[4902]: I1009 14:07:31.365994 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cnhj2\" (UniqueName: \"kubernetes.io/projected/6c5490ed-b380-45b8-b528-e9cab5c79e62-kube-api-access-cnhj2\") pod \"6c5490ed-b380-45b8-b528-e9cab5c79e62\" (UID: \"6c5490ed-b380-45b8-b528-e9cab5c79e62\") " Oct 09 14:07:31 crc kubenswrapper[4902]: I1009 14:07:31.366051 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c5490ed-b380-45b8-b528-e9cab5c79e62-combined-ca-bundle\") pod \"6c5490ed-b380-45b8-b528-e9cab5c79e62\" (UID: \"6c5490ed-b380-45b8-b528-e9cab5c79e62\") " Oct 09 14:07:31 crc kubenswrapper[4902]: I1009 14:07:31.366090 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"6c5490ed-b380-45b8-b528-e9cab5c79e62\" (UID: \"6c5490ed-b380-45b8-b528-e9cab5c79e62\") " Oct 09 14:07:31 crc kubenswrapper[4902]: I1009 14:07:31.366180 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6c5490ed-b380-45b8-b528-e9cab5c79e62-httpd-run\") pod \"6c5490ed-b380-45b8-b528-e9cab5c79e62\" (UID: \"6c5490ed-b380-45b8-b528-e9cab5c79e62\") " Oct 09 14:07:31 crc kubenswrapper[4902]: I1009 14:07:31.366261 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6c5490ed-b380-45b8-b528-e9cab5c79e62-logs\") pod \"6c5490ed-b380-45b8-b528-e9cab5c79e62\" (UID: \"6c5490ed-b380-45b8-b528-e9cab5c79e62\") " Oct 09 14:07:31 crc kubenswrapper[4902]: I1009 14:07:31.366313 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6c5490ed-b380-45b8-b528-e9cab5c79e62-scripts\") pod \"6c5490ed-b380-45b8-b528-e9cab5c79e62\" (UID: \"6c5490ed-b380-45b8-b528-e9cab5c79e62\") " Oct 09 14:07:31 crc kubenswrapper[4902]: I1009 14:07:31.366367 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c5490ed-b380-45b8-b528-e9cab5c79e62-config-data\") pod \"6c5490ed-b380-45b8-b528-e9cab5c79e62\" (UID: \"6c5490ed-b380-45b8-b528-e9cab5c79e62\") " Oct 09 14:07:31 crc kubenswrapper[4902]: I1009 14:07:31.377843 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6c5490ed-b380-45b8-b528-e9cab5c79e62-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "6c5490ed-b380-45b8-b528-e9cab5c79e62" (UID: "6c5490ed-b380-45b8-b528-e9cab5c79e62"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 14:07:31 crc kubenswrapper[4902]: I1009 14:07:31.378311 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "glance") pod "6c5490ed-b380-45b8-b528-e9cab5c79e62" (UID: "6c5490ed-b380-45b8-b528-e9cab5c79e62"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 09 14:07:31 crc kubenswrapper[4902]: I1009 14:07:31.390801 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c5490ed-b380-45b8-b528-e9cab5c79e62-scripts" (OuterVolumeSpecName: "scripts") pod "6c5490ed-b380-45b8-b528-e9cab5c79e62" (UID: "6c5490ed-b380-45b8-b528-e9cab5c79e62"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:07:31 crc kubenswrapper[4902]: I1009 14:07:31.390995 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c5490ed-b380-45b8-b528-e9cab5c79e62-kube-api-access-cnhj2" (OuterVolumeSpecName: "kube-api-access-cnhj2") pod "6c5490ed-b380-45b8-b528-e9cab5c79e62" (UID: "6c5490ed-b380-45b8-b528-e9cab5c79e62"). InnerVolumeSpecName "kube-api-access-cnhj2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 14:07:31 crc kubenswrapper[4902]: I1009 14:07:31.392632 4902 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Oct 09 14:07:31 crc kubenswrapper[4902]: I1009 14:07:31.392838 4902 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6c5490ed-b380-45b8-b528-e9cab5c79e62-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 09 14:07:31 crc kubenswrapper[4902]: I1009 14:07:31.394266 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6c5490ed-b380-45b8-b528-e9cab5c79e62-logs" (OuterVolumeSpecName: "logs") pod "6c5490ed-b380-45b8-b528-e9cab5c79e62" (UID: "6c5490ed-b380-45b8-b528-e9cab5c79e62"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 14:07:31 crc kubenswrapper[4902]: I1009 14:07:31.418998 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-jn679"] Oct 09 14:07:31 crc kubenswrapper[4902]: I1009 14:07:31.463532 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c5490ed-b380-45b8-b528-e9cab5c79e62-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "6c5490ed-b380-45b8-b528-e9cab5c79e62" (UID: "6c5490ed-b380-45b8-b528-e9cab5c79e62"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:07:31 crc kubenswrapper[4902]: I1009 14:07:31.463809 4902 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Oct 09 14:07:31 crc kubenswrapper[4902]: I1009 14:07:31.474632 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c5490ed-b380-45b8-b528-e9cab5c79e62-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6c5490ed-b380-45b8-b528-e9cab5c79e62" (UID: "6c5490ed-b380-45b8-b528-e9cab5c79e62"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:07:31 crc kubenswrapper[4902]: I1009 14:07:31.480559 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c5490ed-b380-45b8-b528-e9cab5c79e62-config-data" (OuterVolumeSpecName: "config-data") pod "6c5490ed-b380-45b8-b528-e9cab5c79e62" (UID: "6c5490ed-b380-45b8-b528-e9cab5c79e62"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:07:31 crc kubenswrapper[4902]: I1009 14:07:31.496913 4902 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c5490ed-b380-45b8-b528-e9cab5c79e62-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 14:07:31 crc kubenswrapper[4902]: I1009 14:07:31.496958 4902 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6c5490ed-b380-45b8-b528-e9cab5c79e62-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 09 14:07:31 crc kubenswrapper[4902]: I1009 14:07:31.496975 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cnhj2\" (UniqueName: \"kubernetes.io/projected/6c5490ed-b380-45b8-b528-e9cab5c79e62-kube-api-access-cnhj2\") on node \"crc\" DevicePath \"\"" Oct 09 14:07:31 crc kubenswrapper[4902]: I1009 14:07:31.496987 4902 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c5490ed-b380-45b8-b528-e9cab5c79e62-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 14:07:31 crc kubenswrapper[4902]: I1009 14:07:31.496998 4902 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Oct 09 14:07:31 crc kubenswrapper[4902]: I1009 14:07:31.497009 4902 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6c5490ed-b380-45b8-b528-e9cab5c79e62-logs\") on node \"crc\" DevicePath \"\"" Oct 09 14:07:31 crc kubenswrapper[4902]: I1009 14:07:31.497020 4902 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6c5490ed-b380-45b8-b528-e9cab5c79e62-scripts\") on node \"crc\" DevicePath \"\"" Oct 09 14:07:31 crc kubenswrapper[4902]: I1009 14:07:31.529666 4902 scope.go:117] "RemoveContainer" containerID="15d729eeb5c68a661f7d7d581545552e80334c34cb99e5998b14a191654c5ad7" Oct 09 14:07:31 crc kubenswrapper[4902]: E1009 14:07:31.594067 4902 log.go:32] "CreateContainer in sandbox from runtime service failed" err=< Oct 09 14:07:31 crc kubenswrapper[4902]: rpc error: code = Unknown desc = container create failed: mount `/var/lib/kubelet/pods/13a3ac18-a408-4c88-8dfc-d04b509941d1/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Oct 09 14:07:31 crc kubenswrapper[4902]: > podSandboxID="af432a2e114c80ad87bd5d146ee2170f1addf34d7892b09269395f921e449604" Oct 09 14:07:31 crc kubenswrapper[4902]: E1009 14:07:31.594263 4902 kuberuntime_manager.go:1274] "Unhandled Error" err=< Oct 09 14:07:31 crc kubenswrapper[4902]: container &Container{Name:dnsmasq-dns,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv 
--bogus-priv --log-queries],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n566h86hd4h5f9hc8h599h5h56bh75h554h597h5f4hb7h98h58fh66ch57ch668h5bfhd8h596h68dh54h8ch674h587h5bdhb9hc4h695h5b8hccq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-swift-storage-0,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-swift-storage-0,SubPath:dns-swift-storage-0,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-nb,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/ovsdbserver-nb,SubPath:ovsdbserver-nb,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-sb,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/ovsdbserver-sb,SubPath:ovsdbserver-sb,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wc6ng,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-75c8ddd69c-gjtkj_openstack(13a3ac18-a408-4c88-8dfc-d04b509941d1): CreateContainerError: container create failed: mount `/var/lib/kubelet/pods/13a3ac18-a408-4c88-8dfc-d04b509941d1/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Oct 09 14:07:31 crc kubenswrapper[4902]: > logger="UnhandledError" Oct 09 14:07:31 crc kubenswrapper[4902]: E1009 14:07:31.595487 4902 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dnsmasq-dns\" with CreateContainerError: \"container create failed: mount `/var/lib/kubelet/pods/13a3ac18-a408-4c88-8dfc-d04b509941d1/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory\\n\"" pod="openstack/dnsmasq-dns-75c8ddd69c-gjtkj" 
podUID="13a3ac18-a408-4c88-8dfc-d04b509941d1" Oct 09 14:07:31 crc kubenswrapper[4902]: I1009 14:07:31.781372 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-zjfzf"] Oct 09 14:07:31 crc kubenswrapper[4902]: I1009 14:07:31.812872 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 09 14:07:32 crc kubenswrapper[4902]: I1009 14:07:32.000376 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-s2w6w"] Oct 09 14:07:32 crc kubenswrapper[4902]: I1009 14:07:32.014801 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 09 14:07:32 crc kubenswrapper[4902]: I1009 14:07:32.177078 4902 generic.go:334] "Generic (PLEG): container finished" podID="80011324-6959-416f-a2e0-1ea80e7b32fd" containerID="2514a005b3622e668714ceb7bbaddf3c2ce91f45b6051763bcf877e9258ca8de" exitCode=0 Oct 09 14:07:32 crc kubenswrapper[4902]: I1009 14:07:32.177147 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-jn679" event={"ID":"80011324-6959-416f-a2e0-1ea80e7b32fd","Type":"ContainerDied","Data":"2514a005b3622e668714ceb7bbaddf3c2ce91f45b6051763bcf877e9258ca8de"} Oct 09 14:07:32 crc kubenswrapper[4902]: I1009 14:07:32.177172 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-jn679" event={"ID":"80011324-6959-416f-a2e0-1ea80e7b32fd","Type":"ContainerStarted","Data":"db51ce4d7a472a263e1b7a39f01bb2f8f4be5158e6525a43c5ddbf3374a4d668"} Oct 09 14:07:32 crc kubenswrapper[4902]: I1009 14:07:32.182058 4902 generic.go:334] "Generic (PLEG): container finished" podID="ace7c5ce-f1fc-4600-9764-0e357b80f849" containerID="4e4e54afcc6992a1cdedb8011a0ce92f11d88459d6c67a20faf151cff2071790" exitCode=0 Oct 09 14:07:32 crc kubenswrapper[4902]: I1009 14:07:32.182155 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-nr94d" event={"ID":"ace7c5ce-f1fc-4600-9764-0e357b80f849","Type":"ContainerDied","Data":"4e4e54afcc6992a1cdedb8011a0ce92f11d88459d6c67a20faf151cff2071790"} Oct 09 14:07:32 crc kubenswrapper[4902]: I1009 14:07:32.182192 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-nr94d" event={"ID":"ace7c5ce-f1fc-4600-9764-0e357b80f849","Type":"ContainerStarted","Data":"83ce9642711637f71d75b22c1855ec531ee7df719fc0b5f5fbc23443a5e9b992"} Oct 09 14:07:32 crc kubenswrapper[4902]: I1009 14:07:32.185186 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6c5490ed-b380-45b8-b528-e9cab5c79e62","Type":"ContainerDied","Data":"80f885796e73ca0ce8be0415482addcd608699b6b71094b9fa453f29baecca83"} Oct 09 14:07:32 crc kubenswrapper[4902]: I1009 14:07:32.185253 4902 scope.go:117] "RemoveContainer" containerID="f7475656ccb20336e3b423389b7188ec718c5ae07b185134adb6e5e9966b5384" Oct 09 14:07:32 crc kubenswrapper[4902]: I1009 14:07:32.185446 4902 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 09 14:07:32 crc kubenswrapper[4902]: I1009 14:07:32.192766 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-5cf9b8867d-9zfjl_f5eb5ddd-7b3d-4392-9555-44eaf6e54c51/neutron-httpd/2.log" Oct 09 14:07:32 crc kubenswrapper[4902]: I1009 14:07:32.194370 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-5cf9b8867d-9zfjl_f5eb5ddd-7b3d-4392-9555-44eaf6e54c51/neutron-httpd/1.log" Oct 09 14:07:32 crc kubenswrapper[4902]: I1009 14:07:32.195203 4902 generic.go:334] "Generic (PLEG): container finished" podID="f5eb5ddd-7b3d-4392-9555-44eaf6e54c51" containerID="c11aa6b7c854c6dfeded0c99e3ae32e4ebb1287b085e2864f0a59062d0214d9e" exitCode=1 Oct 09 14:07:32 crc kubenswrapper[4902]: I1009 14:07:32.195471 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5cf9b8867d-9zfjl" event={"ID":"f5eb5ddd-7b3d-4392-9555-44eaf6e54c51","Type":"ContainerDied","Data":"c11aa6b7c854c6dfeded0c99e3ae32e4ebb1287b085e2864f0a59062d0214d9e"} Oct 09 14:07:32 crc kubenswrapper[4902]: I1009 14:07:32.196116 4902 scope.go:117] "RemoveContainer" containerID="c11aa6b7c854c6dfeded0c99e3ae32e4ebb1287b085e2864f0a59062d0214d9e" Oct 09 14:07:32 crc kubenswrapper[4902]: E1009 14:07:32.196448 4902 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"neutron-httpd\" with CrashLoopBackOff: \"back-off 20s restarting failed container=neutron-httpd pod=neutron-5cf9b8867d-9zfjl_openstack(f5eb5ddd-7b3d-4392-9555-44eaf6e54c51)\"" pod="openstack/neutron-5cf9b8867d-9zfjl" podUID="f5eb5ddd-7b3d-4392-9555-44eaf6e54c51" Oct 09 14:07:32 crc kubenswrapper[4902]: I1009 14:07:32.231114 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 09 14:07:32 crc kubenswrapper[4902]: I1009 14:07:32.239155 4902 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 09 14:07:32 crc kubenswrapper[4902]: I1009 14:07:32.266904 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 09 14:07:32 crc kubenswrapper[4902]: E1009 14:07:32.267392 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c5490ed-b380-45b8-b528-e9cab5c79e62" containerName="glance-httpd" Oct 09 14:07:32 crc kubenswrapper[4902]: I1009 14:07:32.267433 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c5490ed-b380-45b8-b528-e9cab5c79e62" containerName="glance-httpd" Oct 09 14:07:32 crc kubenswrapper[4902]: E1009 14:07:32.267458 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c5490ed-b380-45b8-b528-e9cab5c79e62" containerName="glance-log" Oct 09 14:07:32 crc kubenswrapper[4902]: I1009 14:07:32.267466 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c5490ed-b380-45b8-b528-e9cab5c79e62" containerName="glance-log" Oct 09 14:07:32 crc kubenswrapper[4902]: I1009 14:07:32.267722 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c5490ed-b380-45b8-b528-e9cab5c79e62" containerName="glance-log" Oct 09 14:07:32 crc kubenswrapper[4902]: I1009 14:07:32.267753 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c5490ed-b380-45b8-b528-e9cab5c79e62" containerName="glance-httpd" Oct 09 14:07:32 crc kubenswrapper[4902]: I1009 14:07:32.269072 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 09 14:07:32 crc kubenswrapper[4902]: I1009 14:07:32.273730 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 09 14:07:32 crc kubenswrapper[4902]: I1009 14:07:32.274077 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Oct 09 14:07:32 crc kubenswrapper[4902]: I1009 14:07:32.310170 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 09 14:07:32 crc kubenswrapper[4902]: I1009 14:07:32.320968 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8s8w2\" (UniqueName: \"kubernetes.io/projected/9c0515fd-b685-4dab-909a-3f4147e19a59-kube-api-access-8s8w2\") pod \"glance-default-external-api-0\" (UID: \"9c0515fd-b685-4dab-909a-3f4147e19a59\") " pod="openstack/glance-default-external-api-0" Oct 09 14:07:32 crc kubenswrapper[4902]: I1009 14:07:32.321251 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"9c0515fd-b685-4dab-909a-3f4147e19a59\") " pod="openstack/glance-default-external-api-0" Oct 09 14:07:32 crc kubenswrapper[4902]: I1009 14:07:32.321305 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c0515fd-b685-4dab-909a-3f4147e19a59-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"9c0515fd-b685-4dab-909a-3f4147e19a59\") " pod="openstack/glance-default-external-api-0" Oct 09 14:07:32 crc kubenswrapper[4902]: I1009 14:07:32.321368 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9c0515fd-b685-4dab-909a-3f4147e19a59-logs\") pod \"glance-default-external-api-0\" (UID: \"9c0515fd-b685-4dab-909a-3f4147e19a59\") " pod="openstack/glance-default-external-api-0" Oct 09 14:07:32 crc kubenswrapper[4902]: I1009 14:07:32.321395 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9c0515fd-b685-4dab-909a-3f4147e19a59-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"9c0515fd-b685-4dab-909a-3f4147e19a59\") " pod="openstack/glance-default-external-api-0" Oct 09 14:07:32 crc kubenswrapper[4902]: I1009 14:07:32.321435 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9c0515fd-b685-4dab-909a-3f4147e19a59-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"9c0515fd-b685-4dab-909a-3f4147e19a59\") " pod="openstack/glance-default-external-api-0" Oct 09 14:07:32 crc kubenswrapper[4902]: I1009 14:07:32.321474 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c0515fd-b685-4dab-909a-3f4147e19a59-config-data\") pod \"glance-default-external-api-0\" (UID: \"9c0515fd-b685-4dab-909a-3f4147e19a59\") " pod="openstack/glance-default-external-api-0" Oct 09 14:07:32 crc kubenswrapper[4902]: I1009 14:07:32.321506 4902 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9c0515fd-b685-4dab-909a-3f4147e19a59-scripts\") pod \"glance-default-external-api-0\" (UID: \"9c0515fd-b685-4dab-909a-3f4147e19a59\") " pod="openstack/glance-default-external-api-0" Oct 09 14:07:32 crc kubenswrapper[4902]: I1009 14:07:32.422931 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"9c0515fd-b685-4dab-909a-3f4147e19a59\") " pod="openstack/glance-default-external-api-0" Oct 09 14:07:32 crc kubenswrapper[4902]: I1009 14:07:32.423015 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c0515fd-b685-4dab-909a-3f4147e19a59-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"9c0515fd-b685-4dab-909a-3f4147e19a59\") " pod="openstack/glance-default-external-api-0" Oct 09 14:07:32 crc kubenswrapper[4902]: I1009 14:07:32.423076 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9c0515fd-b685-4dab-909a-3f4147e19a59-logs\") pod \"glance-default-external-api-0\" (UID: \"9c0515fd-b685-4dab-909a-3f4147e19a59\") " pod="openstack/glance-default-external-api-0" Oct 09 14:07:32 crc kubenswrapper[4902]: I1009 14:07:32.423105 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9c0515fd-b685-4dab-909a-3f4147e19a59-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"9c0515fd-b685-4dab-909a-3f4147e19a59\") " pod="openstack/glance-default-external-api-0" Oct 09 14:07:32 crc kubenswrapper[4902]: I1009 14:07:32.423146 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9c0515fd-b685-4dab-909a-3f4147e19a59-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"9c0515fd-b685-4dab-909a-3f4147e19a59\") " pod="openstack/glance-default-external-api-0" Oct 09 14:07:32 crc kubenswrapper[4902]: I1009 14:07:32.423195 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c0515fd-b685-4dab-909a-3f4147e19a59-config-data\") pod \"glance-default-external-api-0\" (UID: \"9c0515fd-b685-4dab-909a-3f4147e19a59\") " pod="openstack/glance-default-external-api-0" Oct 09 14:07:32 crc kubenswrapper[4902]: I1009 14:07:32.423230 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9c0515fd-b685-4dab-909a-3f4147e19a59-scripts\") pod \"glance-default-external-api-0\" (UID: \"9c0515fd-b685-4dab-909a-3f4147e19a59\") " pod="openstack/glance-default-external-api-0" Oct 09 14:07:32 crc kubenswrapper[4902]: I1009 14:07:32.423302 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8s8w2\" (UniqueName: \"kubernetes.io/projected/9c0515fd-b685-4dab-909a-3f4147e19a59-kube-api-access-8s8w2\") pod \"glance-default-external-api-0\" (UID: \"9c0515fd-b685-4dab-909a-3f4147e19a59\") " pod="openstack/glance-default-external-api-0" Oct 09 14:07:32 crc kubenswrapper[4902]: I1009 14:07:32.424308 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/9c0515fd-b685-4dab-909a-3f4147e19a59-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"9c0515fd-b685-4dab-909a-3f4147e19a59\") " pod="openstack/glance-default-external-api-0" Oct 09 14:07:32 crc kubenswrapper[4902]: I1009 14:07:32.424819 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9c0515fd-b685-4dab-909a-3f4147e19a59-logs\") pod \"glance-default-external-api-0\" (UID: \"9c0515fd-b685-4dab-909a-3f4147e19a59\") " pod="openstack/glance-default-external-api-0" Oct 09 14:07:32 crc kubenswrapper[4902]: I1009 14:07:32.425045 4902 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"9c0515fd-b685-4dab-909a-3f4147e19a59\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/glance-default-external-api-0" Oct 09 14:07:32 crc kubenswrapper[4902]: I1009 14:07:32.431037 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9c0515fd-b685-4dab-909a-3f4147e19a59-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"9c0515fd-b685-4dab-909a-3f4147e19a59\") " pod="openstack/glance-default-external-api-0" Oct 09 14:07:32 crc kubenswrapper[4902]: I1009 14:07:32.431722 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c0515fd-b685-4dab-909a-3f4147e19a59-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"9c0515fd-b685-4dab-909a-3f4147e19a59\") " pod="openstack/glance-default-external-api-0" Oct 09 14:07:32 crc kubenswrapper[4902]: I1009 14:07:32.433797 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9c0515fd-b685-4dab-909a-3f4147e19a59-scripts\") pod \"glance-default-external-api-0\" (UID: \"9c0515fd-b685-4dab-909a-3f4147e19a59\") " pod="openstack/glance-default-external-api-0" Oct 09 14:07:32 crc kubenswrapper[4902]: I1009 14:07:32.438886 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c0515fd-b685-4dab-909a-3f4147e19a59-config-data\") pod \"glance-default-external-api-0\" (UID: \"9c0515fd-b685-4dab-909a-3f4147e19a59\") " pod="openstack/glance-default-external-api-0" Oct 09 14:07:32 crc kubenswrapper[4902]: I1009 14:07:32.449472 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8s8w2\" (UniqueName: \"kubernetes.io/projected/9c0515fd-b685-4dab-909a-3f4147e19a59-kube-api-access-8s8w2\") pod \"glance-default-external-api-0\" (UID: \"9c0515fd-b685-4dab-909a-3f4147e19a59\") " pod="openstack/glance-default-external-api-0" Oct 09 14:07:32 crc kubenswrapper[4902]: I1009 14:07:32.457745 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"9c0515fd-b685-4dab-909a-3f4147e19a59\") " pod="openstack/glance-default-external-api-0" Oct 09 14:07:32 crc kubenswrapper[4902]: I1009 14:07:32.610013 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 09 14:07:32 crc kubenswrapper[4902]: W1009 14:07:32.713432 4902 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podabbbf9c3_be8f_4199_9f68_626d18441730.slice/crio-5cae291e575774553e628f5ac8f9b9cd8b9664b8661e54dbf41487e0f878874a WatchSource:0}: Error finding container 5cae291e575774553e628f5ac8f9b9cd8b9664b8661e54dbf41487e0f878874a: Status 404 returned error can't find the container with id 5cae291e575774553e628f5ac8f9b9cd8b9664b8661e54dbf41487e0f878874a Oct 09 14:07:32 crc kubenswrapper[4902]: W1009 14:07:32.732001 4902 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9bdf36f9_3d35_4ccd_9d32_86c5ab08b65a.slice/crio-053aa2ad9bd107d223bda030a0b78e1fc5120853f3111451522baab068663073 WatchSource:0}: Error finding container 053aa2ad9bd107d223bda030a0b78e1fc5120853f3111451522baab068663073: Status 404 returned error can't find the container with id 053aa2ad9bd107d223bda030a0b78e1fc5120853f3111451522baab068663073 Oct 09 14:07:32 crc kubenswrapper[4902]: I1009 14:07:32.738815 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-6568f7cff-cv7qx" Oct 09 14:07:32 crc kubenswrapper[4902]: I1009 14:07:32.761124 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-6568f7cff-cv7qx" Oct 09 14:07:33 crc kubenswrapper[4902]: I1009 14:07:33.210704 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5784cf869f-s2w6w" event={"ID":"abbbf9c3-be8f-4199-9f68-626d18441730","Type":"ContainerStarted","Data":"5cae291e575774553e628f5ac8f9b9cd8b9664b8661e54dbf41487e0f878874a"} Oct 09 14:07:33 crc kubenswrapper[4902]: I1009 14:07:33.212625 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"e4af9da4-ae7e-4c75-b670-fa5246095383","Type":"ContainerStarted","Data":"81bafe6d7767acd9273cc7a00737bd0d96e423efbceb24222339d84f7d9724d2"} Oct 09 14:07:33 crc kubenswrapper[4902]: I1009 14:07:33.213888 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-zjfzf" event={"ID":"9bdf36f9-3d35-4ccd-9d32-86c5ab08b65a","Type":"ContainerStarted","Data":"053aa2ad9bd107d223bda030a0b78e1fc5120853f3111451522baab068663073"} Oct 09 14:07:33 crc kubenswrapper[4902]: I1009 14:07:33.216379 4902 generic.go:334] "Generic (PLEG): container finished" podID="ba9ae197-8325-4a07-a174-31f7f2e29978" containerID="c56bd3765dfec2fab257f46ddb8a13f723af38fda399398302589b4e0d3a8b34" exitCode=137 Oct 09 14:07:33 crc kubenswrapper[4902]: I1009 14:07:33.216574 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5b9b5fc8fb-nlcpz" event={"ID":"ba9ae197-8325-4a07-a174-31f7f2e29978","Type":"ContainerDied","Data":"c56bd3765dfec2fab257f46ddb8a13f723af38fda399398302589b4e0d3a8b34"} Oct 09 14:07:33 crc kubenswrapper[4902]: W1009 14:07:33.255631 4902 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf5566f62_4d11_4503_ab45_7f2d727bc397.slice/crio-9be983c4db76de927aa30a2f56f9672d6fee758cddafdfe7201d22193ac2c90a WatchSource:0}: Error finding container 9be983c4db76de927aa30a2f56f9672d6fee758cddafdfe7201d22193ac2c90a: Status 404 returned error can't find the container with id 9be983c4db76de927aa30a2f56f9672d6fee758cddafdfe7201d22193ac2c90a Oct 
09 14:07:33 crc kubenswrapper[4902]: I1009 14:07:33.461400 4902 scope.go:117] "RemoveContainer" containerID="171b1007fe2c2f721edd193b56bde992a201c79388afa8b60146353e7c33b6c5" Oct 09 14:07:33 crc kubenswrapper[4902]: I1009 14:07:33.583638 4902 scope.go:117] "RemoveContainer" containerID="15d729eeb5c68a661f7d7d581545552e80334c34cb99e5998b14a191654c5ad7" Oct 09 14:07:33 crc kubenswrapper[4902]: I1009 14:07:33.602051 4902 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 09 14:07:33 crc kubenswrapper[4902]: I1009 14:07:33.605976 4902 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75c8ddd69c-gjtkj" Oct 09 14:07:33 crc kubenswrapper[4902]: I1009 14:07:33.725141 4902 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6c5490ed-b380-45b8-b528-e9cab5c79e62" path="/var/lib/kubelet/pods/6c5490ed-b380-45b8-b528-e9cab5c79e62/volumes" Oct 09 14:07:33 crc kubenswrapper[4902]: I1009 14:07:33.733368 4902 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-jn679" Oct 09 14:07:33 crc kubenswrapper[4902]: I1009 14:07:33.733804 4902 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5b9b5fc8fb-nlcpz" Oct 09 14:07:33 crc kubenswrapper[4902]: I1009 14:07:33.767045 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/13a3ac18-a408-4c88-8dfc-d04b509941d1-config\") pod \"13a3ac18-a408-4c88-8dfc-d04b509941d1\" (UID: \"13a3ac18-a408-4c88-8dfc-d04b509941d1\") " Oct 09 14:07:33 crc kubenswrapper[4902]: I1009 14:07:33.767111 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/13a3ac18-a408-4c88-8dfc-d04b509941d1-dns-swift-storage-0\") pod \"13a3ac18-a408-4c88-8dfc-d04b509941d1\" (UID: \"13a3ac18-a408-4c88-8dfc-d04b509941d1\") " Oct 09 14:07:33 crc kubenswrapper[4902]: I1009 14:07:33.767179 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wc6ng\" (UniqueName: \"kubernetes.io/projected/13a3ac18-a408-4c88-8dfc-d04b509941d1-kube-api-access-wc6ng\") pod \"13a3ac18-a408-4c88-8dfc-d04b509941d1\" (UID: \"13a3ac18-a408-4c88-8dfc-d04b509941d1\") " Oct 09 14:07:33 crc kubenswrapper[4902]: I1009 14:07:33.767207 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/13a3ac18-a408-4c88-8dfc-d04b509941d1-ovsdbserver-nb\") pod \"13a3ac18-a408-4c88-8dfc-d04b509941d1\" (UID: \"13a3ac18-a408-4c88-8dfc-d04b509941d1\") " Oct 09 14:07:33 crc kubenswrapper[4902]: I1009 14:07:33.767236 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4731e204-78a9-4d3c-8763-ace5d7d97cf7-internal-tls-certs\") pod \"4731e204-78a9-4d3c-8763-ace5d7d97cf7\" (UID: \"4731e204-78a9-4d3c-8763-ace5d7d97cf7\") " Oct 09 14:07:33 crc kubenswrapper[4902]: I1009 14:07:33.767273 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4731e204-78a9-4d3c-8763-ace5d7d97cf7-combined-ca-bundle\") pod \"4731e204-78a9-4d3c-8763-ace5d7d97cf7\" (UID: \"4731e204-78a9-4d3c-8763-ace5d7d97cf7\") " Oct 09 14:07:33 crc kubenswrapper[4902]: I1009 
14:07:33.767308 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4731e204-78a9-4d3c-8763-ace5d7d97cf7-logs\") pod \"4731e204-78a9-4d3c-8763-ace5d7d97cf7\" (UID: \"4731e204-78a9-4d3c-8763-ace5d7d97cf7\") " Oct 09 14:07:33 crc kubenswrapper[4902]: I1009 14:07:33.767361 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4731e204-78a9-4d3c-8763-ace5d7d97cf7-config-data\") pod \"4731e204-78a9-4d3c-8763-ace5d7d97cf7\" (UID: \"4731e204-78a9-4d3c-8763-ace5d7d97cf7\") " Oct 09 14:07:33 crc kubenswrapper[4902]: I1009 14:07:33.767436 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4wqt7\" (UniqueName: \"kubernetes.io/projected/4731e204-78a9-4d3c-8763-ace5d7d97cf7-kube-api-access-4wqt7\") pod \"4731e204-78a9-4d3c-8763-ace5d7d97cf7\" (UID: \"4731e204-78a9-4d3c-8763-ace5d7d97cf7\") " Oct 09 14:07:33 crc kubenswrapper[4902]: I1009 14:07:33.767486 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4731e204-78a9-4d3c-8763-ace5d7d97cf7-scripts\") pod \"4731e204-78a9-4d3c-8763-ace5d7d97cf7\" (UID: \"4731e204-78a9-4d3c-8763-ace5d7d97cf7\") " Oct 09 14:07:33 crc kubenswrapper[4902]: I1009 14:07:33.767526 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/13a3ac18-a408-4c88-8dfc-d04b509941d1-ovsdbserver-sb\") pod \"13a3ac18-a408-4c88-8dfc-d04b509941d1\" (UID: \"13a3ac18-a408-4c88-8dfc-d04b509941d1\") " Oct 09 14:07:33 crc kubenswrapper[4902]: I1009 14:07:33.767572 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/13a3ac18-a408-4c88-8dfc-d04b509941d1-dns-svc\") pod \"13a3ac18-a408-4c88-8dfc-d04b509941d1\" (UID: \"13a3ac18-a408-4c88-8dfc-d04b509941d1\") " Oct 09 14:07:33 crc kubenswrapper[4902]: I1009 14:07:33.767609 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"4731e204-78a9-4d3c-8763-ace5d7d97cf7\" (UID: \"4731e204-78a9-4d3c-8763-ace5d7d97cf7\") " Oct 09 14:07:33 crc kubenswrapper[4902]: I1009 14:07:33.767670 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4731e204-78a9-4d3c-8763-ace5d7d97cf7-httpd-run\") pod \"4731e204-78a9-4d3c-8763-ace5d7d97cf7\" (UID: \"4731e204-78a9-4d3c-8763-ace5d7d97cf7\") " Oct 09 14:07:33 crc kubenswrapper[4902]: I1009 14:07:33.785867 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4731e204-78a9-4d3c-8763-ace5d7d97cf7-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "4731e204-78a9-4d3c-8763-ace5d7d97cf7" (UID: "4731e204-78a9-4d3c-8763-ace5d7d97cf7"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 14:07:33 crc kubenswrapper[4902]: I1009 14:07:33.814589 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4731e204-78a9-4d3c-8763-ace5d7d97cf7-logs" (OuterVolumeSpecName: "logs") pod "4731e204-78a9-4d3c-8763-ace5d7d97cf7" (UID: "4731e204-78a9-4d3c-8763-ace5d7d97cf7"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 14:07:33 crc kubenswrapper[4902]: I1009 14:07:33.870176 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ba9ae197-8325-4a07-a174-31f7f2e29978-config-data\") pod \"ba9ae197-8325-4a07-a174-31f7f2e29978\" (UID: \"ba9ae197-8325-4a07-a174-31f7f2e29978\") " Oct 09 14:07:33 crc kubenswrapper[4902]: I1009 14:07:33.870216 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ba9ae197-8325-4a07-a174-31f7f2e29978-scripts\") pod \"ba9ae197-8325-4a07-a174-31f7f2e29978\" (UID: \"ba9ae197-8325-4a07-a174-31f7f2e29978\") " Oct 09 14:07:33 crc kubenswrapper[4902]: I1009 14:07:33.870248 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba9ae197-8325-4a07-a174-31f7f2e29978-combined-ca-bundle\") pod \"ba9ae197-8325-4a07-a174-31f7f2e29978\" (UID: \"ba9ae197-8325-4a07-a174-31f7f2e29978\") " Oct 09 14:07:33 crc kubenswrapper[4902]: I1009 14:07:33.870298 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/ba9ae197-8325-4a07-a174-31f7f2e29978-horizon-tls-certs\") pod \"ba9ae197-8325-4a07-a174-31f7f2e29978\" (UID: \"ba9ae197-8325-4a07-a174-31f7f2e29978\") " Oct 09 14:07:33 crc kubenswrapper[4902]: I1009 14:07:33.870382 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j9b9t\" (UniqueName: \"kubernetes.io/projected/ba9ae197-8325-4a07-a174-31f7f2e29978-kube-api-access-j9b9t\") pod \"ba9ae197-8325-4a07-a174-31f7f2e29978\" (UID: \"ba9ae197-8325-4a07-a174-31f7f2e29978\") " Oct 09 14:07:33 crc kubenswrapper[4902]: I1009 14:07:33.870567 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ba9ae197-8325-4a07-a174-31f7f2e29978-horizon-secret-key\") pod \"ba9ae197-8325-4a07-a174-31f7f2e29978\" (UID: \"ba9ae197-8325-4a07-a174-31f7f2e29978\") " Oct 09 14:07:33 crc kubenswrapper[4902]: I1009 14:07:33.870615 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rwf2t\" (UniqueName: \"kubernetes.io/projected/80011324-6959-416f-a2e0-1ea80e7b32fd-kube-api-access-rwf2t\") pod \"80011324-6959-416f-a2e0-1ea80e7b32fd\" (UID: \"80011324-6959-416f-a2e0-1ea80e7b32fd\") " Oct 09 14:07:33 crc kubenswrapper[4902]: I1009 14:07:33.870706 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ba9ae197-8325-4a07-a174-31f7f2e29978-logs\") pod \"ba9ae197-8325-4a07-a174-31f7f2e29978\" (UID: \"ba9ae197-8325-4a07-a174-31f7f2e29978\") " Oct 09 14:07:33 crc kubenswrapper[4902]: I1009 14:07:33.871071 4902 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4731e204-78a9-4d3c-8763-ace5d7d97cf7-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 09 14:07:33 crc kubenswrapper[4902]: I1009 14:07:33.871082 4902 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4731e204-78a9-4d3c-8763-ace5d7d97cf7-logs\") on node \"crc\" DevicePath \"\"" Oct 09 14:07:33 crc kubenswrapper[4902]: I1009 14:07:33.922096 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/ba9ae197-8325-4a07-a174-31f7f2e29978-logs" (OuterVolumeSpecName: "logs") pod "ba9ae197-8325-4a07-a174-31f7f2e29978" (UID: "ba9ae197-8325-4a07-a174-31f7f2e29978"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 14:07:33 crc kubenswrapper[4902]: I1009 14:07:33.923260 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4731e204-78a9-4d3c-8763-ace5d7d97cf7-scripts" (OuterVolumeSpecName: "scripts") pod "4731e204-78a9-4d3c-8763-ace5d7d97cf7" (UID: "4731e204-78a9-4d3c-8763-ace5d7d97cf7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:07:33 crc kubenswrapper[4902]: I1009 14:07:33.923529 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13a3ac18-a408-4c88-8dfc-d04b509941d1-kube-api-access-wc6ng" (OuterVolumeSpecName: "kube-api-access-wc6ng") pod "13a3ac18-a408-4c88-8dfc-d04b509941d1" (UID: "13a3ac18-a408-4c88-8dfc-d04b509941d1"). InnerVolumeSpecName "kube-api-access-wc6ng". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 14:07:33 crc kubenswrapper[4902]: I1009 14:07:33.923690 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba9ae197-8325-4a07-a174-31f7f2e29978-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "ba9ae197-8325-4a07-a174-31f7f2e29978" (UID: "ba9ae197-8325-4a07-a174-31f7f2e29978"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:07:33 crc kubenswrapper[4902]: I1009 14:07:33.923733 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4731e204-78a9-4d3c-8763-ace5d7d97cf7-kube-api-access-4wqt7" (OuterVolumeSpecName: "kube-api-access-4wqt7") pod "4731e204-78a9-4d3c-8763-ace5d7d97cf7" (UID: "4731e204-78a9-4d3c-8763-ace5d7d97cf7"). InnerVolumeSpecName "kube-api-access-4wqt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 14:07:33 crc kubenswrapper[4902]: I1009 14:07:33.927606 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "glance") pod "4731e204-78a9-4d3c-8763-ace5d7d97cf7" (UID: "4731e204-78a9-4d3c-8763-ace5d7d97cf7"). InnerVolumeSpecName "local-storage06-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 09 14:07:33 crc kubenswrapper[4902]: I1009 14:07:33.973647 4902 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4731e204-78a9-4d3c-8763-ace5d7d97cf7-scripts\") on node \"crc\" DevicePath \"\"" Oct 09 14:07:33 crc kubenswrapper[4902]: I1009 14:07:33.974182 4902 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ba9ae197-8325-4a07-a174-31f7f2e29978-logs\") on node \"crc\" DevicePath \"\"" Oct 09 14:07:33 crc kubenswrapper[4902]: I1009 14:07:33.974280 4902 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Oct 09 14:07:33 crc kubenswrapper[4902]: I1009 14:07:33.974339 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wc6ng\" (UniqueName: \"kubernetes.io/projected/13a3ac18-a408-4c88-8dfc-d04b509941d1-kube-api-access-wc6ng\") on node \"crc\" DevicePath \"\"" Oct 09 14:07:33 crc kubenswrapper[4902]: I1009 14:07:33.974403 4902 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ba9ae197-8325-4a07-a174-31f7f2e29978-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Oct 09 14:07:33 crc kubenswrapper[4902]: I1009 14:07:33.974527 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4wqt7\" (UniqueName: \"kubernetes.io/projected/4731e204-78a9-4d3c-8763-ace5d7d97cf7-kube-api-access-4wqt7\") on node \"crc\" DevicePath \"\"" Oct 09 14:07:33 crc kubenswrapper[4902]: I1009 14:07:33.982577 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/80011324-6959-416f-a2e0-1ea80e7b32fd-kube-api-access-rwf2t" (OuterVolumeSpecName: "kube-api-access-rwf2t") pod "80011324-6959-416f-a2e0-1ea80e7b32fd" (UID: "80011324-6959-416f-a2e0-1ea80e7b32fd"). InnerVolumeSpecName "kube-api-access-rwf2t". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 14:07:33 crc kubenswrapper[4902]: I1009 14:07:33.994759 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba9ae197-8325-4a07-a174-31f7f2e29978-kube-api-access-j9b9t" (OuterVolumeSpecName: "kube-api-access-j9b9t") pod "ba9ae197-8325-4a07-a174-31f7f2e29978" (UID: "ba9ae197-8325-4a07-a174-31f7f2e29978"). InnerVolumeSpecName "kube-api-access-j9b9t". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 14:07:34 crc kubenswrapper[4902]: I1009 14:07:34.028773 4902 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-nr94d" Oct 09 14:07:34 crc kubenswrapper[4902]: I1009 14:07:34.048314 4902 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Oct 09 14:07:34 crc kubenswrapper[4902]: I1009 14:07:34.067971 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba9ae197-8325-4a07-a174-31f7f2e29978-config-data" (OuterVolumeSpecName: "config-data") pod "ba9ae197-8325-4a07-a174-31f7f2e29978" (UID: "ba9ae197-8325-4a07-a174-31f7f2e29978"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 14:07:34 crc kubenswrapper[4902]: I1009 14:07:34.079700 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j9b9t\" (UniqueName: \"kubernetes.io/projected/ba9ae197-8325-4a07-a174-31f7f2e29978-kube-api-access-j9b9t\") on node \"crc\" DevicePath \"\"" Oct 09 14:07:34 crc kubenswrapper[4902]: I1009 14:07:34.079734 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rwf2t\" (UniqueName: \"kubernetes.io/projected/80011324-6959-416f-a2e0-1ea80e7b32fd-kube-api-access-rwf2t\") on node \"crc\" DevicePath \"\"" Oct 09 14:07:34 crc kubenswrapper[4902]: I1009 14:07:34.079743 4902 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Oct 09 14:07:34 crc kubenswrapper[4902]: I1009 14:07:34.079752 4902 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ba9ae197-8325-4a07-a174-31f7f2e29978-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 14:07:34 crc kubenswrapper[4902]: I1009 14:07:34.120070 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba9ae197-8325-4a07-a174-31f7f2e29978-scripts" (OuterVolumeSpecName: "scripts") pod "ba9ae197-8325-4a07-a174-31f7f2e29978" (UID: "ba9ae197-8325-4a07-a174-31f7f2e29978"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 14:07:34 crc kubenswrapper[4902]: I1009 14:07:34.132874 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4731e204-78a9-4d3c-8763-ace5d7d97cf7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4731e204-78a9-4d3c-8763-ace5d7d97cf7" (UID: "4731e204-78a9-4d3c-8763-ace5d7d97cf7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:07:34 crc kubenswrapper[4902]: I1009 14:07:34.180541 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mvvmm\" (UniqueName: \"kubernetes.io/projected/ace7c5ce-f1fc-4600-9764-0e357b80f849-kube-api-access-mvvmm\") pod \"ace7c5ce-f1fc-4600-9764-0e357b80f849\" (UID: \"ace7c5ce-f1fc-4600-9764-0e357b80f849\") " Oct 09 14:07:34 crc kubenswrapper[4902]: I1009 14:07:34.181332 4902 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ba9ae197-8325-4a07-a174-31f7f2e29978-scripts\") on node \"crc\" DevicePath \"\"" Oct 09 14:07:34 crc kubenswrapper[4902]: I1009 14:07:34.181361 4902 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4731e204-78a9-4d3c-8763-ace5d7d97cf7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 14:07:34 crc kubenswrapper[4902]: I1009 14:07:34.191195 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ace7c5ce-f1fc-4600-9764-0e357b80f849-kube-api-access-mvvmm" (OuterVolumeSpecName: "kube-api-access-mvvmm") pod "ace7c5ce-f1fc-4600-9764-0e357b80f849" (UID: "ace7c5ce-f1fc-4600-9764-0e357b80f849"). InnerVolumeSpecName "kube-api-access-mvvmm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 14:07:34 crc kubenswrapper[4902]: I1009 14:07:34.208056 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/13a3ac18-a408-4c88-8dfc-d04b509941d1-config" (OuterVolumeSpecName: "config") pod "13a3ac18-a408-4c88-8dfc-d04b509941d1" (UID: "13a3ac18-a408-4c88-8dfc-d04b509941d1"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 14:07:34 crc kubenswrapper[4902]: I1009 14:07:34.225350 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 09 14:07:34 crc kubenswrapper[4902]: I1009 14:07:34.231105 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/13a3ac18-a408-4c88-8dfc-d04b509941d1-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "13a3ac18-a408-4c88-8dfc-d04b509941d1" (UID: "13a3ac18-a408-4c88-8dfc-d04b509941d1"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 14:07:34 crc kubenswrapper[4902]: I1009 14:07:34.232150 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/13a3ac18-a408-4c88-8dfc-d04b509941d1-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "13a3ac18-a408-4c88-8dfc-d04b509941d1" (UID: "13a3ac18-a408-4c88-8dfc-d04b509941d1"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 14:07:34 crc kubenswrapper[4902]: I1009 14:07:34.249729 4902 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75c8ddd69c-gjtkj" Oct 09 14:07:34 crc kubenswrapper[4902]: I1009 14:07:34.250210 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75c8ddd69c-gjtkj" event={"ID":"13a3ac18-a408-4c88-8dfc-d04b509941d1","Type":"ContainerDied","Data":"af432a2e114c80ad87bd5d146ee2170f1addf34d7892b09269395f921e449604"} Oct 09 14:07:34 crc kubenswrapper[4902]: I1009 14:07:34.250256 4902 scope.go:117] "RemoveContainer" containerID="1cd9447a4402a758fee034a65853c44d4f9f3fb9d70e50bda51738eea15b22e1" Oct 09 14:07:34 crc kubenswrapper[4902]: I1009 14:07:34.269320 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-jn679" event={"ID":"80011324-6959-416f-a2e0-1ea80e7b32fd","Type":"ContainerDied","Data":"db51ce4d7a472a263e1b7a39f01bb2f8f4be5158e6525a43c5ddbf3374a4d668"} Oct 09 14:07:34 crc kubenswrapper[4902]: I1009 14:07:34.269739 4902 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="db51ce4d7a472a263e1b7a39f01bb2f8f4be5158e6525a43c5ddbf3374a4d668" Oct 09 14:07:34 crc kubenswrapper[4902]: I1009 14:07:34.269485 4902 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-jn679" Oct 09 14:07:34 crc kubenswrapper[4902]: I1009 14:07:34.271073 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4731e204-78a9-4d3c-8763-ace5d7d97cf7-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "4731e204-78a9-4d3c-8763-ace5d7d97cf7" (UID: "4731e204-78a9-4d3c-8763-ace5d7d97cf7"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:07:34 crc kubenswrapper[4902]: I1009 14:07:34.282823 4902 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/13a3ac18-a408-4c88-8dfc-d04b509941d1-config\") on node \"crc\" DevicePath \"\"" Oct 09 14:07:34 crc kubenswrapper[4902]: I1009 14:07:34.282847 4902 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/13a3ac18-a408-4c88-8dfc-d04b509941d1-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 09 14:07:34 crc kubenswrapper[4902]: I1009 14:07:34.282856 4902 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4731e204-78a9-4d3c-8763-ace5d7d97cf7-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 09 14:07:34 crc kubenswrapper[4902]: I1009 14:07:34.282865 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mvvmm\" (UniqueName: \"kubernetes.io/projected/ace7c5ce-f1fc-4600-9764-0e357b80f849-kube-api-access-mvvmm\") on node \"crc\" DevicePath \"\"" Oct 09 14:07:34 crc kubenswrapper[4902]: I1009 14:07:34.282873 4902 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/13a3ac18-a408-4c88-8dfc-d04b509941d1-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 09 14:07:34 crc kubenswrapper[4902]: I1009 14:07:34.289515 4902 generic.go:334] "Generic (PLEG): container finished" podID="abbbf9c3-be8f-4199-9f68-626d18441730" containerID="68aa74132c2dde6f0a9ce6a6a8259747f1959feff16647c4238cd18b4a621c2c" exitCode=0 Oct 09 14:07:34 crc kubenswrapper[4902]: I1009 14:07:34.289849 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5784cf869f-s2w6w" event={"ID":"abbbf9c3-be8f-4199-9f68-626d18441730","Type":"ContainerDied","Data":"68aa74132c2dde6f0a9ce6a6a8259747f1959feff16647c4238cd18b4a621c2c"} Oct 09 14:07:34 crc kubenswrapper[4902]: W1009 14:07:34.297940 4902 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9c0515fd_b685_4dab_909a_3f4147e19a59.slice/crio-6d9fbbeb32adb5c0d9085824c040bbd87fea2cc6bf76a1e7d360b0170dcf283c WatchSource:0}: Error finding container 6d9fbbeb32adb5c0d9085824c040bbd87fea2cc6bf76a1e7d360b0170dcf283c: Status 404 returned error can't find the container with id 6d9fbbeb32adb5c0d9085824c040bbd87fea2cc6bf76a1e7d360b0170dcf283c Oct 09 14:07:34 crc kubenswrapper[4902]: I1009 14:07:34.298791 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/13a3ac18-a408-4c88-8dfc-d04b509941d1-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "13a3ac18-a408-4c88-8dfc-d04b509941d1" (UID: "13a3ac18-a408-4c88-8dfc-d04b509941d1"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 14:07:34 crc kubenswrapper[4902]: I1009 14:07:34.298880 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/13a3ac18-a408-4c88-8dfc-d04b509941d1-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "13a3ac18-a408-4c88-8dfc-d04b509941d1" (UID: "13a3ac18-a408-4c88-8dfc-d04b509941d1"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 14:07:34 crc kubenswrapper[4902]: I1009 14:07:34.300047 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba9ae197-8325-4a07-a174-31f7f2e29978-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ba9ae197-8325-4a07-a174-31f7f2e29978" (UID: "ba9ae197-8325-4a07-a174-31f7f2e29978"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:07:34 crc kubenswrapper[4902]: I1009 14:07:34.305666 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-nr94d" event={"ID":"ace7c5ce-f1fc-4600-9764-0e357b80f849","Type":"ContainerDied","Data":"83ce9642711637f71d75b22c1855ec531ee7df719fc0b5f5fbc23443a5e9b992"} Oct 09 14:07:34 crc kubenswrapper[4902]: I1009 14:07:34.305702 4902 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="83ce9642711637f71d75b22c1855ec531ee7df719fc0b5f5fbc23443a5e9b992" Oct 09 14:07:34 crc kubenswrapper[4902]: I1009 14:07:34.305753 4902 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-nr94d" Oct 09 14:07:34 crc kubenswrapper[4902]: I1009 14:07:34.349194 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"f5566f62-4d11-4503-ab45-7f2d727bc397","Type":"ContainerStarted","Data":"9be983c4db76de927aa30a2f56f9672d6fee758cddafdfe7201d22193ac2c90a"} Oct 09 14:07:34 crc kubenswrapper[4902]: I1009 14:07:34.358547 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5b9b5fc8fb-nlcpz" event={"ID":"ba9ae197-8325-4a07-a174-31f7f2e29978","Type":"ContainerDied","Data":"9aaba01f0eb82ab28a90d2532ad6a660ee07e81b9aea35e9408c28c258ee2103"} Oct 09 14:07:34 crc kubenswrapper[4902]: I1009 14:07:34.358687 4902 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5b9b5fc8fb-nlcpz" Oct 09 14:07:34 crc kubenswrapper[4902]: I1009 14:07:34.361044 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba9ae197-8325-4a07-a174-31f7f2e29978-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "ba9ae197-8325-4a07-a174-31f7f2e29978" (UID: "ba9ae197-8325-4a07-a174-31f7f2e29978"). InnerVolumeSpecName "horizon-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:07:34 crc kubenswrapper[4902]: I1009 14:07:34.365725 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4731e204-78a9-4d3c-8763-ace5d7d97cf7-config-data" (OuterVolumeSpecName: "config-data") pod "4731e204-78a9-4d3c-8763-ace5d7d97cf7" (UID: "4731e204-78a9-4d3c-8763-ace5d7d97cf7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:07:34 crc kubenswrapper[4902]: I1009 14:07:34.367476 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"4731e204-78a9-4d3c-8763-ace5d7d97cf7","Type":"ContainerDied","Data":"dd7442cde78646287e4f8b30a044e3446fd930c76c7f39ef9de1524abda013e8"} Oct 09 14:07:34 crc kubenswrapper[4902]: I1009 14:07:34.367600 4902 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 09 14:07:34 crc kubenswrapper[4902]: I1009 14:07:34.385880 4902 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba9ae197-8325-4a07-a174-31f7f2e29978-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 14:07:34 crc kubenswrapper[4902]: I1009 14:07:34.385924 4902 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/ba9ae197-8325-4a07-a174-31f7f2e29978-horizon-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 09 14:07:34 crc kubenswrapper[4902]: I1009 14:07:34.385937 4902 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/13a3ac18-a408-4c88-8dfc-d04b509941d1-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 09 14:07:34 crc kubenswrapper[4902]: I1009 14:07:34.385949 4902 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4731e204-78a9-4d3c-8763-ace5d7d97cf7-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 14:07:34 crc kubenswrapper[4902]: I1009 14:07:34.385960 4902 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/13a3ac18-a408-4c88-8dfc-d04b509941d1-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 09 14:07:34 crc kubenswrapper[4902]: I1009 14:07:34.389123 4902 generic.go:334] "Generic (PLEG): container finished" podID="9bdf36f9-3d35-4ccd-9d32-86c5ab08b65a" containerID="88e6cbbc2443b0e5c8d486696e9f4c9d25b6b1ac059f60994a39dd1719be60d8" exitCode=0 Oct 09 14:07:34 crc kubenswrapper[4902]: I1009 14:07:34.389185 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-zjfzf" event={"ID":"9bdf36f9-3d35-4ccd-9d32-86c5ab08b65a","Type":"ContainerDied","Data":"88e6cbbc2443b0e5c8d486696e9f4c9d25b6b1ac059f60994a39dd1719be60d8"} Oct 09 14:07:34 crc kubenswrapper[4902]: I1009 14:07:34.396360 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-5cf9b8867d-9zfjl_f5eb5ddd-7b3d-4392-9555-44eaf6e54c51/neutron-httpd/2.log" Oct 09 14:07:34 crc kubenswrapper[4902]: I1009 14:07:34.530279 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 09 14:07:34 crc kubenswrapper[4902]: I1009 14:07:34.537211 4902 scope.go:117] "RemoveContainer" containerID="26401dfab58bf8d5306f89cdd551a4189699bc71c0c67bffa6d8975b03e98b74" Oct 09 14:07:34 crc kubenswrapper[4902]: I1009 14:07:34.541661 4902 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 09 14:07:34 crc kubenswrapper[4902]: I1009 14:07:34.551975 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 09 14:07:34 crc kubenswrapper[4902]: E1009 14:07:34.552454 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4731e204-78a9-4d3c-8763-ace5d7d97cf7" containerName="glance-log" Oct 09 14:07:34 crc kubenswrapper[4902]: I1009 14:07:34.552475 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="4731e204-78a9-4d3c-8763-ace5d7d97cf7" containerName="glance-log" Oct 09 14:07:34 crc kubenswrapper[4902]: E1009 14:07:34.552498 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4731e204-78a9-4d3c-8763-ace5d7d97cf7" containerName="glance-httpd" Oct 09 14:07:34 crc kubenswrapper[4902]: I1009 14:07:34.552506 4902 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="4731e204-78a9-4d3c-8763-ace5d7d97cf7" containerName="glance-httpd" Oct 09 14:07:34 crc kubenswrapper[4902]: E1009 14:07:34.552516 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80011324-6959-416f-a2e0-1ea80e7b32fd" containerName="mariadb-database-create" Oct 09 14:07:34 crc kubenswrapper[4902]: I1009 14:07:34.552524 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="80011324-6959-416f-a2e0-1ea80e7b32fd" containerName="mariadb-database-create" Oct 09 14:07:34 crc kubenswrapper[4902]: E1009 14:07:34.552546 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ace7c5ce-f1fc-4600-9764-0e357b80f849" containerName="mariadb-database-create" Oct 09 14:07:34 crc kubenswrapper[4902]: I1009 14:07:34.552553 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="ace7c5ce-f1fc-4600-9764-0e357b80f849" containerName="mariadb-database-create" Oct 09 14:07:34 crc kubenswrapper[4902]: E1009 14:07:34.552575 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba9ae197-8325-4a07-a174-31f7f2e29978" containerName="horizon-log" Oct 09 14:07:34 crc kubenswrapper[4902]: I1009 14:07:34.552582 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba9ae197-8325-4a07-a174-31f7f2e29978" containerName="horizon-log" Oct 09 14:07:34 crc kubenswrapper[4902]: E1009 14:07:34.552596 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13a3ac18-a408-4c88-8dfc-d04b509941d1" containerName="init" Oct 09 14:07:34 crc kubenswrapper[4902]: I1009 14:07:34.552603 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="13a3ac18-a408-4c88-8dfc-d04b509941d1" containerName="init" Oct 09 14:07:34 crc kubenswrapper[4902]: E1009 14:07:34.552617 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba9ae197-8325-4a07-a174-31f7f2e29978" containerName="horizon" Oct 09 14:07:34 crc kubenswrapper[4902]: I1009 14:07:34.552625 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba9ae197-8325-4a07-a174-31f7f2e29978" containerName="horizon" Oct 09 14:07:34 crc kubenswrapper[4902]: I1009 14:07:34.552822 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="13a3ac18-a408-4c88-8dfc-d04b509941d1" containerName="init" Oct 09 14:07:34 crc kubenswrapper[4902]: I1009 14:07:34.552836 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba9ae197-8325-4a07-a174-31f7f2e29978" containerName="horizon-log" Oct 09 14:07:34 crc kubenswrapper[4902]: I1009 14:07:34.552851 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="4731e204-78a9-4d3c-8763-ace5d7d97cf7" containerName="glance-log" Oct 09 14:07:34 crc kubenswrapper[4902]: I1009 14:07:34.552866 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="80011324-6959-416f-a2e0-1ea80e7b32fd" containerName="mariadb-database-create" Oct 09 14:07:34 crc kubenswrapper[4902]: I1009 14:07:34.552884 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="ace7c5ce-f1fc-4600-9764-0e357b80f849" containerName="mariadb-database-create" Oct 09 14:07:34 crc kubenswrapper[4902]: I1009 14:07:34.552899 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="4731e204-78a9-4d3c-8763-ace5d7d97cf7" containerName="glance-httpd" Oct 09 14:07:34 crc kubenswrapper[4902]: I1009 14:07:34.552911 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba9ae197-8325-4a07-a174-31f7f2e29978" containerName="horizon" Oct 09 14:07:34 crc kubenswrapper[4902]: I1009 14:07:34.554079 4902 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 09 14:07:34 crc kubenswrapper[4902]: I1009 14:07:34.560196 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Oct 09 14:07:34 crc kubenswrapper[4902]: I1009 14:07:34.560327 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Oct 09 14:07:34 crc kubenswrapper[4902]: I1009 14:07:34.575253 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 09 14:07:34 crc kubenswrapper[4902]: I1009 14:07:34.720388 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df90d442-7261-4353-821c-c0e71a43998a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"df90d442-7261-4353-821c-c0e71a43998a\") " pod="openstack/glance-default-internal-api-0" Oct 09 14:07:34 crc kubenswrapper[4902]: I1009 14:07:34.725957 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/df90d442-7261-4353-821c-c0e71a43998a-logs\") pod \"glance-default-internal-api-0\" (UID: \"df90d442-7261-4353-821c-c0e71a43998a\") " pod="openstack/glance-default-internal-api-0" Oct 09 14:07:34 crc kubenswrapper[4902]: I1009 14:07:34.726194 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/df90d442-7261-4353-821c-c0e71a43998a-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"df90d442-7261-4353-821c-c0e71a43998a\") " pod="openstack/glance-default-internal-api-0" Oct 09 14:07:34 crc kubenswrapper[4902]: I1009 14:07:34.726839 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8xb88\" (UniqueName: \"kubernetes.io/projected/df90d442-7261-4353-821c-c0e71a43998a-kube-api-access-8xb88\") pod \"glance-default-internal-api-0\" (UID: \"df90d442-7261-4353-821c-c0e71a43998a\") " pod="openstack/glance-default-internal-api-0" Oct 09 14:07:34 crc kubenswrapper[4902]: I1009 14:07:34.726925 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df90d442-7261-4353-821c-c0e71a43998a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"df90d442-7261-4353-821c-c0e71a43998a\") " pod="openstack/glance-default-internal-api-0" Oct 09 14:07:34 crc kubenswrapper[4902]: I1009 14:07:34.726973 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/df90d442-7261-4353-821c-c0e71a43998a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"df90d442-7261-4353-821c-c0e71a43998a\") " pod="openstack/glance-default-internal-api-0" Oct 09 14:07:34 crc kubenswrapper[4902]: I1009 14:07:34.727089 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/df90d442-7261-4353-821c-c0e71a43998a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"df90d442-7261-4353-821c-c0e71a43998a\") " pod="openstack/glance-default-internal-api-0" Oct 09 14:07:34 crc kubenswrapper[4902]: I1009 14:07:34.727124 4902 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"df90d442-7261-4353-821c-c0e71a43998a\") " pod="openstack/glance-default-internal-api-0" Oct 09 14:07:34 crc kubenswrapper[4902]: I1009 14:07:34.767907 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-75c8ddd69c-gjtkj"] Oct 09 14:07:34 crc kubenswrapper[4902]: I1009 14:07:34.808476 4902 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-75c8ddd69c-gjtkj"] Oct 09 14:07:34 crc kubenswrapper[4902]: I1009 14:07:34.821681 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Oct 09 14:07:34 crc kubenswrapper[4902]: I1009 14:07:34.828723 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/df90d442-7261-4353-821c-c0e71a43998a-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"df90d442-7261-4353-821c-c0e71a43998a\") " pod="openstack/glance-default-internal-api-0" Oct 09 14:07:34 crc kubenswrapper[4902]: I1009 14:07:34.829864 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8xb88\" (UniqueName: \"kubernetes.io/projected/df90d442-7261-4353-821c-c0e71a43998a-kube-api-access-8xb88\") pod \"glance-default-internal-api-0\" (UID: \"df90d442-7261-4353-821c-c0e71a43998a\") " pod="openstack/glance-default-internal-api-0" Oct 09 14:07:34 crc kubenswrapper[4902]: I1009 14:07:34.829939 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df90d442-7261-4353-821c-c0e71a43998a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"df90d442-7261-4353-821c-c0e71a43998a\") " pod="openstack/glance-default-internal-api-0" Oct 09 14:07:34 crc kubenswrapper[4902]: I1009 14:07:34.829980 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/df90d442-7261-4353-821c-c0e71a43998a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"df90d442-7261-4353-821c-c0e71a43998a\") " pod="openstack/glance-default-internal-api-0" Oct 09 14:07:34 crc kubenswrapper[4902]: I1009 14:07:34.830022 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/df90d442-7261-4353-821c-c0e71a43998a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"df90d442-7261-4353-821c-c0e71a43998a\") " pod="openstack/glance-default-internal-api-0" Oct 09 14:07:34 crc kubenswrapper[4902]: I1009 14:07:34.830063 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"df90d442-7261-4353-821c-c0e71a43998a\") " pod="openstack/glance-default-internal-api-0" Oct 09 14:07:34 crc kubenswrapper[4902]: I1009 14:07:34.830112 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df90d442-7261-4353-821c-c0e71a43998a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"df90d442-7261-4353-821c-c0e71a43998a\") " pod="openstack/glance-default-internal-api-0" Oct 09 14:07:34 crc kubenswrapper[4902]: I1009 
14:07:34.830141 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/df90d442-7261-4353-821c-c0e71a43998a-logs\") pod \"glance-default-internal-api-0\" (UID: \"df90d442-7261-4353-821c-c0e71a43998a\") " pod="openstack/glance-default-internal-api-0" Oct 09 14:07:34 crc kubenswrapper[4902]: I1009 14:07:34.830685 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/df90d442-7261-4353-821c-c0e71a43998a-logs\") pod \"glance-default-internal-api-0\" (UID: \"df90d442-7261-4353-821c-c0e71a43998a\") " pod="openstack/glance-default-internal-api-0" Oct 09 14:07:34 crc kubenswrapper[4902]: I1009 14:07:34.831528 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5b9b5fc8fb-nlcpz"] Oct 09 14:07:34 crc kubenswrapper[4902]: I1009 14:07:34.831902 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/df90d442-7261-4353-821c-c0e71a43998a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"df90d442-7261-4353-821c-c0e71a43998a\") " pod="openstack/glance-default-internal-api-0" Oct 09 14:07:34 crc kubenswrapper[4902]: I1009 14:07:34.833067 4902 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"df90d442-7261-4353-821c-c0e71a43998a\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/glance-default-internal-api-0" Oct 09 14:07:34 crc kubenswrapper[4902]: I1009 14:07:34.834976 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/df90d442-7261-4353-821c-c0e71a43998a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"df90d442-7261-4353-821c-c0e71a43998a\") " pod="openstack/glance-default-internal-api-0" Oct 09 14:07:34 crc kubenswrapper[4902]: I1009 14:07:34.842021 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/df90d442-7261-4353-821c-c0e71a43998a-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"df90d442-7261-4353-821c-c0e71a43998a\") " pod="openstack/glance-default-internal-api-0" Oct 09 14:07:34 crc kubenswrapper[4902]: I1009 14:07:34.847026 4902 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-5b9b5fc8fb-nlcpz"] Oct 09 14:07:34 crc kubenswrapper[4902]: I1009 14:07:34.851269 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df90d442-7261-4353-821c-c0e71a43998a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"df90d442-7261-4353-821c-c0e71a43998a\") " pod="openstack/glance-default-internal-api-0" Oct 09 14:07:34 crc kubenswrapper[4902]: I1009 14:07:34.852102 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df90d442-7261-4353-821c-c0e71a43998a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"df90d442-7261-4353-821c-c0e71a43998a\") " pod="openstack/glance-default-internal-api-0" Oct 09 14:07:34 crc kubenswrapper[4902]: I1009 14:07:34.852686 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8xb88\" (UniqueName: 
\"kubernetes.io/projected/df90d442-7261-4353-821c-c0e71a43998a-kube-api-access-8xb88\") pod \"glance-default-internal-api-0\" (UID: \"df90d442-7261-4353-821c-c0e71a43998a\") " pod="openstack/glance-default-internal-api-0" Oct 09 14:07:34 crc kubenswrapper[4902]: I1009 14:07:34.897130 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"df90d442-7261-4353-821c-c0e71a43998a\") " pod="openstack/glance-default-internal-api-0" Oct 09 14:07:34 crc kubenswrapper[4902]: I1009 14:07:34.924599 4902 scope.go:117] "RemoveContainer" containerID="c56bd3765dfec2fab257f46ddb8a13f723af38fda399398302589b4e0d3a8b34" Oct 09 14:07:34 crc kubenswrapper[4902]: I1009 14:07:34.940735 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 09 14:07:34 crc kubenswrapper[4902]: I1009 14:07:34.982650 4902 scope.go:117] "RemoveContainer" containerID="dcc8d525cfd7ea074939c35aeeffd6b00b2a53c9c63c970bd438d28a81ac2e40" Oct 09 14:07:35 crc kubenswrapper[4902]: I1009 14:07:35.041575 4902 scope.go:117] "RemoveContainer" containerID="b130d1ec682a3095108f2f648767fe70d179728d61a4c2ba1e20093d4ba09cdf" Oct 09 14:07:35 crc kubenswrapper[4902]: I1009 14:07:35.436260 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"9c0515fd-b685-4dab-909a-3f4147e19a59","Type":"ContainerStarted","Data":"6d9fbbeb32adb5c0d9085824c040bbd87fea2cc6bf76a1e7d360b0170dcf283c"} Oct 09 14:07:35 crc kubenswrapper[4902]: I1009 14:07:35.468442 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5784cf869f-s2w6w" event={"ID":"abbbf9c3-be8f-4199-9f68-626d18441730","Type":"ContainerStarted","Data":"86c8206a3f8db76cd10e719402b512a670e6bb9d936d05ad3295efba0946ae66"} Oct 09 14:07:35 crc kubenswrapper[4902]: I1009 14:07:35.469315 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5784cf869f-s2w6w" Oct 09 14:07:35 crc kubenswrapper[4902]: I1009 14:07:35.477945 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7dbd9b6574-b5dht" event={"ID":"11b3f7c7-66a5-485c-922e-b5568e2f9f1c","Type":"ContainerStarted","Data":"26d2dfaf782a4e7ec63533a2061772ba66d0a2617172cb3e733fb3e9b3c09246"} Oct 09 14:07:35 crc kubenswrapper[4902]: I1009 14:07:35.500240 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e5dd917a-e7aa-4092-906a-321ba2f744d8","Type":"ContainerStarted","Data":"805fe946ab82c4934f514c6c13a35823ffe8bc4eab416a192f43445682cea526"} Oct 09 14:07:35 crc kubenswrapper[4902]: I1009 14:07:35.500432 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 09 14:07:35 crc kubenswrapper[4902]: I1009 14:07:35.500450 4902 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e5dd917a-e7aa-4092-906a-321ba2f744d8" containerName="sg-core" containerID="cri-o://391c2456b125ea67f4ca26def719bb38bd937cdbbdeb68f166ae1932db02db95" gracePeriod=30 Oct 09 14:07:35 crc kubenswrapper[4902]: I1009 14:07:35.500479 4902 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e5dd917a-e7aa-4092-906a-321ba2f744d8" containerName="ceilometer-notification-agent" 
containerID="cri-o://e7f24a4e051a495560a0dd23e83b7fb1fd844bfdb6b9c51f6dde90c1ae50d2e4" gracePeriod=30 Oct 09 14:07:35 crc kubenswrapper[4902]: I1009 14:07:35.500521 4902 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e5dd917a-e7aa-4092-906a-321ba2f744d8" containerName="proxy-httpd" containerID="cri-o://805fe946ab82c4934f514c6c13a35823ffe8bc4eab416a192f43445682cea526" gracePeriod=30 Oct 09 14:07:35 crc kubenswrapper[4902]: I1009 14:07:35.500431 4902 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e5dd917a-e7aa-4092-906a-321ba2f744d8" containerName="ceilometer-central-agent" containerID="cri-o://f2f5c2e8f29e84af1b4fb2b3129fa51f289f4e3d68c5af374be43fbcabf6c96f" gracePeriod=30 Oct 09 14:07:35 crc kubenswrapper[4902]: I1009 14:07:35.503856 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"e4af9da4-ae7e-4c75-b670-fa5246095383","Type":"ContainerStarted","Data":"759c9ee1e98acacc7c1d73448e238e0014a2538c6c0ba993b6a4c8e58b918291"} Oct 09 14:07:35 crc kubenswrapper[4902]: I1009 14:07:35.505318 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-896cb696f-kkg85" event={"ID":"fe538ee2-2e8c-406f-8e70-bc56325ec408","Type":"ContainerStarted","Data":"2d584b961e52ad58727a0f219f1da820b5c1fa83fa2e30423919b6d74d2fcbdd"} Oct 09 14:07:35 crc kubenswrapper[4902]: I1009 14:07:35.523581 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5784cf869f-s2w6w" podStartSLOduration=5.523558665 podStartE2EDuration="5.523558665s" podCreationTimestamp="2025-10-09 14:07:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 14:07:35.51092222 +0000 UTC m=+1002.708781294" watchObservedRunningTime="2025-10-09 14:07:35.523558665 +0000 UTC m=+1002.721417729" Oct 09 14:07:35 crc kubenswrapper[4902]: I1009 14:07:35.571300 4902 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="13a3ac18-a408-4c88-8dfc-d04b509941d1" path="/var/lib/kubelet/pods/13a3ac18-a408-4c88-8dfc-d04b509941d1/volumes" Oct 09 14:07:35 crc kubenswrapper[4902]: I1009 14:07:35.573053 4902 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4731e204-78a9-4d3c-8763-ace5d7d97cf7" path="/var/lib/kubelet/pods/4731e204-78a9-4d3c-8763-ace5d7d97cf7/volumes" Oct 09 14:07:35 crc kubenswrapper[4902]: I1009 14:07:35.574168 4902 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba9ae197-8325-4a07-a174-31f7f2e29978" path="/var/lib/kubelet/pods/ba9ae197-8325-4a07-a174-31f7f2e29978/volumes" Oct 09 14:07:35 crc kubenswrapper[4902]: I1009 14:07:35.575755 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-54f57498cd-cv95r"] Oct 09 14:07:35 crc kubenswrapper[4902]: I1009 14:07:35.577956 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-54f57498cd-cv95r" Oct 09 14:07:35 crc kubenswrapper[4902]: I1009 14:07:35.587777 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Oct 09 14:07:35 crc kubenswrapper[4902]: I1009 14:07:35.588022 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Oct 09 14:07:35 crc kubenswrapper[4902]: I1009 14:07:35.607888 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.692158955 podStartE2EDuration="10.607868161s" podCreationTimestamp="2025-10-09 14:07:25 +0000 UTC" firstStartedPulling="2025-10-09 14:07:26.806883466 +0000 UTC m=+994.004742530" lastFinishedPulling="2025-10-09 14:07:33.722592672 +0000 UTC m=+1000.920451736" observedRunningTime="2025-10-09 14:07:35.553169795 +0000 UTC m=+1002.751028859" watchObservedRunningTime="2025-10-09 14:07:35.607868161 +0000 UTC m=+1002.805727225" Oct 09 14:07:35 crc kubenswrapper[4902]: I1009 14:07:35.625776 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-54f57498cd-cv95r"] Oct 09 14:07:35 crc kubenswrapper[4902]: I1009 14:07:35.659560 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8e22729c-3eef-405e-bf5a-5654f9795d57-public-tls-certs\") pod \"barbican-api-54f57498cd-cv95r\" (UID: \"8e22729c-3eef-405e-bf5a-5654f9795d57\") " pod="openstack/barbican-api-54f57498cd-cv95r" Oct 09 14:07:35 crc kubenswrapper[4902]: I1009 14:07:35.659644 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8e22729c-3eef-405e-bf5a-5654f9795d57-logs\") pod \"barbican-api-54f57498cd-cv95r\" (UID: \"8e22729c-3eef-405e-bf5a-5654f9795d57\") " pod="openstack/barbican-api-54f57498cd-cv95r" Oct 09 14:07:35 crc kubenswrapper[4902]: I1009 14:07:35.659697 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e22729c-3eef-405e-bf5a-5654f9795d57-combined-ca-bundle\") pod \"barbican-api-54f57498cd-cv95r\" (UID: \"8e22729c-3eef-405e-bf5a-5654f9795d57\") " pod="openstack/barbican-api-54f57498cd-cv95r" Oct 09 14:07:35 crc kubenswrapper[4902]: I1009 14:07:35.659728 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8e22729c-3eef-405e-bf5a-5654f9795d57-config-data-custom\") pod \"barbican-api-54f57498cd-cv95r\" (UID: \"8e22729c-3eef-405e-bf5a-5654f9795d57\") " pod="openstack/barbican-api-54f57498cd-cv95r" Oct 09 14:07:35 crc kubenswrapper[4902]: I1009 14:07:35.659831 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e22729c-3eef-405e-bf5a-5654f9795d57-config-data\") pod \"barbican-api-54f57498cd-cv95r\" (UID: \"8e22729c-3eef-405e-bf5a-5654f9795d57\") " pod="openstack/barbican-api-54f57498cd-cv95r" Oct 09 14:07:35 crc kubenswrapper[4902]: I1009 14:07:35.659853 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8e22729c-3eef-405e-bf5a-5654f9795d57-internal-tls-certs\") pod \"barbican-api-54f57498cd-cv95r\" (UID: 
\"8e22729c-3eef-405e-bf5a-5654f9795d57\") " pod="openstack/barbican-api-54f57498cd-cv95r" Oct 09 14:07:35 crc kubenswrapper[4902]: I1009 14:07:35.659876 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x5bjt\" (UniqueName: \"kubernetes.io/projected/8e22729c-3eef-405e-bf5a-5654f9795d57-kube-api-access-x5bjt\") pod \"barbican-api-54f57498cd-cv95r\" (UID: \"8e22729c-3eef-405e-bf5a-5654f9795d57\") " pod="openstack/barbican-api-54f57498cd-cv95r" Oct 09 14:07:35 crc kubenswrapper[4902]: I1009 14:07:35.762450 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8e22729c-3eef-405e-bf5a-5654f9795d57-public-tls-certs\") pod \"barbican-api-54f57498cd-cv95r\" (UID: \"8e22729c-3eef-405e-bf5a-5654f9795d57\") " pod="openstack/barbican-api-54f57498cd-cv95r" Oct 09 14:07:35 crc kubenswrapper[4902]: I1009 14:07:35.762608 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8e22729c-3eef-405e-bf5a-5654f9795d57-logs\") pod \"barbican-api-54f57498cd-cv95r\" (UID: \"8e22729c-3eef-405e-bf5a-5654f9795d57\") " pod="openstack/barbican-api-54f57498cd-cv95r" Oct 09 14:07:35 crc kubenswrapper[4902]: I1009 14:07:35.764867 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8e22729c-3eef-405e-bf5a-5654f9795d57-logs\") pod \"barbican-api-54f57498cd-cv95r\" (UID: \"8e22729c-3eef-405e-bf5a-5654f9795d57\") " pod="openstack/barbican-api-54f57498cd-cv95r" Oct 09 14:07:35 crc kubenswrapper[4902]: I1009 14:07:35.766038 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e22729c-3eef-405e-bf5a-5654f9795d57-combined-ca-bundle\") pod \"barbican-api-54f57498cd-cv95r\" (UID: \"8e22729c-3eef-405e-bf5a-5654f9795d57\") " pod="openstack/barbican-api-54f57498cd-cv95r" Oct 09 14:07:35 crc kubenswrapper[4902]: I1009 14:07:35.766078 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8e22729c-3eef-405e-bf5a-5654f9795d57-config-data-custom\") pod \"barbican-api-54f57498cd-cv95r\" (UID: \"8e22729c-3eef-405e-bf5a-5654f9795d57\") " pod="openstack/barbican-api-54f57498cd-cv95r" Oct 09 14:07:35 crc kubenswrapper[4902]: I1009 14:07:35.766773 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e22729c-3eef-405e-bf5a-5654f9795d57-config-data\") pod \"barbican-api-54f57498cd-cv95r\" (UID: \"8e22729c-3eef-405e-bf5a-5654f9795d57\") " pod="openstack/barbican-api-54f57498cd-cv95r" Oct 09 14:07:35 crc kubenswrapper[4902]: I1009 14:07:35.766848 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8e22729c-3eef-405e-bf5a-5654f9795d57-internal-tls-certs\") pod \"barbican-api-54f57498cd-cv95r\" (UID: \"8e22729c-3eef-405e-bf5a-5654f9795d57\") " pod="openstack/barbican-api-54f57498cd-cv95r" Oct 09 14:07:35 crc kubenswrapper[4902]: I1009 14:07:35.766998 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x5bjt\" (UniqueName: \"kubernetes.io/projected/8e22729c-3eef-405e-bf5a-5654f9795d57-kube-api-access-x5bjt\") pod \"barbican-api-54f57498cd-cv95r\" (UID: 
\"8e22729c-3eef-405e-bf5a-5654f9795d57\") " pod="openstack/barbican-api-54f57498cd-cv95r" Oct 09 14:07:35 crc kubenswrapper[4902]: I1009 14:07:35.772664 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8e22729c-3eef-405e-bf5a-5654f9795d57-config-data-custom\") pod \"barbican-api-54f57498cd-cv95r\" (UID: \"8e22729c-3eef-405e-bf5a-5654f9795d57\") " pod="openstack/barbican-api-54f57498cd-cv95r" Oct 09 14:07:35 crc kubenswrapper[4902]: I1009 14:07:35.773286 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8e22729c-3eef-405e-bf5a-5654f9795d57-public-tls-certs\") pod \"barbican-api-54f57498cd-cv95r\" (UID: \"8e22729c-3eef-405e-bf5a-5654f9795d57\") " pod="openstack/barbican-api-54f57498cd-cv95r" Oct 09 14:07:35 crc kubenswrapper[4902]: I1009 14:07:35.773618 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e22729c-3eef-405e-bf5a-5654f9795d57-config-data\") pod \"barbican-api-54f57498cd-cv95r\" (UID: \"8e22729c-3eef-405e-bf5a-5654f9795d57\") " pod="openstack/barbican-api-54f57498cd-cv95r" Oct 09 14:07:35 crc kubenswrapper[4902]: I1009 14:07:35.775802 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e22729c-3eef-405e-bf5a-5654f9795d57-combined-ca-bundle\") pod \"barbican-api-54f57498cd-cv95r\" (UID: \"8e22729c-3eef-405e-bf5a-5654f9795d57\") " pod="openstack/barbican-api-54f57498cd-cv95r" Oct 09 14:07:35 crc kubenswrapper[4902]: I1009 14:07:35.780221 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8e22729c-3eef-405e-bf5a-5654f9795d57-internal-tls-certs\") pod \"barbican-api-54f57498cd-cv95r\" (UID: \"8e22729c-3eef-405e-bf5a-5654f9795d57\") " pod="openstack/barbican-api-54f57498cd-cv95r" Oct 09 14:07:35 crc kubenswrapper[4902]: I1009 14:07:35.793699 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x5bjt\" (UniqueName: \"kubernetes.io/projected/8e22729c-3eef-405e-bf5a-5654f9795d57-kube-api-access-x5bjt\") pod \"barbican-api-54f57498cd-cv95r\" (UID: \"8e22729c-3eef-405e-bf5a-5654f9795d57\") " pod="openstack/barbican-api-54f57498cd-cv95r" Oct 09 14:07:35 crc kubenswrapper[4902]: I1009 14:07:35.849923 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 09 14:07:35 crc kubenswrapper[4902]: I1009 14:07:35.928894 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-54f57498cd-cv95r" Oct 09 14:07:36 crc kubenswrapper[4902]: I1009 14:07:36.148683 4902 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-zjfzf" Oct 09 14:07:36 crc kubenswrapper[4902]: I1009 14:07:36.279570 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8wls5\" (UniqueName: \"kubernetes.io/projected/9bdf36f9-3d35-4ccd-9d32-86c5ab08b65a-kube-api-access-8wls5\") pod \"9bdf36f9-3d35-4ccd-9d32-86c5ab08b65a\" (UID: \"9bdf36f9-3d35-4ccd-9d32-86c5ab08b65a\") " Oct 09 14:07:36 crc kubenswrapper[4902]: I1009 14:07:36.286003 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9bdf36f9-3d35-4ccd-9d32-86c5ab08b65a-kube-api-access-8wls5" (OuterVolumeSpecName: "kube-api-access-8wls5") pod "9bdf36f9-3d35-4ccd-9d32-86c5ab08b65a" (UID: "9bdf36f9-3d35-4ccd-9d32-86c5ab08b65a"). InnerVolumeSpecName "kube-api-access-8wls5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 14:07:36 crc kubenswrapper[4902]: I1009 14:07:36.383947 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8wls5\" (UniqueName: \"kubernetes.io/projected/9bdf36f9-3d35-4ccd-9d32-86c5ab08b65a-kube-api-access-8wls5\") on node \"crc\" DevicePath \"\"" Oct 09 14:07:36 crc kubenswrapper[4902]: I1009 14:07:36.576931 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"df90d442-7261-4353-821c-c0e71a43998a","Type":"ContainerStarted","Data":"26704488bae752f8ba62c0b8108f6bb5ea0f6b6c967b2a1322d8f8489dd8e102"} Oct 09 14:07:36 crc kubenswrapper[4902]: I1009 14:07:36.651557 4902 generic.go:334] "Generic (PLEG): container finished" podID="e5dd917a-e7aa-4092-906a-321ba2f744d8" containerID="805fe946ab82c4934f514c6c13a35823ffe8bc4eab416a192f43445682cea526" exitCode=0 Oct 09 14:07:36 crc kubenswrapper[4902]: I1009 14:07:36.651596 4902 generic.go:334] "Generic (PLEG): container finished" podID="e5dd917a-e7aa-4092-906a-321ba2f744d8" containerID="391c2456b125ea67f4ca26def719bb38bd937cdbbdeb68f166ae1932db02db95" exitCode=2 Oct 09 14:07:36 crc kubenswrapper[4902]: I1009 14:07:36.651607 4902 generic.go:334] "Generic (PLEG): container finished" podID="e5dd917a-e7aa-4092-906a-321ba2f744d8" containerID="e7f24a4e051a495560a0dd23e83b7fb1fd844bfdb6b9c51f6dde90c1ae50d2e4" exitCode=0 Oct 09 14:07:36 crc kubenswrapper[4902]: I1009 14:07:36.651621 4902 generic.go:334] "Generic (PLEG): container finished" podID="e5dd917a-e7aa-4092-906a-321ba2f744d8" containerID="f2f5c2e8f29e84af1b4fb2b3129fa51f289f4e3d68c5af374be43fbcabf6c96f" exitCode=0 Oct 09 14:07:36 crc kubenswrapper[4902]: I1009 14:07:36.651712 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e5dd917a-e7aa-4092-906a-321ba2f744d8","Type":"ContainerDied","Data":"805fe946ab82c4934f514c6c13a35823ffe8bc4eab416a192f43445682cea526"} Oct 09 14:07:36 crc kubenswrapper[4902]: I1009 14:07:36.651736 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e5dd917a-e7aa-4092-906a-321ba2f744d8","Type":"ContainerDied","Data":"391c2456b125ea67f4ca26def719bb38bd937cdbbdeb68f166ae1932db02db95"} Oct 09 14:07:36 crc kubenswrapper[4902]: I1009 14:07:36.651759 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e5dd917a-e7aa-4092-906a-321ba2f744d8","Type":"ContainerDied","Data":"e7f24a4e051a495560a0dd23e83b7fb1fd844bfdb6b9c51f6dde90c1ae50d2e4"} Oct 09 14:07:36 crc kubenswrapper[4902]: I1009 14:07:36.651769 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"e5dd917a-e7aa-4092-906a-321ba2f744d8","Type":"ContainerDied","Data":"f2f5c2e8f29e84af1b4fb2b3129fa51f289f4e3d68c5af374be43fbcabf6c96f"} Oct 09 14:07:36 crc kubenswrapper[4902]: I1009 14:07:36.671659 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-896cb696f-kkg85" event={"ID":"fe538ee2-2e8c-406f-8e70-bc56325ec408","Type":"ContainerStarted","Data":"bc1be40fa858700b8b0537d3d779a247a27429df65822faeae4cc3afabe465f0"} Oct 09 14:07:36 crc kubenswrapper[4902]: I1009 14:07:36.679924 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-zjfzf" event={"ID":"9bdf36f9-3d35-4ccd-9d32-86c5ab08b65a","Type":"ContainerDied","Data":"053aa2ad9bd107d223bda030a0b78e1fc5120853f3111451522baab068663073"} Oct 09 14:07:36 crc kubenswrapper[4902]: I1009 14:07:36.679975 4902 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="053aa2ad9bd107d223bda030a0b78e1fc5120853f3111451522baab068663073" Oct 09 14:07:36 crc kubenswrapper[4902]: I1009 14:07:36.680082 4902 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-zjfzf" Oct 09 14:07:36 crc kubenswrapper[4902]: I1009 14:07:36.701259 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"9c0515fd-b685-4dab-909a-3f4147e19a59","Type":"ContainerStarted","Data":"8e6ffa025380779ddda4eae9961f1a35dbb2af27b88724312f8a3ea82a7a064e"} Oct 09 14:07:36 crc kubenswrapper[4902]: I1009 14:07:36.706076 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-54f57498cd-cv95r"] Oct 09 14:07:36 crc kubenswrapper[4902]: I1009 14:07:36.722305 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-896cb696f-kkg85" podStartSLOduration=4.108952318 podStartE2EDuration="8.722279416s" podCreationTimestamp="2025-10-09 14:07:28 +0000 UTC" firstStartedPulling="2025-10-09 14:07:28.970593772 +0000 UTC m=+996.168452826" lastFinishedPulling="2025-10-09 14:07:33.58392086 +0000 UTC m=+1000.781779924" observedRunningTime="2025-10-09 14:07:36.720545634 +0000 UTC m=+1003.918404738" watchObservedRunningTime="2025-10-09 14:07:36.722279416 +0000 UTC m=+1003.920138480" Oct 09 14:07:36 crc kubenswrapper[4902]: W1009 14:07:36.767242 4902 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8e22729c_3eef_405e_bf5a_5654f9795d57.slice/crio-b86bb7c11c15ac039189175f8efb8cf46dd851b8204991cfc2c2e4f179dabc68 WatchSource:0}: Error finding container b86bb7c11c15ac039189175f8efb8cf46dd851b8204991cfc2c2e4f179dabc68: Status 404 returned error can't find the container with id b86bb7c11c15ac039189175f8efb8cf46dd851b8204991cfc2c2e4f179dabc68 Oct 09 14:07:36 crc kubenswrapper[4902]: I1009 14:07:36.773483 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7dbd9b6574-b5dht" event={"ID":"11b3f7c7-66a5-485c-922e-b5568e2f9f1c","Type":"ContainerStarted","Data":"6c422bc7f2f7502fb51049898230b157309b9689e49c948572cd1461a53f2641"} Oct 09 14:07:36 crc kubenswrapper[4902]: I1009 14:07:36.811674 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-7dbd9b6574-b5dht" podStartSLOduration=4.490300674 podStartE2EDuration="8.811655873s" podCreationTimestamp="2025-10-09 14:07:28 +0000 UTC" firstStartedPulling="2025-10-09 
14:07:29.263129218 +0000 UTC m=+996.460988282" lastFinishedPulling="2025-10-09 14:07:33.584484407 +0000 UTC m=+1000.782343481" observedRunningTime="2025-10-09 14:07:36.796334107 +0000 UTC m=+1003.994193171" watchObservedRunningTime="2025-10-09 14:07:36.811655873 +0000 UTC m=+1004.009514937" Oct 09 14:07:36 crc kubenswrapper[4902]: I1009 14:07:36.999580 4902 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 09 14:07:37 crc kubenswrapper[4902]: I1009 14:07:37.106851 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e5dd917a-e7aa-4092-906a-321ba2f744d8-sg-core-conf-yaml\") pod \"e5dd917a-e7aa-4092-906a-321ba2f744d8\" (UID: \"e5dd917a-e7aa-4092-906a-321ba2f744d8\") " Oct 09 14:07:37 crc kubenswrapper[4902]: I1009 14:07:37.106975 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5dd917a-e7aa-4092-906a-321ba2f744d8-combined-ca-bundle\") pod \"e5dd917a-e7aa-4092-906a-321ba2f744d8\" (UID: \"e5dd917a-e7aa-4092-906a-321ba2f744d8\") " Oct 09 14:07:37 crc kubenswrapper[4902]: I1009 14:07:37.107010 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e5dd917a-e7aa-4092-906a-321ba2f744d8-log-httpd\") pod \"e5dd917a-e7aa-4092-906a-321ba2f744d8\" (UID: \"e5dd917a-e7aa-4092-906a-321ba2f744d8\") " Oct 09 14:07:37 crc kubenswrapper[4902]: I1009 14:07:37.107093 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e5dd917a-e7aa-4092-906a-321ba2f744d8-run-httpd\") pod \"e5dd917a-e7aa-4092-906a-321ba2f744d8\" (UID: \"e5dd917a-e7aa-4092-906a-321ba2f744d8\") " Oct 09 14:07:37 crc kubenswrapper[4902]: I1009 14:07:37.107151 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5dd917a-e7aa-4092-906a-321ba2f744d8-config-data\") pod \"e5dd917a-e7aa-4092-906a-321ba2f744d8\" (UID: \"e5dd917a-e7aa-4092-906a-321ba2f744d8\") " Oct 09 14:07:37 crc kubenswrapper[4902]: I1009 14:07:37.107175 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wnv2b\" (UniqueName: \"kubernetes.io/projected/e5dd917a-e7aa-4092-906a-321ba2f744d8-kube-api-access-wnv2b\") pod \"e5dd917a-e7aa-4092-906a-321ba2f744d8\" (UID: \"e5dd917a-e7aa-4092-906a-321ba2f744d8\") " Oct 09 14:07:37 crc kubenswrapper[4902]: I1009 14:07:37.107213 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e5dd917a-e7aa-4092-906a-321ba2f744d8-scripts\") pod \"e5dd917a-e7aa-4092-906a-321ba2f744d8\" (UID: \"e5dd917a-e7aa-4092-906a-321ba2f744d8\") " Oct 09 14:07:37 crc kubenswrapper[4902]: I1009 14:07:37.107944 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e5dd917a-e7aa-4092-906a-321ba2f744d8-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "e5dd917a-e7aa-4092-906a-321ba2f744d8" (UID: "e5dd917a-e7aa-4092-906a-321ba2f744d8"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 14:07:37 crc kubenswrapper[4902]: I1009 14:07:37.108265 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e5dd917a-e7aa-4092-906a-321ba2f744d8-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "e5dd917a-e7aa-4092-906a-321ba2f744d8" (UID: "e5dd917a-e7aa-4092-906a-321ba2f744d8"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 14:07:37 crc kubenswrapper[4902]: I1009 14:07:37.141782 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5dd917a-e7aa-4092-906a-321ba2f744d8-kube-api-access-wnv2b" (OuterVolumeSpecName: "kube-api-access-wnv2b") pod "e5dd917a-e7aa-4092-906a-321ba2f744d8" (UID: "e5dd917a-e7aa-4092-906a-321ba2f744d8"). InnerVolumeSpecName "kube-api-access-wnv2b". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 14:07:37 crc kubenswrapper[4902]: I1009 14:07:37.141935 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5dd917a-e7aa-4092-906a-321ba2f744d8-scripts" (OuterVolumeSpecName: "scripts") pod "e5dd917a-e7aa-4092-906a-321ba2f744d8" (UID: "e5dd917a-e7aa-4092-906a-321ba2f744d8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:07:37 crc kubenswrapper[4902]: I1009 14:07:37.187651 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5dd917a-e7aa-4092-906a-321ba2f744d8-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "e5dd917a-e7aa-4092-906a-321ba2f744d8" (UID: "e5dd917a-e7aa-4092-906a-321ba2f744d8"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:07:37 crc kubenswrapper[4902]: I1009 14:07:37.211601 4902 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e5dd917a-e7aa-4092-906a-321ba2f744d8-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 09 14:07:37 crc kubenswrapper[4902]: I1009 14:07:37.211629 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wnv2b\" (UniqueName: \"kubernetes.io/projected/e5dd917a-e7aa-4092-906a-321ba2f744d8-kube-api-access-wnv2b\") on node \"crc\" DevicePath \"\"" Oct 09 14:07:37 crc kubenswrapper[4902]: I1009 14:07:37.211641 4902 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e5dd917a-e7aa-4092-906a-321ba2f744d8-scripts\") on node \"crc\" DevicePath \"\"" Oct 09 14:07:37 crc kubenswrapper[4902]: I1009 14:07:37.211650 4902 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e5dd917a-e7aa-4092-906a-321ba2f744d8-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 09 14:07:37 crc kubenswrapper[4902]: I1009 14:07:37.211659 4902 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e5dd917a-e7aa-4092-906a-321ba2f744d8-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 09 14:07:37 crc kubenswrapper[4902]: I1009 14:07:37.307805 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5dd917a-e7aa-4092-906a-321ba2f744d8-config-data" (OuterVolumeSpecName: "config-data") pod "e5dd917a-e7aa-4092-906a-321ba2f744d8" (UID: "e5dd917a-e7aa-4092-906a-321ba2f744d8"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:07:37 crc kubenswrapper[4902]: I1009 14:07:37.318978 4902 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5dd917a-e7aa-4092-906a-321ba2f744d8-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 14:07:37 crc kubenswrapper[4902]: I1009 14:07:37.332552 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5dd917a-e7aa-4092-906a-321ba2f744d8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e5dd917a-e7aa-4092-906a-321ba2f744d8" (UID: "e5dd917a-e7aa-4092-906a-321ba2f744d8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:07:37 crc kubenswrapper[4902]: I1009 14:07:37.421386 4902 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5dd917a-e7aa-4092-906a-321ba2f744d8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 14:07:37 crc kubenswrapper[4902]: I1009 14:07:37.829710 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"9c0515fd-b685-4dab-909a-3f4147e19a59","Type":"ContainerStarted","Data":"7915d4f41ff99e0fa50dfef52322abebdcb1c2bd92e4a24e9852c1ab599238d0"} Oct 09 14:07:37 crc kubenswrapper[4902]: I1009 14:07:37.864319 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=5.864296642 podStartE2EDuration="5.864296642s" podCreationTimestamp="2025-10-09 14:07:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 14:07:37.851864723 +0000 UTC m=+1005.049723777" watchObservedRunningTime="2025-10-09 14:07:37.864296642 +0000 UTC m=+1005.062155706" Oct 09 14:07:37 crc kubenswrapper[4902]: I1009 14:07:37.868225 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-54f57498cd-cv95r" event={"ID":"8e22729c-3eef-405e-bf5a-5654f9795d57","Type":"ContainerStarted","Data":"94fb21534a0a2b0f4fc5d4f90b5fce498c037ca189fc88de335e7aff1a897dbe"} Oct 09 14:07:37 crc kubenswrapper[4902]: I1009 14:07:37.868281 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-54f57498cd-cv95r" event={"ID":"8e22729c-3eef-405e-bf5a-5654f9795d57","Type":"ContainerStarted","Data":"2f96c7086ee2f3a8230e21c914250b65cd558c926ee63f8c6a0a9262390483d1"} Oct 09 14:07:37 crc kubenswrapper[4902]: I1009 14:07:37.868297 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-54f57498cd-cv95r" event={"ID":"8e22729c-3eef-405e-bf5a-5654f9795d57","Type":"ContainerStarted","Data":"b86bb7c11c15ac039189175f8efb8cf46dd851b8204991cfc2c2e4f179dabc68"} Oct 09 14:07:37 crc kubenswrapper[4902]: I1009 14:07:37.868845 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-54f57498cd-cv95r" Oct 09 14:07:37 crc kubenswrapper[4902]: I1009 14:07:37.868899 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-54f57498cd-cv95r" Oct 09 14:07:37 crc kubenswrapper[4902]: I1009 14:07:37.885194 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"f5566f62-4d11-4503-ab45-7f2d727bc397","Type":"ContainerStarted","Data":"0bbbb0c9fd9dc468b5570c9ed70ada5c8e439f9c046fc62ddedc6b9c97bbd6da"} Oct 09 14:07:37 crc 
kubenswrapper[4902]: I1009 14:07:37.885260 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"f5566f62-4d11-4503-ab45-7f2d727bc397","Type":"ContainerStarted","Data":"7800ce09e095035c21f98eeca35d2ef33b8dcac7653772319829401c7500f85b"} Oct 09 14:07:37 crc kubenswrapper[4902]: I1009 14:07:37.893437 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"df90d442-7261-4353-821c-c0e71a43998a","Type":"ContainerStarted","Data":"d739f15159722c942c245da54fc4ad9aa8b4d1ebba3e00ecedafd9a244c5d18a"} Oct 09 14:07:37 crc kubenswrapper[4902]: I1009 14:07:37.907857 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-54f57498cd-cv95r" podStartSLOduration=2.9078342360000002 podStartE2EDuration="2.907834236s" podCreationTimestamp="2025-10-09 14:07:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 14:07:37.891998756 +0000 UTC m=+1005.089857820" watchObservedRunningTime="2025-10-09 14:07:37.907834236 +0000 UTC m=+1005.105693300" Oct 09 14:07:37 crc kubenswrapper[4902]: I1009 14:07:37.929563 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=6.186929013 podStartE2EDuration="7.929541152s" podCreationTimestamp="2025-10-09 14:07:30 +0000 UTC" firstStartedPulling="2025-10-09 14:07:33.279227053 +0000 UTC m=+1000.477086117" lastFinishedPulling="2025-10-09 14:07:35.021839192 +0000 UTC m=+1002.219698256" observedRunningTime="2025-10-09 14:07:37.919625647 +0000 UTC m=+1005.117484721" watchObservedRunningTime="2025-10-09 14:07:37.929541152 +0000 UTC m=+1005.127400216" Oct 09 14:07:37 crc kubenswrapper[4902]: I1009 14:07:37.946015 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e5dd917a-e7aa-4092-906a-321ba2f744d8","Type":"ContainerDied","Data":"b856f9246eaeef6d60fc2d32d67f1be48430848cd1d7f41539f998631cd30752"} Oct 09 14:07:37 crc kubenswrapper[4902]: I1009 14:07:37.946089 4902 scope.go:117] "RemoveContainer" containerID="805fe946ab82c4934f514c6c13a35823ffe8bc4eab416a192f43445682cea526" Oct 09 14:07:37 crc kubenswrapper[4902]: I1009 14:07:37.946306 4902 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 09 14:07:37 crc kubenswrapper[4902]: I1009 14:07:37.977917 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 09 14:07:37 crc kubenswrapper[4902]: I1009 14:07:37.983897 4902 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="e4af9da4-ae7e-4c75-b670-fa5246095383" containerName="cinder-api-log" containerID="cri-o://759c9ee1e98acacc7c1d73448e238e0014a2538c6c0ba993b6a4c8e58b918291" gracePeriod=30 Oct 09 14:07:37 crc kubenswrapper[4902]: I1009 14:07:37.984323 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"e4af9da4-ae7e-4c75-b670-fa5246095383","Type":"ContainerStarted","Data":"1df73bdb5e8748d89362fbe367c5704ba81921b8a7fd89963381567000dd33f9"} Oct 09 14:07:37 crc kubenswrapper[4902]: I1009 14:07:37.984985 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Oct 09 14:07:37 crc kubenswrapper[4902]: I1009 14:07:37.985292 4902 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="e4af9da4-ae7e-4c75-b670-fa5246095383" containerName="cinder-api" containerID="cri-o://1df73bdb5e8748d89362fbe367c5704ba81921b8a7fd89963381567000dd33f9" gracePeriod=30 Oct 09 14:07:37 crc kubenswrapper[4902]: I1009 14:07:37.989230 4902 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 09 14:07:38 crc kubenswrapper[4902]: I1009 14:07:38.015606 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 09 14:07:38 crc kubenswrapper[4902]: E1009 14:07:38.016131 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9bdf36f9-3d35-4ccd-9d32-86c5ab08b65a" containerName="mariadb-database-create" Oct 09 14:07:38 crc kubenswrapper[4902]: I1009 14:07:38.016155 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="9bdf36f9-3d35-4ccd-9d32-86c5ab08b65a" containerName="mariadb-database-create" Oct 09 14:07:38 crc kubenswrapper[4902]: E1009 14:07:38.016181 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5dd917a-e7aa-4092-906a-321ba2f744d8" containerName="ceilometer-central-agent" Oct 09 14:07:38 crc kubenswrapper[4902]: I1009 14:07:38.016192 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5dd917a-e7aa-4092-906a-321ba2f744d8" containerName="ceilometer-central-agent" Oct 09 14:07:38 crc kubenswrapper[4902]: E1009 14:07:38.016214 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5dd917a-e7aa-4092-906a-321ba2f744d8" containerName="proxy-httpd" Oct 09 14:07:38 crc kubenswrapper[4902]: I1009 14:07:38.016222 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5dd917a-e7aa-4092-906a-321ba2f744d8" containerName="proxy-httpd" Oct 09 14:07:38 crc kubenswrapper[4902]: E1009 14:07:38.016244 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5dd917a-e7aa-4092-906a-321ba2f744d8" containerName="sg-core" Oct 09 14:07:38 crc kubenswrapper[4902]: I1009 14:07:38.016253 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5dd917a-e7aa-4092-906a-321ba2f744d8" containerName="sg-core" Oct 09 14:07:38 crc kubenswrapper[4902]: E1009 14:07:38.016269 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5dd917a-e7aa-4092-906a-321ba2f744d8" containerName="ceilometer-notification-agent" Oct 09 14:07:38 crc kubenswrapper[4902]: I1009 14:07:38.016277 4902 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="e5dd917a-e7aa-4092-906a-321ba2f744d8" containerName="ceilometer-notification-agent" Oct 09 14:07:38 crc kubenswrapper[4902]: I1009 14:07:38.016548 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5dd917a-e7aa-4092-906a-321ba2f744d8" containerName="ceilometer-central-agent" Oct 09 14:07:38 crc kubenswrapper[4902]: I1009 14:07:38.016570 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="9bdf36f9-3d35-4ccd-9d32-86c5ab08b65a" containerName="mariadb-database-create" Oct 09 14:07:38 crc kubenswrapper[4902]: I1009 14:07:38.016581 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5dd917a-e7aa-4092-906a-321ba2f744d8" containerName="proxy-httpd" Oct 09 14:07:38 crc kubenswrapper[4902]: I1009 14:07:38.016605 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5dd917a-e7aa-4092-906a-321ba2f744d8" containerName="ceilometer-notification-agent" Oct 09 14:07:38 crc kubenswrapper[4902]: I1009 14:07:38.016624 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5dd917a-e7aa-4092-906a-321ba2f744d8" containerName="sg-core" Oct 09 14:07:38 crc kubenswrapper[4902]: I1009 14:07:38.027229 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 09 14:07:38 crc kubenswrapper[4902]: I1009 14:07:38.033048 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 09 14:07:38 crc kubenswrapper[4902]: I1009 14:07:38.034251 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 09 14:07:38 crc kubenswrapper[4902]: I1009 14:07:38.036620 4902 scope.go:117] "RemoveContainer" containerID="391c2456b125ea67f4ca26def719bb38bd937cdbbdeb68f166ae1932db02db95" Oct 09 14:07:38 crc kubenswrapper[4902]: I1009 14:07:38.041301 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=8.041273793 podStartE2EDuration="8.041273793s" podCreationTimestamp="2025-10-09 14:07:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 14:07:38.012351213 +0000 UTC m=+1005.210210287" watchObservedRunningTime="2025-10-09 14:07:38.041273793 +0000 UTC m=+1005.239132877" Oct 09 14:07:38 crc kubenswrapper[4902]: I1009 14:07:38.067690 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 09 14:07:38 crc kubenswrapper[4902]: I1009 14:07:38.095623 4902 scope.go:117] "RemoveContainer" containerID="e7f24a4e051a495560a0dd23e83b7fb1fd844bfdb6b9c51f6dde90c1ae50d2e4" Oct 09 14:07:38 crc kubenswrapper[4902]: I1009 14:07:38.102606 4902 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/neutron-5cf9b8867d-9zfjl" Oct 09 14:07:38 crc kubenswrapper[4902]: I1009 14:07:38.103502 4902 scope.go:117] "RemoveContainer" containerID="c11aa6b7c854c6dfeded0c99e3ae32e4ebb1287b085e2864f0a59062d0214d9e" Oct 09 14:07:38 crc kubenswrapper[4902]: E1009 14:07:38.103744 4902 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"neutron-httpd\" with CrashLoopBackOff: \"back-off 20s restarting failed container=neutron-httpd pod=neutron-5cf9b8867d-9zfjl_openstack(f5eb5ddd-7b3d-4392-9555-44eaf6e54c51)\"" pod="openstack/neutron-5cf9b8867d-9zfjl" podUID="f5eb5ddd-7b3d-4392-9555-44eaf6e54c51" Oct 09 14:07:38 crc kubenswrapper[4902]: I1009 14:07:38.104771 4902 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-5cf9b8867d-9zfjl" Oct 09 14:07:38 crc kubenswrapper[4902]: I1009 14:07:38.119309 4902 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/neutron-5cf9b8867d-9zfjl" podUID="f5eb5ddd-7b3d-4392-9555-44eaf6e54c51" containerName="neutron-api" probeResult="failure" output="Get \"http://10.217.0.157:9696/\": dial tcp 10.217.0.157:9696: connect: connection refused" Oct 09 14:07:38 crc kubenswrapper[4902]: I1009 14:07:38.144716 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5a579122-9bea-47ee-aeb5-8e08177b4f70-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5a579122-9bea-47ee-aeb5-8e08177b4f70\") " pod="openstack/ceilometer-0" Oct 09 14:07:38 crc kubenswrapper[4902]: I1009 14:07:38.144759 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a579122-9bea-47ee-aeb5-8e08177b4f70-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5a579122-9bea-47ee-aeb5-8e08177b4f70\") " pod="openstack/ceilometer-0" Oct 09 14:07:38 crc kubenswrapper[4902]: I1009 14:07:38.144844 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gpqwf\" (UniqueName: \"kubernetes.io/projected/5a579122-9bea-47ee-aeb5-8e08177b4f70-kube-api-access-gpqwf\") pod \"ceilometer-0\" (UID: \"5a579122-9bea-47ee-aeb5-8e08177b4f70\") " pod="openstack/ceilometer-0" Oct 09 14:07:38 crc kubenswrapper[4902]: I1009 14:07:38.144881 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5a579122-9bea-47ee-aeb5-8e08177b4f70-scripts\") pod \"ceilometer-0\" (UID: \"5a579122-9bea-47ee-aeb5-8e08177b4f70\") " pod="openstack/ceilometer-0" Oct 09 14:07:38 crc kubenswrapper[4902]: I1009 14:07:38.145001 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5a579122-9bea-47ee-aeb5-8e08177b4f70-run-httpd\") pod \"ceilometer-0\" (UID: \"5a579122-9bea-47ee-aeb5-8e08177b4f70\") " pod="openstack/ceilometer-0" Oct 09 14:07:38 crc kubenswrapper[4902]: I1009 14:07:38.145055 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a579122-9bea-47ee-aeb5-8e08177b4f70-config-data\") pod \"ceilometer-0\" (UID: \"5a579122-9bea-47ee-aeb5-8e08177b4f70\") " pod="openstack/ceilometer-0" Oct 09 14:07:38 crc kubenswrapper[4902]: I1009 14:07:38.145089 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5a579122-9bea-47ee-aeb5-8e08177b4f70-log-httpd\") pod \"ceilometer-0\" (UID: \"5a579122-9bea-47ee-aeb5-8e08177b4f70\") " pod="openstack/ceilometer-0" Oct 09 14:07:38 crc kubenswrapper[4902]: I1009 14:07:38.172621 4902 scope.go:117] "RemoveContainer" containerID="f2f5c2e8f29e84af1b4fb2b3129fa51f289f4e3d68c5af374be43fbcabf6c96f" Oct 09 14:07:38 crc kubenswrapper[4902]: I1009 14:07:38.250191 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5a579122-9bea-47ee-aeb5-8e08177b4f70-run-httpd\") pod \"ceilometer-0\" (UID: 
\"5a579122-9bea-47ee-aeb5-8e08177b4f70\") " pod="openstack/ceilometer-0" Oct 09 14:07:38 crc kubenswrapper[4902]: I1009 14:07:38.250283 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a579122-9bea-47ee-aeb5-8e08177b4f70-config-data\") pod \"ceilometer-0\" (UID: \"5a579122-9bea-47ee-aeb5-8e08177b4f70\") " pod="openstack/ceilometer-0" Oct 09 14:07:38 crc kubenswrapper[4902]: I1009 14:07:38.250313 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5a579122-9bea-47ee-aeb5-8e08177b4f70-log-httpd\") pod \"ceilometer-0\" (UID: \"5a579122-9bea-47ee-aeb5-8e08177b4f70\") " pod="openstack/ceilometer-0" Oct 09 14:07:38 crc kubenswrapper[4902]: I1009 14:07:38.250378 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5a579122-9bea-47ee-aeb5-8e08177b4f70-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5a579122-9bea-47ee-aeb5-8e08177b4f70\") " pod="openstack/ceilometer-0" Oct 09 14:07:38 crc kubenswrapper[4902]: I1009 14:07:38.250401 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a579122-9bea-47ee-aeb5-8e08177b4f70-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5a579122-9bea-47ee-aeb5-8e08177b4f70\") " pod="openstack/ceilometer-0" Oct 09 14:07:38 crc kubenswrapper[4902]: I1009 14:07:38.250489 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gpqwf\" (UniqueName: \"kubernetes.io/projected/5a579122-9bea-47ee-aeb5-8e08177b4f70-kube-api-access-gpqwf\") pod \"ceilometer-0\" (UID: \"5a579122-9bea-47ee-aeb5-8e08177b4f70\") " pod="openstack/ceilometer-0" Oct 09 14:07:38 crc kubenswrapper[4902]: I1009 14:07:38.250527 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5a579122-9bea-47ee-aeb5-8e08177b4f70-scripts\") pod \"ceilometer-0\" (UID: \"5a579122-9bea-47ee-aeb5-8e08177b4f70\") " pod="openstack/ceilometer-0" Oct 09 14:07:38 crc kubenswrapper[4902]: I1009 14:07:38.251813 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5a579122-9bea-47ee-aeb5-8e08177b4f70-log-httpd\") pod \"ceilometer-0\" (UID: \"5a579122-9bea-47ee-aeb5-8e08177b4f70\") " pod="openstack/ceilometer-0" Oct 09 14:07:38 crc kubenswrapper[4902]: I1009 14:07:38.252030 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5a579122-9bea-47ee-aeb5-8e08177b4f70-run-httpd\") pod \"ceilometer-0\" (UID: \"5a579122-9bea-47ee-aeb5-8e08177b4f70\") " pod="openstack/ceilometer-0" Oct 09 14:07:38 crc kubenswrapper[4902]: I1009 14:07:38.263243 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a579122-9bea-47ee-aeb5-8e08177b4f70-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5a579122-9bea-47ee-aeb5-8e08177b4f70\") " pod="openstack/ceilometer-0" Oct 09 14:07:38 crc kubenswrapper[4902]: I1009 14:07:38.263384 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5a579122-9bea-47ee-aeb5-8e08177b4f70-scripts\") pod \"ceilometer-0\" (UID: \"5a579122-9bea-47ee-aeb5-8e08177b4f70\") " 
pod="openstack/ceilometer-0" Oct 09 14:07:38 crc kubenswrapper[4902]: I1009 14:07:38.268384 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a579122-9bea-47ee-aeb5-8e08177b4f70-config-data\") pod \"ceilometer-0\" (UID: \"5a579122-9bea-47ee-aeb5-8e08177b4f70\") " pod="openstack/ceilometer-0" Oct 09 14:07:38 crc kubenswrapper[4902]: I1009 14:07:38.275921 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5a579122-9bea-47ee-aeb5-8e08177b4f70-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5a579122-9bea-47ee-aeb5-8e08177b4f70\") " pod="openstack/ceilometer-0" Oct 09 14:07:38 crc kubenswrapper[4902]: I1009 14:07:38.289096 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gpqwf\" (UniqueName: \"kubernetes.io/projected/5a579122-9bea-47ee-aeb5-8e08177b4f70-kube-api-access-gpqwf\") pod \"ceilometer-0\" (UID: \"5a579122-9bea-47ee-aeb5-8e08177b4f70\") " pod="openstack/ceilometer-0" Oct 09 14:07:38 crc kubenswrapper[4902]: I1009 14:07:38.384817 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 09 14:07:38 crc kubenswrapper[4902]: I1009 14:07:38.973209 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 09 14:07:38 crc kubenswrapper[4902]: W1009 14:07:38.980374 4902 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5a579122_9bea_47ee_aeb5_8e08177b4f70.slice/crio-9e9e224dbd8df3257939dfb37599673cfa42a6781a1f4d513fce45fda81c8e58 WatchSource:0}: Error finding container 9e9e224dbd8df3257939dfb37599673cfa42a6781a1f4d513fce45fda81c8e58: Status 404 returned error can't find the container with id 9e9e224dbd8df3257939dfb37599673cfa42a6781a1f4d513fce45fda81c8e58 Oct 09 14:07:39 crc kubenswrapper[4902]: I1009 14:07:39.059299 4902 generic.go:334] "Generic (PLEG): container finished" podID="e4af9da4-ae7e-4c75-b670-fa5246095383" containerID="1df73bdb5e8748d89362fbe367c5704ba81921b8a7fd89963381567000dd33f9" exitCode=0 Oct 09 14:07:39 crc kubenswrapper[4902]: I1009 14:07:39.059332 4902 generic.go:334] "Generic (PLEG): container finished" podID="e4af9da4-ae7e-4c75-b670-fa5246095383" containerID="759c9ee1e98acacc7c1d73448e238e0014a2538c6c0ba993b6a4c8e58b918291" exitCode=143 Oct 09 14:07:39 crc kubenswrapper[4902]: I1009 14:07:39.059400 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"e4af9da4-ae7e-4c75-b670-fa5246095383","Type":"ContainerDied","Data":"1df73bdb5e8748d89362fbe367c5704ba81921b8a7fd89963381567000dd33f9"} Oct 09 14:07:39 crc kubenswrapper[4902]: I1009 14:07:39.059439 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"e4af9da4-ae7e-4c75-b670-fa5246095383","Type":"ContainerDied","Data":"759c9ee1e98acacc7c1d73448e238e0014a2538c6c0ba993b6a4c8e58b918291"} Oct 09 14:07:39 crc kubenswrapper[4902]: I1009 14:07:39.061952 4902 util.go:48] "No ready sandbox for pod can be found. 
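The two "container finished" entries above follow the earlier "Killing container with a grace period ... gracePeriod=30" lines for cinder-api-0: the cinder-api container exited cleanly (exitCode=0), while cinder-api-log was terminated by the SIGTERM sent at the start of the grace period, which surfaces as exitCode=143 (128+15). A small Go sketch of that exit-code convention:

    package main

    import (
        "fmt"
        "syscall"
    )

    // explainExitCode maps a container exit code to a rough explanation.
    // Codes above 128 conventionally mean "killed by signal (code-128)", so the
    // exitCode=143 logged for cinder-api-log above is 128+15, i.e. SIGTERM from
    // the 30s graceful shutdown; exitCode=0 is a clean exit.
    func explainExitCode(code int) string {
        switch {
        case code == 0:
            return "exited cleanly"
        case code > 128:
            return fmt.Sprintf("killed by signal %d (%s)", code-128, syscall.Signal(code-128))
        default:
            return fmt.Sprintf("exited with error code %d", code)
        }
    }

    func main() {
        for _, c := range []int{0, 143} {
            fmt.Printf("exit code %d: %s\n", c, explainExitCode(c))
        }
    }
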
Need to start a new one" pod="openstack/cinder-api-0" Oct 09 14:07:39 crc kubenswrapper[4902]: I1009 14:07:39.073567 4902 scope.go:117] "RemoveContainer" containerID="c11aa6b7c854c6dfeded0c99e3ae32e4ebb1287b085e2864f0a59062d0214d9e" Oct 09 14:07:39 crc kubenswrapper[4902]: E1009 14:07:39.073823 4902 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"neutron-httpd\" with CrashLoopBackOff: \"back-off 20s restarting failed container=neutron-httpd pod=neutron-5cf9b8867d-9zfjl_openstack(f5eb5ddd-7b3d-4392-9555-44eaf6e54c51)\"" pod="openstack/neutron-5cf9b8867d-9zfjl" podUID="f5eb5ddd-7b3d-4392-9555-44eaf6e54c51" Oct 09 14:07:39 crc kubenswrapper[4902]: I1009 14:07:39.075889 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"df90d442-7261-4353-821c-c0e71a43998a","Type":"ContainerStarted","Data":"59c5622c333cecda6486d27003340635602dedd133719d449b4b3de610da55a5"} Oct 09 14:07:39 crc kubenswrapper[4902]: I1009 14:07:39.138223 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=5.138198609 podStartE2EDuration="5.138198609s" podCreationTimestamp="2025-10-09 14:07:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 14:07:39.108005031 +0000 UTC m=+1006.305864105" watchObservedRunningTime="2025-10-09 14:07:39.138198609 +0000 UTC m=+1006.336057673" Oct 09 14:07:39 crc kubenswrapper[4902]: I1009 14:07:39.266795 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4af9da4-ae7e-4c75-b670-fa5246095383-config-data\") pod \"e4af9da4-ae7e-4c75-b670-fa5246095383\" (UID: \"e4af9da4-ae7e-4c75-b670-fa5246095383\") " Oct 09 14:07:39 crc kubenswrapper[4902]: I1009 14:07:39.267175 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e4af9da4-ae7e-4c75-b670-fa5246095383-etc-machine-id\") pod \"e4af9da4-ae7e-4c75-b670-fa5246095383\" (UID: \"e4af9da4-ae7e-4c75-b670-fa5246095383\") " Oct 09 14:07:39 crc kubenswrapper[4902]: I1009 14:07:39.267225 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-czstb\" (UniqueName: \"kubernetes.io/projected/e4af9da4-ae7e-4c75-b670-fa5246095383-kube-api-access-czstb\") pod \"e4af9da4-ae7e-4c75-b670-fa5246095383\" (UID: \"e4af9da4-ae7e-4c75-b670-fa5246095383\") " Oct 09 14:07:39 crc kubenswrapper[4902]: I1009 14:07:39.267261 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e4af9da4-ae7e-4c75-b670-fa5246095383-config-data-custom\") pod \"e4af9da4-ae7e-4c75-b670-fa5246095383\" (UID: \"e4af9da4-ae7e-4c75-b670-fa5246095383\") " Oct 09 14:07:39 crc kubenswrapper[4902]: I1009 14:07:39.267309 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e4af9da4-ae7e-4c75-b670-fa5246095383-logs\") pod \"e4af9da4-ae7e-4c75-b670-fa5246095383\" (UID: \"e4af9da4-ae7e-4c75-b670-fa5246095383\") " Oct 09 14:07:39 crc kubenswrapper[4902]: I1009 14:07:39.267378 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/e4af9da4-ae7e-4c75-b670-fa5246095383-combined-ca-bundle\") pod \"e4af9da4-ae7e-4c75-b670-fa5246095383\" (UID: \"e4af9da4-ae7e-4c75-b670-fa5246095383\") " Oct 09 14:07:39 crc kubenswrapper[4902]: I1009 14:07:39.267445 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e4af9da4-ae7e-4c75-b670-fa5246095383-scripts\") pod \"e4af9da4-ae7e-4c75-b670-fa5246095383\" (UID: \"e4af9da4-ae7e-4c75-b670-fa5246095383\") " Oct 09 14:07:39 crc kubenswrapper[4902]: I1009 14:07:39.267849 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e4af9da4-ae7e-4c75-b670-fa5246095383-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "e4af9da4-ae7e-4c75-b670-fa5246095383" (UID: "e4af9da4-ae7e-4c75-b670-fa5246095383"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 09 14:07:39 crc kubenswrapper[4902]: I1009 14:07:39.267940 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e4af9da4-ae7e-4c75-b670-fa5246095383-logs" (OuterVolumeSpecName: "logs") pod "e4af9da4-ae7e-4c75-b670-fa5246095383" (UID: "e4af9da4-ae7e-4c75-b670-fa5246095383"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 14:07:39 crc kubenswrapper[4902]: I1009 14:07:39.268552 4902 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e4af9da4-ae7e-4c75-b670-fa5246095383-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 09 14:07:39 crc kubenswrapper[4902]: I1009 14:07:39.268570 4902 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e4af9da4-ae7e-4c75-b670-fa5246095383-logs\") on node \"crc\" DevicePath \"\"" Oct 09 14:07:39 crc kubenswrapper[4902]: I1009 14:07:39.275497 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4af9da4-ae7e-4c75-b670-fa5246095383-scripts" (OuterVolumeSpecName: "scripts") pod "e4af9da4-ae7e-4c75-b670-fa5246095383" (UID: "e4af9da4-ae7e-4c75-b670-fa5246095383"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:07:39 crc kubenswrapper[4902]: I1009 14:07:39.275977 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e4af9da4-ae7e-4c75-b670-fa5246095383-kube-api-access-czstb" (OuterVolumeSpecName: "kube-api-access-czstb") pod "e4af9da4-ae7e-4c75-b670-fa5246095383" (UID: "e4af9da4-ae7e-4c75-b670-fa5246095383"). InnerVolumeSpecName "kube-api-access-czstb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 14:07:39 crc kubenswrapper[4902]: I1009 14:07:39.278706 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4af9da4-ae7e-4c75-b670-fa5246095383-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "e4af9da4-ae7e-4c75-b670-fa5246095383" (UID: "e4af9da4-ae7e-4c75-b670-fa5246095383"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:07:39 crc kubenswrapper[4902]: I1009 14:07:39.315607 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4af9da4-ae7e-4c75-b670-fa5246095383-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e4af9da4-ae7e-4c75-b670-fa5246095383" (UID: "e4af9da4-ae7e-4c75-b670-fa5246095383"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:07:39 crc kubenswrapper[4902]: I1009 14:07:39.368155 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4af9da4-ae7e-4c75-b670-fa5246095383-config-data" (OuterVolumeSpecName: "config-data") pod "e4af9da4-ae7e-4c75-b670-fa5246095383" (UID: "e4af9da4-ae7e-4c75-b670-fa5246095383"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:07:39 crc kubenswrapper[4902]: I1009 14:07:39.371641 4902 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4af9da4-ae7e-4c75-b670-fa5246095383-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 14:07:39 crc kubenswrapper[4902]: I1009 14:07:39.371680 4902 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e4af9da4-ae7e-4c75-b670-fa5246095383-scripts\") on node \"crc\" DevicePath \"\"" Oct 09 14:07:39 crc kubenswrapper[4902]: I1009 14:07:39.371689 4902 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4af9da4-ae7e-4c75-b670-fa5246095383-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 14:07:39 crc kubenswrapper[4902]: I1009 14:07:39.371697 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-czstb\" (UniqueName: \"kubernetes.io/projected/e4af9da4-ae7e-4c75-b670-fa5246095383-kube-api-access-czstb\") on node \"crc\" DevicePath \"\"" Oct 09 14:07:39 crc kubenswrapper[4902]: I1009 14:07:39.371709 4902 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e4af9da4-ae7e-4c75-b670-fa5246095383-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 09 14:07:39 crc kubenswrapper[4902]: I1009 14:07:39.525977 4902 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e5dd917a-e7aa-4092-906a-321ba2f744d8" path="/var/lib/kubelet/pods/e5dd917a-e7aa-4092-906a-321ba2f744d8/volumes" Oct 09 14:07:39 crc kubenswrapper[4902]: I1009 14:07:39.566465 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-864577bbd8-z8v7t" Oct 09 14:07:39 crc kubenswrapper[4902]: I1009 14:07:39.840126 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-65878bc9b7-hv97v" Oct 09 14:07:39 crc kubenswrapper[4902]: I1009 14:07:39.911009 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-864577bbd8-z8v7t" Oct 09 14:07:39 crc kubenswrapper[4902]: I1009 14:07:39.928902 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5cf9b8867d-9zfjl"] Oct 09 14:07:40 crc kubenswrapper[4902]: I1009 14:07:40.101462 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"e4af9da4-ae7e-4c75-b670-fa5246095383","Type":"ContainerDied","Data":"81bafe6d7767acd9273cc7a00737bd0d96e423efbceb24222339d84f7d9724d2"} Oct 09 14:07:40 crc kubenswrapper[4902]: 
I1009 14:07:40.101515 4902 scope.go:117] "RemoveContainer" containerID="1df73bdb5e8748d89362fbe367c5704ba81921b8a7fd89963381567000dd33f9" Oct 09 14:07:40 crc kubenswrapper[4902]: I1009 14:07:40.101642 4902 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 09 14:07:40 crc kubenswrapper[4902]: I1009 14:07:40.117208 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-b741-account-create-22xhv"] Oct 09 14:07:40 crc kubenswrapper[4902]: E1009 14:07:40.117937 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4af9da4-ae7e-4c75-b670-fa5246095383" containerName="cinder-api" Oct 09 14:07:40 crc kubenswrapper[4902]: I1009 14:07:40.117974 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4af9da4-ae7e-4c75-b670-fa5246095383" containerName="cinder-api" Oct 09 14:07:40 crc kubenswrapper[4902]: E1009 14:07:40.117999 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4af9da4-ae7e-4c75-b670-fa5246095383" containerName="cinder-api-log" Oct 09 14:07:40 crc kubenswrapper[4902]: I1009 14:07:40.118004 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4af9da4-ae7e-4c75-b670-fa5246095383" containerName="cinder-api-log" Oct 09 14:07:40 crc kubenswrapper[4902]: I1009 14:07:40.118166 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4af9da4-ae7e-4c75-b670-fa5246095383" containerName="cinder-api-log" Oct 09 14:07:40 crc kubenswrapper[4902]: I1009 14:07:40.118184 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4af9da4-ae7e-4c75-b670-fa5246095383" containerName="cinder-api" Oct 09 14:07:40 crc kubenswrapper[4902]: I1009 14:07:40.119268 4902 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-5cf9b8867d-9zfjl" podUID="f5eb5ddd-7b3d-4392-9555-44eaf6e54c51" containerName="neutron-api" containerID="cri-o://764530cdc309d34f5e4d9f50471c07cfdf1e04fc5e535bcdd92eedabca38c008" gracePeriod=30 Oct 09 14:07:40 crc kubenswrapper[4902]: I1009 14:07:40.120201 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5a579122-9bea-47ee-aeb5-8e08177b4f70","Type":"ContainerStarted","Data":"28ec4b4c503df200a179d5572696eef453ed1285e4b9c09029803be6f285ab36"} Oct 09 14:07:40 crc kubenswrapper[4902]: I1009 14:07:40.120228 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5a579122-9bea-47ee-aeb5-8e08177b4f70","Type":"ContainerStarted","Data":"9e9e224dbd8df3257939dfb37599673cfa42a6781a1f4d513fce45fda81c8e58"} Oct 09 14:07:40 crc kubenswrapper[4902]: I1009 14:07:40.120290 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-b741-account-create-22xhv" Oct 09 14:07:40 crc kubenswrapper[4902]: I1009 14:07:40.122245 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Oct 09 14:07:40 crc kubenswrapper[4902]: I1009 14:07:40.132330 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-b741-account-create-22xhv"] Oct 09 14:07:40 crc kubenswrapper[4902]: I1009 14:07:40.168941 4902 scope.go:117] "RemoveContainer" containerID="759c9ee1e98acacc7c1d73448e238e0014a2538c6c0ba993b6a4c8e58b918291" Oct 09 14:07:40 crc kubenswrapper[4902]: I1009 14:07:40.188185 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Oct 09 14:07:40 crc kubenswrapper[4902]: I1009 14:07:40.222120 4902 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Oct 09 14:07:40 crc kubenswrapper[4902]: I1009 14:07:40.270769 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Oct 09 14:07:40 crc kubenswrapper[4902]: I1009 14:07:40.272639 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 09 14:07:40 crc kubenswrapper[4902]: I1009 14:07:40.274834 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Oct 09 14:07:40 crc kubenswrapper[4902]: I1009 14:07:40.275543 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Oct 09 14:07:40 crc kubenswrapper[4902]: I1009 14:07:40.275604 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Oct 09 14:07:40 crc kubenswrapper[4902]: I1009 14:07:40.288876 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 09 14:07:40 crc kubenswrapper[4902]: I1009 14:07:40.293499 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-25zkz\" (UniqueName: \"kubernetes.io/projected/21db4847-19e9-4e2e-b745-319609bc39e7-kube-api-access-25zkz\") pod \"nova-api-b741-account-create-22xhv\" (UID: \"21db4847-19e9-4e2e-b745-319609bc39e7\") " pod="openstack/nova-api-b741-account-create-22xhv" Oct 09 14:07:40 crc kubenswrapper[4902]: I1009 14:07:40.343926 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-b8a5-account-create-6m6lx"] Oct 09 14:07:40 crc kubenswrapper[4902]: I1009 14:07:40.346329 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-b8a5-account-create-6m6lx" Oct 09 14:07:40 crc kubenswrapper[4902]: I1009 14:07:40.353233 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Oct 09 14:07:40 crc kubenswrapper[4902]: I1009 14:07:40.368611 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-b8a5-account-create-6m6lx"] Oct 09 14:07:40 crc kubenswrapper[4902]: I1009 14:07:40.399309 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tf8vl\" (UniqueName: \"kubernetes.io/projected/d7c2affc-5952-43d0-8629-8e61961bdf1c-kube-api-access-tf8vl\") pod \"cinder-api-0\" (UID: \"d7c2affc-5952-43d0-8629-8e61961bdf1c\") " pod="openstack/cinder-api-0" Oct 09 14:07:40 crc kubenswrapper[4902]: I1009 14:07:40.399470 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7c2affc-5952-43d0-8629-8e61961bdf1c-config-data\") pod \"cinder-api-0\" (UID: \"d7c2affc-5952-43d0-8629-8e61961bdf1c\") " pod="openstack/cinder-api-0" Oct 09 14:07:40 crc kubenswrapper[4902]: I1009 14:07:40.399670 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d7c2affc-5952-43d0-8629-8e61961bdf1c-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"d7c2affc-5952-43d0-8629-8e61961bdf1c\") " pod="openstack/cinder-api-0" Oct 09 14:07:40 crc kubenswrapper[4902]: I1009 14:07:40.399712 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-25zkz\" (UniqueName: \"kubernetes.io/projected/21db4847-19e9-4e2e-b745-319609bc39e7-kube-api-access-25zkz\") pod \"nova-api-b741-account-create-22xhv\" (UID: \"21db4847-19e9-4e2e-b745-319609bc39e7\") " pod="openstack/nova-api-b741-account-create-22xhv" Oct 09 14:07:40 crc kubenswrapper[4902]: I1009 14:07:40.399764 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d7c2affc-5952-43d0-8629-8e61961bdf1c-public-tls-certs\") pod \"cinder-api-0\" (UID: \"d7c2affc-5952-43d0-8629-8e61961bdf1c\") " pod="openstack/cinder-api-0" Oct 09 14:07:40 crc kubenswrapper[4902]: I1009 14:07:40.399821 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d7c2affc-5952-43d0-8629-8e61961bdf1c-scripts\") pod \"cinder-api-0\" (UID: \"d7c2affc-5952-43d0-8629-8e61961bdf1c\") " pod="openstack/cinder-api-0" Oct 09 14:07:40 crc kubenswrapper[4902]: I1009 14:07:40.399869 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d7c2affc-5952-43d0-8629-8e61961bdf1c-config-data-custom\") pod \"cinder-api-0\" (UID: \"d7c2affc-5952-43d0-8629-8e61961bdf1c\") " pod="openstack/cinder-api-0" Oct 09 14:07:40 crc kubenswrapper[4902]: I1009 14:07:40.400028 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d7c2affc-5952-43d0-8629-8e61961bdf1c-logs\") pod \"cinder-api-0\" (UID: \"d7c2affc-5952-43d0-8629-8e61961bdf1c\") " pod="openstack/cinder-api-0" Oct 09 14:07:40 crc kubenswrapper[4902]: I1009 14:07:40.400105 4902 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7c2affc-5952-43d0-8629-8e61961bdf1c-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"d7c2affc-5952-43d0-8629-8e61961bdf1c\") " pod="openstack/cinder-api-0" Oct 09 14:07:40 crc kubenswrapper[4902]: I1009 14:07:40.400178 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d7c2affc-5952-43d0-8629-8e61961bdf1c-etc-machine-id\") pod \"cinder-api-0\" (UID: \"d7c2affc-5952-43d0-8629-8e61961bdf1c\") " pod="openstack/cinder-api-0" Oct 09 14:07:40 crc kubenswrapper[4902]: I1009 14:07:40.438306 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-25zkz\" (UniqueName: \"kubernetes.io/projected/21db4847-19e9-4e2e-b745-319609bc39e7-kube-api-access-25zkz\") pod \"nova-api-b741-account-create-22xhv\" (UID: \"21db4847-19e9-4e2e-b745-319609bc39e7\") " pod="openstack/nova-api-b741-account-create-22xhv" Oct 09 14:07:40 crc kubenswrapper[4902]: I1009 14:07:40.447082 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-b741-account-create-22xhv" Oct 09 14:07:40 crc kubenswrapper[4902]: I1009 14:07:40.502528 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7c2affc-5952-43d0-8629-8e61961bdf1c-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"d7c2affc-5952-43d0-8629-8e61961bdf1c\") " pod="openstack/cinder-api-0" Oct 09 14:07:40 crc kubenswrapper[4902]: I1009 14:07:40.502584 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d7c2affc-5952-43d0-8629-8e61961bdf1c-etc-machine-id\") pod \"cinder-api-0\" (UID: \"d7c2affc-5952-43d0-8629-8e61961bdf1c\") " pod="openstack/cinder-api-0" Oct 09 14:07:40 crc kubenswrapper[4902]: I1009 14:07:40.502638 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tf8vl\" (UniqueName: \"kubernetes.io/projected/d7c2affc-5952-43d0-8629-8e61961bdf1c-kube-api-access-tf8vl\") pod \"cinder-api-0\" (UID: \"d7c2affc-5952-43d0-8629-8e61961bdf1c\") " pod="openstack/cinder-api-0" Oct 09 14:07:40 crc kubenswrapper[4902]: I1009 14:07:40.502667 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7c2affc-5952-43d0-8629-8e61961bdf1c-config-data\") pod \"cinder-api-0\" (UID: \"d7c2affc-5952-43d0-8629-8e61961bdf1c\") " pod="openstack/cinder-api-0" Oct 09 14:07:40 crc kubenswrapper[4902]: I1009 14:07:40.502705 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4x9z8\" (UniqueName: \"kubernetes.io/projected/d22a652a-afff-4bc3-ad24-30d9abfa577f-kube-api-access-4x9z8\") pod \"nova-cell0-b8a5-account-create-6m6lx\" (UID: \"d22a652a-afff-4bc3-ad24-30d9abfa577f\") " pod="openstack/nova-cell0-b8a5-account-create-6m6lx" Oct 09 14:07:40 crc kubenswrapper[4902]: I1009 14:07:40.502750 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d7c2affc-5952-43d0-8629-8e61961bdf1c-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"d7c2affc-5952-43d0-8629-8e61961bdf1c\") " pod="openstack/cinder-api-0" Oct 09 14:07:40 crc kubenswrapper[4902]: I1009 
14:07:40.502780 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d7c2affc-5952-43d0-8629-8e61961bdf1c-public-tls-certs\") pod \"cinder-api-0\" (UID: \"d7c2affc-5952-43d0-8629-8e61961bdf1c\") " pod="openstack/cinder-api-0" Oct 09 14:07:40 crc kubenswrapper[4902]: I1009 14:07:40.502803 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d7c2affc-5952-43d0-8629-8e61961bdf1c-scripts\") pod \"cinder-api-0\" (UID: \"d7c2affc-5952-43d0-8629-8e61961bdf1c\") " pod="openstack/cinder-api-0" Oct 09 14:07:40 crc kubenswrapper[4902]: I1009 14:07:40.502827 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d7c2affc-5952-43d0-8629-8e61961bdf1c-config-data-custom\") pod \"cinder-api-0\" (UID: \"d7c2affc-5952-43d0-8629-8e61961bdf1c\") " pod="openstack/cinder-api-0" Oct 09 14:07:40 crc kubenswrapper[4902]: I1009 14:07:40.502862 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d7c2affc-5952-43d0-8629-8e61961bdf1c-logs\") pod \"cinder-api-0\" (UID: \"d7c2affc-5952-43d0-8629-8e61961bdf1c\") " pod="openstack/cinder-api-0" Oct 09 14:07:40 crc kubenswrapper[4902]: I1009 14:07:40.503251 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d7c2affc-5952-43d0-8629-8e61961bdf1c-logs\") pod \"cinder-api-0\" (UID: \"d7c2affc-5952-43d0-8629-8e61961bdf1c\") " pod="openstack/cinder-api-0" Oct 09 14:07:40 crc kubenswrapper[4902]: I1009 14:07:40.503681 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d7c2affc-5952-43d0-8629-8e61961bdf1c-etc-machine-id\") pod \"cinder-api-0\" (UID: \"d7c2affc-5952-43d0-8629-8e61961bdf1c\") " pod="openstack/cinder-api-0" Oct 09 14:07:40 crc kubenswrapper[4902]: I1009 14:07:40.511865 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7c2affc-5952-43d0-8629-8e61961bdf1c-config-data\") pod \"cinder-api-0\" (UID: \"d7c2affc-5952-43d0-8629-8e61961bdf1c\") " pod="openstack/cinder-api-0" Oct 09 14:07:40 crc kubenswrapper[4902]: I1009 14:07:40.512111 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7c2affc-5952-43d0-8629-8e61961bdf1c-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"d7c2affc-5952-43d0-8629-8e61961bdf1c\") " pod="openstack/cinder-api-0" Oct 09 14:07:40 crc kubenswrapper[4902]: I1009 14:07:40.521065 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d7c2affc-5952-43d0-8629-8e61961bdf1c-config-data-custom\") pod \"cinder-api-0\" (UID: \"d7c2affc-5952-43d0-8629-8e61961bdf1c\") " pod="openstack/cinder-api-0" Oct 09 14:07:40 crc kubenswrapper[4902]: I1009 14:07:40.521327 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d7c2affc-5952-43d0-8629-8e61961bdf1c-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"d7c2affc-5952-43d0-8629-8e61961bdf1c\") " pod="openstack/cinder-api-0" Oct 09 14:07:40 crc kubenswrapper[4902]: I1009 14:07:40.521447 4902 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d7c2affc-5952-43d0-8629-8e61961bdf1c-public-tls-certs\") pod \"cinder-api-0\" (UID: \"d7c2affc-5952-43d0-8629-8e61961bdf1c\") " pod="openstack/cinder-api-0" Oct 09 14:07:40 crc kubenswrapper[4902]: I1009 14:07:40.522398 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tf8vl\" (UniqueName: \"kubernetes.io/projected/d7c2affc-5952-43d0-8629-8e61961bdf1c-kube-api-access-tf8vl\") pod \"cinder-api-0\" (UID: \"d7c2affc-5952-43d0-8629-8e61961bdf1c\") " pod="openstack/cinder-api-0" Oct 09 14:07:40 crc kubenswrapper[4902]: I1009 14:07:40.523144 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d7c2affc-5952-43d0-8629-8e61961bdf1c-scripts\") pod \"cinder-api-0\" (UID: \"d7c2affc-5952-43d0-8629-8e61961bdf1c\") " pod="openstack/cinder-api-0" Oct 09 14:07:40 crc kubenswrapper[4902]: I1009 14:07:40.604400 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4x9z8\" (UniqueName: \"kubernetes.io/projected/d22a652a-afff-4bc3-ad24-30d9abfa577f-kube-api-access-4x9z8\") pod \"nova-cell0-b8a5-account-create-6m6lx\" (UID: \"d22a652a-afff-4bc3-ad24-30d9abfa577f\") " pod="openstack/nova-cell0-b8a5-account-create-6m6lx" Oct 09 14:07:40 crc kubenswrapper[4902]: I1009 14:07:40.628897 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4x9z8\" (UniqueName: \"kubernetes.io/projected/d22a652a-afff-4bc3-ad24-30d9abfa577f-kube-api-access-4x9z8\") pod \"nova-cell0-b8a5-account-create-6m6lx\" (UID: \"d22a652a-afff-4bc3-ad24-30d9abfa577f\") " pod="openstack/nova-cell0-b8a5-account-create-6m6lx" Oct 09 14:07:40 crc kubenswrapper[4902]: I1009 14:07:40.652220 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 09 14:07:40 crc kubenswrapper[4902]: I1009 14:07:40.670354 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-b8a5-account-create-6m6lx" Oct 09 14:07:40 crc kubenswrapper[4902]: I1009 14:07:40.701866 4902 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Oct 09 14:07:40 crc kubenswrapper[4902]: I1009 14:07:40.754562 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5784cf869f-s2w6w" Oct 09 14:07:40 crc kubenswrapper[4902]: I1009 14:07:40.855247 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-84b966f6c9-d6xgh"] Oct 09 14:07:40 crc kubenswrapper[4902]: I1009 14:07:40.855543 4902 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-84b966f6c9-d6xgh" podUID="f101e38c-9520-43d9-b911-8fc2fdc4459c" containerName="dnsmasq-dns" containerID="cri-o://8a9b53dbfc92bf8b04978fedbbd6146c3cf26aca74cf46f5b8ff35200d034125" gracePeriod=10 Oct 09 14:07:41 crc kubenswrapper[4902]: I1009 14:07:41.069333 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-b741-account-create-22xhv"] Oct 09 14:07:41 crc kubenswrapper[4902]: I1009 14:07:41.195703 4902 generic.go:334] "Generic (PLEG): container finished" podID="f101e38c-9520-43d9-b911-8fc2fdc4459c" containerID="8a9b53dbfc92bf8b04978fedbbd6146c3cf26aca74cf46f5b8ff35200d034125" exitCode=0 Oct 09 14:07:41 crc kubenswrapper[4902]: I1009 14:07:41.195828 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84b966f6c9-d6xgh" event={"ID":"f101e38c-9520-43d9-b911-8fc2fdc4459c","Type":"ContainerDied","Data":"8a9b53dbfc92bf8b04978fedbbd6146c3cf26aca74cf46f5b8ff35200d034125"} Oct 09 14:07:41 crc kubenswrapper[4902]: I1009 14:07:41.237320 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-b741-account-create-22xhv" event={"ID":"21db4847-19e9-4e2e-b745-319609bc39e7","Type":"ContainerStarted","Data":"56a4923933e1885a1802946f878acc67a353b9e6682ee3f15dcf847739239505"} Oct 09 14:07:41 crc kubenswrapper[4902]: I1009 14:07:41.339972 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-b8a5-account-create-6m6lx"] Oct 09 14:07:41 crc kubenswrapper[4902]: I1009 14:07:41.542244 4902 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e4af9da4-ae7e-4c75-b670-fa5246095383" path="/var/lib/kubelet/pods/e4af9da4-ae7e-4c75-b670-fa5246095383/volumes" Oct 09 14:07:41 crc kubenswrapper[4902]: I1009 14:07:41.543166 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 09 14:07:41 crc kubenswrapper[4902]: I1009 14:07:41.645457 4902 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-84b966f6c9-d6xgh" Oct 09 14:07:41 crc kubenswrapper[4902]: I1009 14:07:41.738739 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f101e38c-9520-43d9-b911-8fc2fdc4459c-ovsdbserver-nb\") pod \"f101e38c-9520-43d9-b911-8fc2fdc4459c\" (UID: \"f101e38c-9520-43d9-b911-8fc2fdc4459c\") " Oct 09 14:07:41 crc kubenswrapper[4902]: I1009 14:07:41.738872 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f101e38c-9520-43d9-b911-8fc2fdc4459c-dns-swift-storage-0\") pod \"f101e38c-9520-43d9-b911-8fc2fdc4459c\" (UID: \"f101e38c-9520-43d9-b911-8fc2fdc4459c\") " Oct 09 14:07:41 crc kubenswrapper[4902]: I1009 14:07:41.739029 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l9xm2\" (UniqueName: \"kubernetes.io/projected/f101e38c-9520-43d9-b911-8fc2fdc4459c-kube-api-access-l9xm2\") pod \"f101e38c-9520-43d9-b911-8fc2fdc4459c\" (UID: \"f101e38c-9520-43d9-b911-8fc2fdc4459c\") " Oct 09 14:07:41 crc kubenswrapper[4902]: I1009 14:07:41.739098 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f101e38c-9520-43d9-b911-8fc2fdc4459c-ovsdbserver-sb\") pod \"f101e38c-9520-43d9-b911-8fc2fdc4459c\" (UID: \"f101e38c-9520-43d9-b911-8fc2fdc4459c\") " Oct 09 14:07:41 crc kubenswrapper[4902]: I1009 14:07:41.739251 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f101e38c-9520-43d9-b911-8fc2fdc4459c-config\") pod \"f101e38c-9520-43d9-b911-8fc2fdc4459c\" (UID: \"f101e38c-9520-43d9-b911-8fc2fdc4459c\") " Oct 09 14:07:41 crc kubenswrapper[4902]: I1009 14:07:41.739352 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f101e38c-9520-43d9-b911-8fc2fdc4459c-dns-svc\") pod \"f101e38c-9520-43d9-b911-8fc2fdc4459c\" (UID: \"f101e38c-9520-43d9-b911-8fc2fdc4459c\") " Oct 09 14:07:41 crc kubenswrapper[4902]: I1009 14:07:41.761723 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f101e38c-9520-43d9-b911-8fc2fdc4459c-kube-api-access-l9xm2" (OuterVolumeSpecName: "kube-api-access-l9xm2") pod "f101e38c-9520-43d9-b911-8fc2fdc4459c" (UID: "f101e38c-9520-43d9-b911-8fc2fdc4459c"). InnerVolumeSpecName "kube-api-access-l9xm2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 14:07:41 crc kubenswrapper[4902]: I1009 14:07:41.845614 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l9xm2\" (UniqueName: \"kubernetes.io/projected/f101e38c-9520-43d9-b911-8fc2fdc4459c-kube-api-access-l9xm2\") on node \"crc\" DevicePath \"\"" Oct 09 14:07:41 crc kubenswrapper[4902]: I1009 14:07:41.922776 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7479d98b6d-qm6k2" Oct 09 14:07:41 crc kubenswrapper[4902]: I1009 14:07:41.972249 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f101e38c-9520-43d9-b911-8fc2fdc4459c-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "f101e38c-9520-43d9-b911-8fc2fdc4459c" (UID: "f101e38c-9520-43d9-b911-8fc2fdc4459c"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 14:07:41 crc kubenswrapper[4902]: I1009 14:07:41.986124 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f101e38c-9520-43d9-b911-8fc2fdc4459c-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f101e38c-9520-43d9-b911-8fc2fdc4459c" (UID: "f101e38c-9520-43d9-b911-8fc2fdc4459c"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 14:07:41 crc kubenswrapper[4902]: I1009 14:07:41.991790 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f101e38c-9520-43d9-b911-8fc2fdc4459c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f101e38c-9520-43d9-b911-8fc2fdc4459c" (UID: "f101e38c-9520-43d9-b911-8fc2fdc4459c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 14:07:42 crc kubenswrapper[4902]: I1009 14:07:42.005733 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f101e38c-9520-43d9-b911-8fc2fdc4459c-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f101e38c-9520-43d9-b911-8fc2fdc4459c" (UID: "f101e38c-9520-43d9-b911-8fc2fdc4459c"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 14:07:42 crc kubenswrapper[4902]: I1009 14:07:42.010964 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f101e38c-9520-43d9-b911-8fc2fdc4459c-config" (OuterVolumeSpecName: "config") pod "f101e38c-9520-43d9-b911-8fc2fdc4459c" (UID: "f101e38c-9520-43d9-b911-8fc2fdc4459c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 14:07:42 crc kubenswrapper[4902]: I1009 14:07:42.052072 4902 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f101e38c-9520-43d9-b911-8fc2fdc4459c-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 09 14:07:42 crc kubenswrapper[4902]: I1009 14:07:42.052104 4902 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f101e38c-9520-43d9-b911-8fc2fdc4459c-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 09 14:07:42 crc kubenswrapper[4902]: I1009 14:07:42.052114 4902 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f101e38c-9520-43d9-b911-8fc2fdc4459c-config\") on node \"crc\" DevicePath \"\"" Oct 09 14:07:42 crc kubenswrapper[4902]: I1009 14:07:42.052122 4902 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f101e38c-9520-43d9-b911-8fc2fdc4459c-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 09 14:07:42 crc kubenswrapper[4902]: I1009 14:07:42.052130 4902 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f101e38c-9520-43d9-b911-8fc2fdc4459c-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 09 14:07:42 crc kubenswrapper[4902]: I1009 14:07:42.216718 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7479d98b6d-qm6k2" Oct 09 14:07:42 crc kubenswrapper[4902]: I1009 14:07:42.257091 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84b966f6c9-d6xgh" 
event={"ID":"f101e38c-9520-43d9-b911-8fc2fdc4459c","Type":"ContainerDied","Data":"01ca2bb7a1e4b549674f1cfd52dc4250d12f4e37373a5dc39dfb7d571e710952"} Oct 09 14:07:42 crc kubenswrapper[4902]: I1009 14:07:42.257152 4902 scope.go:117] "RemoveContainer" containerID="8a9b53dbfc92bf8b04978fedbbd6146c3cf26aca74cf46f5b8ff35200d034125" Oct 09 14:07:42 crc kubenswrapper[4902]: I1009 14:07:42.257311 4902 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-84b966f6c9-d6xgh" Oct 09 14:07:42 crc kubenswrapper[4902]: I1009 14:07:42.306563 4902 generic.go:334] "Generic (PLEG): container finished" podID="d22a652a-afff-4bc3-ad24-30d9abfa577f" containerID="3912a83ba98f7248edcd6a5b32482a4371d739c557735a8b3836099aaec0d311" exitCode=0 Oct 09 14:07:42 crc kubenswrapper[4902]: I1009 14:07:42.306644 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-b8a5-account-create-6m6lx" event={"ID":"d22a652a-afff-4bc3-ad24-30d9abfa577f","Type":"ContainerDied","Data":"3912a83ba98f7248edcd6a5b32482a4371d739c557735a8b3836099aaec0d311"} Oct 09 14:07:42 crc kubenswrapper[4902]: I1009 14:07:42.306672 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-b8a5-account-create-6m6lx" event={"ID":"d22a652a-afff-4bc3-ad24-30d9abfa577f","Type":"ContainerStarted","Data":"a5883cdd9f8da03b02ccdf5a410d25a39891a4a88f07574e2d91cee6fc10f1b5"} Oct 09 14:07:42 crc kubenswrapper[4902]: I1009 14:07:42.324589 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"d7c2affc-5952-43d0-8629-8e61961bdf1c","Type":"ContainerStarted","Data":"3255c7b0b28ba30dc1f3e34efea3eb4b3ccb376e92f3741187d0322df71735c5"} Oct 09 14:07:42 crc kubenswrapper[4902]: I1009 14:07:42.326992 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5a579122-9bea-47ee-aeb5-8e08177b4f70","Type":"ContainerStarted","Data":"7d306b7d11b30a0734f0bd03c16c987b6e56780bb4fe088be3d12ee5fc69dbf5"} Oct 09 14:07:42 crc kubenswrapper[4902]: I1009 14:07:42.356697 4902 generic.go:334] "Generic (PLEG): container finished" podID="21db4847-19e9-4e2e-b745-319609bc39e7" containerID="18cb5b774c2f10daf84263ccd477516848d97f22989e76a725ce94bca73ec75c" exitCode=0 Oct 09 14:07:42 crc kubenswrapper[4902]: I1009 14:07:42.357328 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-b741-account-create-22xhv" event={"ID":"21db4847-19e9-4e2e-b745-319609bc39e7","Type":"ContainerDied","Data":"18cb5b774c2f10daf84263ccd477516848d97f22989e76a725ce94bca73ec75c"} Oct 09 14:07:42 crc kubenswrapper[4902]: I1009 14:07:42.437499 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-84b966f6c9-d6xgh"] Oct 09 14:07:42 crc kubenswrapper[4902]: I1009 14:07:42.442317 4902 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-84b966f6c9-d6xgh"] Oct 09 14:07:42 crc kubenswrapper[4902]: I1009 14:07:42.442656 4902 scope.go:117] "RemoveContainer" containerID="13f0365bf9d78b914e6a2543e3bddb74578658f11dafdb9513cdc2d283ccc4da" Oct 09 14:07:42 crc kubenswrapper[4902]: I1009 14:07:42.611660 4902 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 09 14:07:42 crc kubenswrapper[4902]: I1009 14:07:42.611745 4902 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 09 14:07:42 crc kubenswrapper[4902]: I1009 14:07:42.648550 4902 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 09 14:07:42 crc kubenswrapper[4902]: I1009 14:07:42.664852 4902 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 09 14:07:43 crc kubenswrapper[4902]: I1009 14:07:43.387351 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"d7c2affc-5952-43d0-8629-8e61961bdf1c","Type":"ContainerStarted","Data":"88555b2a2bf9019b178c26f52adf840d2e147a3954574fbf8acbe58c8ed470c2"} Oct 09 14:07:43 crc kubenswrapper[4902]: I1009 14:07:43.387781 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"d7c2affc-5952-43d0-8629-8e61961bdf1c","Type":"ContainerStarted","Data":"04835324455d6241e48419019655aadfe4dd7c9133bee73f377c3080ec2cfd34"} Oct 09 14:07:43 crc kubenswrapper[4902]: I1009 14:07:43.387924 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Oct 09 14:07:43 crc kubenswrapper[4902]: I1009 14:07:43.402348 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5a579122-9bea-47ee-aeb5-8e08177b4f70","Type":"ContainerStarted","Data":"6ed8f0646869dbe8bf1004e112083930c488fbe20b3119a300c1bc66a8b53f0a"} Oct 09 14:07:43 crc kubenswrapper[4902]: I1009 14:07:43.403868 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 09 14:07:43 crc kubenswrapper[4902]: I1009 14:07:43.403904 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 09 14:07:43 crc kubenswrapper[4902]: I1009 14:07:43.428145 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.428121954 podStartE2EDuration="3.428121954s" podCreationTimestamp="2025-10-09 14:07:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 14:07:43.416314113 +0000 UTC m=+1010.614173197" watchObservedRunningTime="2025-10-09 14:07:43.428121954 +0000 UTC m=+1010.625981018" Oct 09 14:07:43 crc kubenswrapper[4902]: I1009 14:07:43.535259 4902 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f101e38c-9520-43d9-b911-8fc2fdc4459c" path="/var/lib/kubelet/pods/f101e38c-9520-43d9-b911-8fc2fdc4459c/volumes" Oct 09 14:07:44 crc kubenswrapper[4902]: I1009 14:07:44.007726 4902 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-b8a5-account-create-6m6lx" Oct 09 14:07:44 crc kubenswrapper[4902]: I1009 14:07:44.022919 4902 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-b741-account-create-22xhv" Oct 09 14:07:44 crc kubenswrapper[4902]: I1009 14:07:44.112442 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-25zkz\" (UniqueName: \"kubernetes.io/projected/21db4847-19e9-4e2e-b745-319609bc39e7-kube-api-access-25zkz\") pod \"21db4847-19e9-4e2e-b745-319609bc39e7\" (UID: \"21db4847-19e9-4e2e-b745-319609bc39e7\") " Oct 09 14:07:44 crc kubenswrapper[4902]: I1009 14:07:44.112513 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4x9z8\" (UniqueName: \"kubernetes.io/projected/d22a652a-afff-4bc3-ad24-30d9abfa577f-kube-api-access-4x9z8\") pod \"d22a652a-afff-4bc3-ad24-30d9abfa577f\" (UID: \"d22a652a-afff-4bc3-ad24-30d9abfa577f\") " Oct 09 14:07:44 crc kubenswrapper[4902]: I1009 14:07:44.119218 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d22a652a-afff-4bc3-ad24-30d9abfa577f-kube-api-access-4x9z8" (OuterVolumeSpecName: "kube-api-access-4x9z8") pod "d22a652a-afff-4bc3-ad24-30d9abfa577f" (UID: "d22a652a-afff-4bc3-ad24-30d9abfa577f"). InnerVolumeSpecName "kube-api-access-4x9z8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 14:07:44 crc kubenswrapper[4902]: I1009 14:07:44.120156 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21db4847-19e9-4e2e-b745-319609bc39e7-kube-api-access-25zkz" (OuterVolumeSpecName: "kube-api-access-25zkz") pod "21db4847-19e9-4e2e-b745-319609bc39e7" (UID: "21db4847-19e9-4e2e-b745-319609bc39e7"). InnerVolumeSpecName "kube-api-access-25zkz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 14:07:44 crc kubenswrapper[4902]: I1009 14:07:44.215188 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-25zkz\" (UniqueName: \"kubernetes.io/projected/21db4847-19e9-4e2e-b745-319609bc39e7-kube-api-access-25zkz\") on node \"crc\" DevicePath \"\"" Oct 09 14:07:44 crc kubenswrapper[4902]: I1009 14:07:44.215236 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4x9z8\" (UniqueName: \"kubernetes.io/projected/d22a652a-afff-4bc3-ad24-30d9abfa577f-kube-api-access-4x9z8\") on node \"crc\" DevicePath \"\"" Oct 09 14:07:44 crc kubenswrapper[4902]: I1009 14:07:44.411323 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-b741-account-create-22xhv" event={"ID":"21db4847-19e9-4e2e-b745-319609bc39e7","Type":"ContainerDied","Data":"56a4923933e1885a1802946f878acc67a353b9e6682ee3f15dcf847739239505"} Oct 09 14:07:44 crc kubenswrapper[4902]: I1009 14:07:44.411367 4902 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="56a4923933e1885a1802946f878acc67a353b9e6682ee3f15dcf847739239505" Oct 09 14:07:44 crc kubenswrapper[4902]: I1009 14:07:44.411462 4902 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-b741-account-create-22xhv" Oct 09 14:07:44 crc kubenswrapper[4902]: I1009 14:07:44.415457 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-b8a5-account-create-6m6lx" event={"ID":"d22a652a-afff-4bc3-ad24-30d9abfa577f","Type":"ContainerDied","Data":"a5883cdd9f8da03b02ccdf5a410d25a39891a4a88f07574e2d91cee6fc10f1b5"} Oct 09 14:07:44 crc kubenswrapper[4902]: I1009 14:07:44.415498 4902 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a5883cdd9f8da03b02ccdf5a410d25a39891a4a88f07574e2d91cee6fc10f1b5" Oct 09 14:07:44 crc kubenswrapper[4902]: I1009 14:07:44.415501 4902 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-b8a5-account-create-6m6lx" Oct 09 14:07:44 crc kubenswrapper[4902]: I1009 14:07:44.941319 4902 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 09 14:07:44 crc kubenswrapper[4902]: I1009 14:07:44.941668 4902 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 09 14:07:44 crc kubenswrapper[4902]: I1009 14:07:44.979027 4902 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 09 14:07:44 crc kubenswrapper[4902]: I1009 14:07:44.995265 4902 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 09 14:07:45 crc kubenswrapper[4902]: I1009 14:07:45.426680 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5a579122-9bea-47ee-aeb5-8e08177b4f70","Type":"ContainerStarted","Data":"795ea1101233e10ee50699498932c4afde2854ac0fcbcdc1d06658d4f00dda3b"} Oct 09 14:07:45 crc kubenswrapper[4902]: I1009 14:07:45.426911 4902 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 09 14:07:45 crc kubenswrapper[4902]: I1009 14:07:45.427077 4902 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 09 14:07:45 crc kubenswrapper[4902]: I1009 14:07:45.427323 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 09 14:07:45 crc kubenswrapper[4902]: I1009 14:07:45.427603 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 09 14:07:45 crc kubenswrapper[4902]: I1009 14:07:45.498627 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.619453934 podStartE2EDuration="8.498599418s" podCreationTimestamp="2025-10-09 14:07:37 +0000 UTC" firstStartedPulling="2025-10-09 14:07:39.046603276 +0000 UTC m=+1006.244462350" lastFinishedPulling="2025-10-09 14:07:44.92574877 +0000 UTC m=+1012.123607834" observedRunningTime="2025-10-09 14:07:45.46300426 +0000 UTC m=+1012.660863344" watchObservedRunningTime="2025-10-09 14:07:45.498599418 +0000 UTC m=+1012.696458482" Oct 09 14:07:45 crc kubenswrapper[4902]: I1009 14:07:45.585322 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-bs8sb"] Oct 09 14:07:45 crc kubenswrapper[4902]: E1009 14:07:45.585751 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f101e38c-9520-43d9-b911-8fc2fdc4459c" containerName="init" Oct 09 14:07:45 crc kubenswrapper[4902]: I1009 14:07:45.585778 4902 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="f101e38c-9520-43d9-b911-8fc2fdc4459c" containerName="init" Oct 09 14:07:45 crc kubenswrapper[4902]: E1009 14:07:45.585801 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d22a652a-afff-4bc3-ad24-30d9abfa577f" containerName="mariadb-account-create" Oct 09 14:07:45 crc kubenswrapper[4902]: I1009 14:07:45.585811 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="d22a652a-afff-4bc3-ad24-30d9abfa577f" containerName="mariadb-account-create" Oct 09 14:07:45 crc kubenswrapper[4902]: E1009 14:07:45.585859 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21db4847-19e9-4e2e-b745-319609bc39e7" containerName="mariadb-account-create" Oct 09 14:07:45 crc kubenswrapper[4902]: I1009 14:07:45.585869 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="21db4847-19e9-4e2e-b745-319609bc39e7" containerName="mariadb-account-create" Oct 09 14:07:45 crc kubenswrapper[4902]: E1009 14:07:45.585890 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f101e38c-9520-43d9-b911-8fc2fdc4459c" containerName="dnsmasq-dns" Oct 09 14:07:45 crc kubenswrapper[4902]: I1009 14:07:45.585898 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="f101e38c-9520-43d9-b911-8fc2fdc4459c" containerName="dnsmasq-dns" Oct 09 14:07:45 crc kubenswrapper[4902]: I1009 14:07:45.586119 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="21db4847-19e9-4e2e-b745-319609bc39e7" containerName="mariadb-account-create" Oct 09 14:07:45 crc kubenswrapper[4902]: I1009 14:07:45.586150 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="f101e38c-9520-43d9-b911-8fc2fdc4459c" containerName="dnsmasq-dns" Oct 09 14:07:45 crc kubenswrapper[4902]: I1009 14:07:45.586178 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="d22a652a-afff-4bc3-ad24-30d9abfa577f" containerName="mariadb-account-create" Oct 09 14:07:45 crc kubenswrapper[4902]: I1009 14:07:45.587742 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-bs8sb"] Oct 09 14:07:45 crc kubenswrapper[4902]: I1009 14:07:45.587855 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-bs8sb" Oct 09 14:07:45 crc kubenswrapper[4902]: I1009 14:07:45.590880 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Oct 09 14:07:45 crc kubenswrapper[4902]: I1009 14:07:45.591182 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Oct 09 14:07:45 crc kubenswrapper[4902]: I1009 14:07:45.591582 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-qbb4g" Oct 09 14:07:45 crc kubenswrapper[4902]: I1009 14:07:45.753559 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cbeddafe-9239-4c02-8f7d-f4a341ee97b1-scripts\") pod \"nova-cell0-conductor-db-sync-bs8sb\" (UID: \"cbeddafe-9239-4c02-8f7d-f4a341ee97b1\") " pod="openstack/nova-cell0-conductor-db-sync-bs8sb" Oct 09 14:07:45 crc kubenswrapper[4902]: I1009 14:07:45.753696 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbeddafe-9239-4c02-8f7d-f4a341ee97b1-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-bs8sb\" (UID: \"cbeddafe-9239-4c02-8f7d-f4a341ee97b1\") " pod="openstack/nova-cell0-conductor-db-sync-bs8sb" Oct 09 14:07:45 crc kubenswrapper[4902]: I1009 14:07:45.753738 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2zd26\" (UniqueName: \"kubernetes.io/projected/cbeddafe-9239-4c02-8f7d-f4a341ee97b1-kube-api-access-2zd26\") pod \"nova-cell0-conductor-db-sync-bs8sb\" (UID: \"cbeddafe-9239-4c02-8f7d-f4a341ee97b1\") " pod="openstack/nova-cell0-conductor-db-sync-bs8sb" Oct 09 14:07:45 crc kubenswrapper[4902]: I1009 14:07:45.753776 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cbeddafe-9239-4c02-8f7d-f4a341ee97b1-config-data\") pod \"nova-cell0-conductor-db-sync-bs8sb\" (UID: \"cbeddafe-9239-4c02-8f7d-f4a341ee97b1\") " pod="openstack/nova-cell0-conductor-db-sync-bs8sb" Oct 09 14:07:45 crc kubenswrapper[4902]: I1009 14:07:45.855187 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cbeddafe-9239-4c02-8f7d-f4a341ee97b1-scripts\") pod \"nova-cell0-conductor-db-sync-bs8sb\" (UID: \"cbeddafe-9239-4c02-8f7d-f4a341ee97b1\") " pod="openstack/nova-cell0-conductor-db-sync-bs8sb" Oct 09 14:07:45 crc kubenswrapper[4902]: I1009 14:07:45.855320 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbeddafe-9239-4c02-8f7d-f4a341ee97b1-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-bs8sb\" (UID: \"cbeddafe-9239-4c02-8f7d-f4a341ee97b1\") " pod="openstack/nova-cell0-conductor-db-sync-bs8sb" Oct 09 14:07:45 crc kubenswrapper[4902]: I1009 14:07:45.855366 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2zd26\" (UniqueName: \"kubernetes.io/projected/cbeddafe-9239-4c02-8f7d-f4a341ee97b1-kube-api-access-2zd26\") pod \"nova-cell0-conductor-db-sync-bs8sb\" (UID: \"cbeddafe-9239-4c02-8f7d-f4a341ee97b1\") " pod="openstack/nova-cell0-conductor-db-sync-bs8sb" Oct 09 14:07:45 crc kubenswrapper[4902]: I1009 14:07:45.855404 4902 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cbeddafe-9239-4c02-8f7d-f4a341ee97b1-config-data\") pod \"nova-cell0-conductor-db-sync-bs8sb\" (UID: \"cbeddafe-9239-4c02-8f7d-f4a341ee97b1\") " pod="openstack/nova-cell0-conductor-db-sync-bs8sb" Oct 09 14:07:45 crc kubenswrapper[4902]: I1009 14:07:45.861357 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbeddafe-9239-4c02-8f7d-f4a341ee97b1-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-bs8sb\" (UID: \"cbeddafe-9239-4c02-8f7d-f4a341ee97b1\") " pod="openstack/nova-cell0-conductor-db-sync-bs8sb" Oct 09 14:07:45 crc kubenswrapper[4902]: I1009 14:07:45.862119 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cbeddafe-9239-4c02-8f7d-f4a341ee97b1-scripts\") pod \"nova-cell0-conductor-db-sync-bs8sb\" (UID: \"cbeddafe-9239-4c02-8f7d-f4a341ee97b1\") " pod="openstack/nova-cell0-conductor-db-sync-bs8sb" Oct 09 14:07:45 crc kubenswrapper[4902]: I1009 14:07:45.863919 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cbeddafe-9239-4c02-8f7d-f4a341ee97b1-config-data\") pod \"nova-cell0-conductor-db-sync-bs8sb\" (UID: \"cbeddafe-9239-4c02-8f7d-f4a341ee97b1\") " pod="openstack/nova-cell0-conductor-db-sync-bs8sb" Oct 09 14:07:45 crc kubenswrapper[4902]: I1009 14:07:45.873361 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2zd26\" (UniqueName: \"kubernetes.io/projected/cbeddafe-9239-4c02-8f7d-f4a341ee97b1-kube-api-access-2zd26\") pod \"nova-cell0-conductor-db-sync-bs8sb\" (UID: \"cbeddafe-9239-4c02-8f7d-f4a341ee97b1\") " pod="openstack/nova-cell0-conductor-db-sync-bs8sb" Oct 09 14:07:45 crc kubenswrapper[4902]: I1009 14:07:45.915033 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-bs8sb" Oct 09 14:07:46 crc kubenswrapper[4902]: I1009 14:07:46.009351 4902 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Oct 09 14:07:46 crc kubenswrapper[4902]: I1009 14:07:46.052995 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 09 14:07:46 crc kubenswrapper[4902]: I1009 14:07:46.229003 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 09 14:07:46 crc kubenswrapper[4902]: I1009 14:07:46.230454 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 09 14:07:46 crc kubenswrapper[4902]: I1009 14:07:46.479987 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-5cf9b8867d-9zfjl_f5eb5ddd-7b3d-4392-9555-44eaf6e54c51/neutron-httpd/2.log" Oct 09 14:07:46 crc kubenswrapper[4902]: I1009 14:07:46.482982 4902 generic.go:334] "Generic (PLEG): container finished" podID="f5eb5ddd-7b3d-4392-9555-44eaf6e54c51" containerID="764530cdc309d34f5e4d9f50471c07cfdf1e04fc5e535bcdd92eedabca38c008" exitCode=0 Oct 09 14:07:46 crc kubenswrapper[4902]: I1009 14:07:46.483176 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5cf9b8867d-9zfjl" event={"ID":"f5eb5ddd-7b3d-4392-9555-44eaf6e54c51","Type":"ContainerDied","Data":"764530cdc309d34f5e4d9f50471c07cfdf1e04fc5e535bcdd92eedabca38c008"} Oct 09 14:07:46 crc kubenswrapper[4902]: I1009 14:07:46.483226 4902 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="f5566f62-4d11-4503-ab45-7f2d727bc397" containerName="cinder-scheduler" containerID="cri-o://7800ce09e095035c21f98eeca35d2ef33b8dcac7653772319829401c7500f85b" gracePeriod=30 Oct 09 14:07:46 crc kubenswrapper[4902]: I1009 14:07:46.485064 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 09 14:07:46 crc kubenswrapper[4902]: I1009 14:07:46.485541 4902 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="f5566f62-4d11-4503-ab45-7f2d727bc397" containerName="probe" containerID="cri-o://0bbbb0c9fd9dc468b5570c9ed70ada5c8e439f9c046fc62ddedc6b9c97bbd6da" gracePeriod=30 Oct 09 14:07:46 crc kubenswrapper[4902]: I1009 14:07:46.588549 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-bs8sb"] Oct 09 14:07:47 crc kubenswrapper[4902]: I1009 14:07:47.148186 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-5cf9b8867d-9zfjl_f5eb5ddd-7b3d-4392-9555-44eaf6e54c51/neutron-httpd/2.log" Oct 09 14:07:47 crc kubenswrapper[4902]: I1009 14:07:47.149095 4902 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-5cf9b8867d-9zfjl" Oct 09 14:07:47 crc kubenswrapper[4902]: I1009 14:07:47.195914 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f5eb5ddd-7b3d-4392-9555-44eaf6e54c51-ovndb-tls-certs\") pod \"f5eb5ddd-7b3d-4392-9555-44eaf6e54c51\" (UID: \"f5eb5ddd-7b3d-4392-9555-44eaf6e54c51\") " Oct 09 14:07:47 crc kubenswrapper[4902]: I1009 14:07:47.196039 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sgdlc\" (UniqueName: \"kubernetes.io/projected/f5eb5ddd-7b3d-4392-9555-44eaf6e54c51-kube-api-access-sgdlc\") pod \"f5eb5ddd-7b3d-4392-9555-44eaf6e54c51\" (UID: \"f5eb5ddd-7b3d-4392-9555-44eaf6e54c51\") " Oct 09 14:07:47 crc kubenswrapper[4902]: I1009 14:07:47.196123 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5eb5ddd-7b3d-4392-9555-44eaf6e54c51-combined-ca-bundle\") pod \"f5eb5ddd-7b3d-4392-9555-44eaf6e54c51\" (UID: \"f5eb5ddd-7b3d-4392-9555-44eaf6e54c51\") " Oct 09 14:07:47 crc kubenswrapper[4902]: I1009 14:07:47.196195 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f5eb5ddd-7b3d-4392-9555-44eaf6e54c51-httpd-config\") pod \"f5eb5ddd-7b3d-4392-9555-44eaf6e54c51\" (UID: \"f5eb5ddd-7b3d-4392-9555-44eaf6e54c51\") " Oct 09 14:07:47 crc kubenswrapper[4902]: I1009 14:07:47.196296 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f5eb5ddd-7b3d-4392-9555-44eaf6e54c51-config\") pod \"f5eb5ddd-7b3d-4392-9555-44eaf6e54c51\" (UID: \"f5eb5ddd-7b3d-4392-9555-44eaf6e54c51\") " Oct 09 14:07:47 crc kubenswrapper[4902]: I1009 14:07:47.205937 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5eb5ddd-7b3d-4392-9555-44eaf6e54c51-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "f5eb5ddd-7b3d-4392-9555-44eaf6e54c51" (UID: "f5eb5ddd-7b3d-4392-9555-44eaf6e54c51"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:07:47 crc kubenswrapper[4902]: I1009 14:07:47.220522 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f5eb5ddd-7b3d-4392-9555-44eaf6e54c51-kube-api-access-sgdlc" (OuterVolumeSpecName: "kube-api-access-sgdlc") pod "f5eb5ddd-7b3d-4392-9555-44eaf6e54c51" (UID: "f5eb5ddd-7b3d-4392-9555-44eaf6e54c51"). InnerVolumeSpecName "kube-api-access-sgdlc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 14:07:47 crc kubenswrapper[4902]: I1009 14:07:47.284594 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5eb5ddd-7b3d-4392-9555-44eaf6e54c51-config" (OuterVolumeSpecName: "config") pod "f5eb5ddd-7b3d-4392-9555-44eaf6e54c51" (UID: "f5eb5ddd-7b3d-4392-9555-44eaf6e54c51"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:07:47 crc kubenswrapper[4902]: I1009 14:07:47.301809 4902 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f5eb5ddd-7b3d-4392-9555-44eaf6e54c51-httpd-config\") on node \"crc\" DevicePath \"\"" Oct 09 14:07:47 crc kubenswrapper[4902]: I1009 14:07:47.301850 4902 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/f5eb5ddd-7b3d-4392-9555-44eaf6e54c51-config\") on node \"crc\" DevicePath \"\"" Oct 09 14:07:47 crc kubenswrapper[4902]: I1009 14:07:47.301863 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sgdlc\" (UniqueName: \"kubernetes.io/projected/f5eb5ddd-7b3d-4392-9555-44eaf6e54c51-kube-api-access-sgdlc\") on node \"crc\" DevicePath \"\"" Oct 09 14:07:47 crc kubenswrapper[4902]: I1009 14:07:47.328564 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5eb5ddd-7b3d-4392-9555-44eaf6e54c51-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f5eb5ddd-7b3d-4392-9555-44eaf6e54c51" (UID: "f5eb5ddd-7b3d-4392-9555-44eaf6e54c51"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:07:47 crc kubenswrapper[4902]: I1009 14:07:47.354512 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5eb5ddd-7b3d-4392-9555-44eaf6e54c51-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "f5eb5ddd-7b3d-4392-9555-44eaf6e54c51" (UID: "f5eb5ddd-7b3d-4392-9555-44eaf6e54c51"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:07:47 crc kubenswrapper[4902]: I1009 14:07:47.403707 4902 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f5eb5ddd-7b3d-4392-9555-44eaf6e54c51-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 09 14:07:47 crc kubenswrapper[4902]: I1009 14:07:47.404174 4902 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5eb5ddd-7b3d-4392-9555-44eaf6e54c51-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 14:07:47 crc kubenswrapper[4902]: I1009 14:07:47.511340 4902 generic.go:334] "Generic (PLEG): container finished" podID="f5566f62-4d11-4503-ab45-7f2d727bc397" containerID="0bbbb0c9fd9dc468b5570c9ed70ada5c8e439f9c046fc62ddedc6b9c97bbd6da" exitCode=0 Oct 09 14:07:47 crc kubenswrapper[4902]: I1009 14:07:47.511453 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"f5566f62-4d11-4503-ab45-7f2d727bc397","Type":"ContainerDied","Data":"0bbbb0c9fd9dc468b5570c9ed70ada5c8e439f9c046fc62ddedc6b9c97bbd6da"} Oct 09 14:07:47 crc kubenswrapper[4902]: I1009 14:07:47.513804 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-5cf9b8867d-9zfjl_f5eb5ddd-7b3d-4392-9555-44eaf6e54c51/neutron-httpd/2.log" Oct 09 14:07:47 crc kubenswrapper[4902]: I1009 14:07:47.517271 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5cf9b8867d-9zfjl" event={"ID":"f5eb5ddd-7b3d-4392-9555-44eaf6e54c51","Type":"ContainerDied","Data":"2600a6398b27b5ba83ef9bff7c37f3dbd91d24aebd484f9148ebb3d217adecfb"} Oct 09 14:07:47 crc kubenswrapper[4902]: I1009 14:07:47.517322 4902 scope.go:117] "RemoveContainer" containerID="c11aa6b7c854c6dfeded0c99e3ae32e4ebb1287b085e2864f0a59062d0214d9e" Oct 09 14:07:47 
crc kubenswrapper[4902]: I1009 14:07:47.517539 4902 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5cf9b8867d-9zfjl" Oct 09 14:07:47 crc kubenswrapper[4902]: I1009 14:07:47.540382 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-bs8sb" event={"ID":"cbeddafe-9239-4c02-8f7d-f4a341ee97b1","Type":"ContainerStarted","Data":"035c59586aac2dd23ef4b765e7e5a0cc823f7c9b8e1fef87b6561389c5ab3c09"} Oct 09 14:07:47 crc kubenswrapper[4902]: I1009 14:07:47.555205 4902 scope.go:117] "RemoveContainer" containerID="764530cdc309d34f5e4d9f50471c07cfdf1e04fc5e535bcdd92eedabca38c008" Oct 09 14:07:47 crc kubenswrapper[4902]: I1009 14:07:47.568256 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5cf9b8867d-9zfjl"] Oct 09 14:07:47 crc kubenswrapper[4902]: I1009 14:07:47.587164 4902 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-5cf9b8867d-9zfjl"] Oct 09 14:07:48 crc kubenswrapper[4902]: I1009 14:07:48.316125 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 09 14:07:48 crc kubenswrapper[4902]: I1009 14:07:48.316504 4902 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 09 14:07:48 crc kubenswrapper[4902]: I1009 14:07:48.353524 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 09 14:07:48 crc kubenswrapper[4902]: I1009 14:07:48.422321 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-54f57498cd-cv95r" Oct 09 14:07:48 crc kubenswrapper[4902]: I1009 14:07:48.480516 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-54f57498cd-cv95r" Oct 09 14:07:48 crc kubenswrapper[4902]: I1009 14:07:48.590858 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-7479d98b6d-qm6k2"] Oct 09 14:07:48 crc kubenswrapper[4902]: I1009 14:07:48.594041 4902 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-7479d98b6d-qm6k2" podUID="61e61af2-6a2e-4fe8-9e42-57d6303411f8" containerName="barbican-api-log" containerID="cri-o://d1b6552b807f74b205c16b45dd0d7be0bea91431596d9fdbc3e58f1ff1d3155e" gracePeriod=30 Oct 09 14:07:48 crc kubenswrapper[4902]: I1009 14:07:48.595366 4902 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-7479d98b6d-qm6k2" podUID="61e61af2-6a2e-4fe8-9e42-57d6303411f8" containerName="barbican-api" containerID="cri-o://da56b9079deb3a4c21e8b67e250c135ff222953a8fcab35235d50f2794250b34" gracePeriod=30 Oct 09 14:07:49 crc kubenswrapper[4902]: I1009 14:07:49.537284 4902 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f5eb5ddd-7b3d-4392-9555-44eaf6e54c51" path="/var/lib/kubelet/pods/f5eb5ddd-7b3d-4392-9555-44eaf6e54c51/volumes" Oct 09 14:07:49 crc kubenswrapper[4902]: I1009 14:07:49.602560 4902 generic.go:334] "Generic (PLEG): container finished" podID="f5566f62-4d11-4503-ab45-7f2d727bc397" containerID="7800ce09e095035c21f98eeca35d2ef33b8dcac7653772319829401c7500f85b" exitCode=0 Oct 09 14:07:49 crc kubenswrapper[4902]: I1009 14:07:49.602629 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" 
event={"ID":"f5566f62-4d11-4503-ab45-7f2d727bc397","Type":"ContainerDied","Data":"7800ce09e095035c21f98eeca35d2ef33b8dcac7653772319829401c7500f85b"} Oct 09 14:07:49 crc kubenswrapper[4902]: I1009 14:07:49.606620 4902 generic.go:334] "Generic (PLEG): container finished" podID="61e61af2-6a2e-4fe8-9e42-57d6303411f8" containerID="d1b6552b807f74b205c16b45dd0d7be0bea91431596d9fdbc3e58f1ff1d3155e" exitCode=143 Oct 09 14:07:49 crc kubenswrapper[4902]: I1009 14:07:49.606671 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7479d98b6d-qm6k2" event={"ID":"61e61af2-6a2e-4fe8-9e42-57d6303411f8","Type":"ContainerDied","Data":"d1b6552b807f74b205c16b45dd0d7be0bea91431596d9fdbc3e58f1ff1d3155e"} Oct 09 14:07:49 crc kubenswrapper[4902]: I1009 14:07:49.836626 4902 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 09 14:07:49 crc kubenswrapper[4902]: I1009 14:07:49.878775 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f5566f62-4d11-4503-ab45-7f2d727bc397-config-data\") pod \"f5566f62-4d11-4503-ab45-7f2d727bc397\" (UID: \"f5566f62-4d11-4503-ab45-7f2d727bc397\") " Oct 09 14:07:49 crc kubenswrapper[4902]: I1009 14:07:49.878858 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f5566f62-4d11-4503-ab45-7f2d727bc397-etc-machine-id\") pod \"f5566f62-4d11-4503-ab45-7f2d727bc397\" (UID: \"f5566f62-4d11-4503-ab45-7f2d727bc397\") " Oct 09 14:07:49 crc kubenswrapper[4902]: I1009 14:07:49.878980 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f5566f62-4d11-4503-ab45-7f2d727bc397-scripts\") pod \"f5566f62-4d11-4503-ab45-7f2d727bc397\" (UID: \"f5566f62-4d11-4503-ab45-7f2d727bc397\") " Oct 09 14:07:49 crc kubenswrapper[4902]: I1009 14:07:49.879169 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fvskq\" (UniqueName: \"kubernetes.io/projected/f5566f62-4d11-4503-ab45-7f2d727bc397-kube-api-access-fvskq\") pod \"f5566f62-4d11-4503-ab45-7f2d727bc397\" (UID: \"f5566f62-4d11-4503-ab45-7f2d727bc397\") " Oct 09 14:07:49 crc kubenswrapper[4902]: I1009 14:07:49.879243 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f5566f62-4d11-4503-ab45-7f2d727bc397-config-data-custom\") pod \"f5566f62-4d11-4503-ab45-7f2d727bc397\" (UID: \"f5566f62-4d11-4503-ab45-7f2d727bc397\") " Oct 09 14:07:49 crc kubenswrapper[4902]: I1009 14:07:49.879314 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5566f62-4d11-4503-ab45-7f2d727bc397-combined-ca-bundle\") pod \"f5566f62-4d11-4503-ab45-7f2d727bc397\" (UID: \"f5566f62-4d11-4503-ab45-7f2d727bc397\") " Oct 09 14:07:49 crc kubenswrapper[4902]: I1009 14:07:49.885252 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5566f62-4d11-4503-ab45-7f2d727bc397-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "f5566f62-4d11-4503-ab45-7f2d727bc397" (UID: "f5566f62-4d11-4503-ab45-7f2d727bc397"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:07:49 crc kubenswrapper[4902]: I1009 14:07:49.885332 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f5566f62-4d11-4503-ab45-7f2d727bc397-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "f5566f62-4d11-4503-ab45-7f2d727bc397" (UID: "f5566f62-4d11-4503-ab45-7f2d727bc397"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 09 14:07:49 crc kubenswrapper[4902]: I1009 14:07:49.899599 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5566f62-4d11-4503-ab45-7f2d727bc397-scripts" (OuterVolumeSpecName: "scripts") pod "f5566f62-4d11-4503-ab45-7f2d727bc397" (UID: "f5566f62-4d11-4503-ab45-7f2d727bc397"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:07:49 crc kubenswrapper[4902]: I1009 14:07:49.958642 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f5566f62-4d11-4503-ab45-7f2d727bc397-kube-api-access-fvskq" (OuterVolumeSpecName: "kube-api-access-fvskq") pod "f5566f62-4d11-4503-ab45-7f2d727bc397" (UID: "f5566f62-4d11-4503-ab45-7f2d727bc397"). InnerVolumeSpecName "kube-api-access-fvskq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 14:07:49 crc kubenswrapper[4902]: I1009 14:07:49.984419 4902 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f5566f62-4d11-4503-ab45-7f2d727bc397-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 09 14:07:49 crc kubenswrapper[4902]: I1009 14:07:49.984462 4902 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f5566f62-4d11-4503-ab45-7f2d727bc397-scripts\") on node \"crc\" DevicePath \"\"" Oct 09 14:07:49 crc kubenswrapper[4902]: I1009 14:07:49.984475 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fvskq\" (UniqueName: \"kubernetes.io/projected/f5566f62-4d11-4503-ab45-7f2d727bc397-kube-api-access-fvskq\") on node \"crc\" DevicePath \"\"" Oct 09 14:07:49 crc kubenswrapper[4902]: I1009 14:07:49.984488 4902 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f5566f62-4d11-4503-ab45-7f2d727bc397-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 09 14:07:50 crc kubenswrapper[4902]: I1009 14:07:50.020215 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5566f62-4d11-4503-ab45-7f2d727bc397-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f5566f62-4d11-4503-ab45-7f2d727bc397" (UID: "f5566f62-4d11-4503-ab45-7f2d727bc397"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:07:50 crc kubenswrapper[4902]: I1009 14:07:50.070707 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5566f62-4d11-4503-ab45-7f2d727bc397-config-data" (OuterVolumeSpecName: "config-data") pod "f5566f62-4d11-4503-ab45-7f2d727bc397" (UID: "f5566f62-4d11-4503-ab45-7f2d727bc397"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:07:50 crc kubenswrapper[4902]: I1009 14:07:50.085883 4902 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5566f62-4d11-4503-ab45-7f2d727bc397-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 14:07:50 crc kubenswrapper[4902]: I1009 14:07:50.085929 4902 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f5566f62-4d11-4503-ab45-7f2d727bc397-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 14:07:50 crc kubenswrapper[4902]: I1009 14:07:50.301574 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-b001-account-create-lmts2"] Oct 09 14:07:50 crc kubenswrapper[4902]: E1009 14:07:50.301978 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5566f62-4d11-4503-ab45-7f2d727bc397" containerName="cinder-scheduler" Oct 09 14:07:50 crc kubenswrapper[4902]: I1009 14:07:50.302001 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5566f62-4d11-4503-ab45-7f2d727bc397" containerName="cinder-scheduler" Oct 09 14:07:50 crc kubenswrapper[4902]: E1009 14:07:50.302016 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5566f62-4d11-4503-ab45-7f2d727bc397" containerName="probe" Oct 09 14:07:50 crc kubenswrapper[4902]: I1009 14:07:50.302023 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5566f62-4d11-4503-ab45-7f2d727bc397" containerName="probe" Oct 09 14:07:50 crc kubenswrapper[4902]: E1009 14:07:50.302039 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5eb5ddd-7b3d-4392-9555-44eaf6e54c51" containerName="neutron-api" Oct 09 14:07:50 crc kubenswrapper[4902]: I1009 14:07:50.302049 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5eb5ddd-7b3d-4392-9555-44eaf6e54c51" containerName="neutron-api" Oct 09 14:07:50 crc kubenswrapper[4902]: E1009 14:07:50.302075 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5eb5ddd-7b3d-4392-9555-44eaf6e54c51" containerName="neutron-httpd" Oct 09 14:07:50 crc kubenswrapper[4902]: I1009 14:07:50.302081 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5eb5ddd-7b3d-4392-9555-44eaf6e54c51" containerName="neutron-httpd" Oct 09 14:07:50 crc kubenswrapper[4902]: E1009 14:07:50.302095 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5eb5ddd-7b3d-4392-9555-44eaf6e54c51" containerName="neutron-httpd" Oct 09 14:07:50 crc kubenswrapper[4902]: I1009 14:07:50.302102 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5eb5ddd-7b3d-4392-9555-44eaf6e54c51" containerName="neutron-httpd" Oct 09 14:07:50 crc kubenswrapper[4902]: I1009 14:07:50.302273 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="f5eb5ddd-7b3d-4392-9555-44eaf6e54c51" containerName="neutron-api" Oct 09 14:07:50 crc kubenswrapper[4902]: I1009 14:07:50.302286 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="f5566f62-4d11-4503-ab45-7f2d727bc397" containerName="cinder-scheduler" Oct 09 14:07:50 crc kubenswrapper[4902]: I1009 14:07:50.302300 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="f5eb5ddd-7b3d-4392-9555-44eaf6e54c51" containerName="neutron-httpd" Oct 09 14:07:50 crc kubenswrapper[4902]: I1009 14:07:50.302315 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="f5eb5ddd-7b3d-4392-9555-44eaf6e54c51" containerName="neutron-httpd" Oct 09 14:07:50 crc kubenswrapper[4902]: 
I1009 14:07:50.302323 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="f5eb5ddd-7b3d-4392-9555-44eaf6e54c51" containerName="neutron-httpd" Oct 09 14:07:50 crc kubenswrapper[4902]: I1009 14:07:50.302336 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="f5566f62-4d11-4503-ab45-7f2d727bc397" containerName="probe" Oct 09 14:07:50 crc kubenswrapper[4902]: I1009 14:07:50.302921 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-b001-account-create-lmts2" Oct 09 14:07:50 crc kubenswrapper[4902]: I1009 14:07:50.307008 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Oct 09 14:07:50 crc kubenswrapper[4902]: I1009 14:07:50.321888 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-b001-account-create-lmts2"] Oct 09 14:07:50 crc kubenswrapper[4902]: I1009 14:07:50.391274 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-khjk4\" (UniqueName: \"kubernetes.io/projected/2e924c89-5324-4bc9-9e4d-9005d6d1257d-kube-api-access-khjk4\") pod \"nova-cell1-b001-account-create-lmts2\" (UID: \"2e924c89-5324-4bc9-9e4d-9005d6d1257d\") " pod="openstack/nova-cell1-b001-account-create-lmts2" Oct 09 14:07:50 crc kubenswrapper[4902]: I1009 14:07:50.493122 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-khjk4\" (UniqueName: \"kubernetes.io/projected/2e924c89-5324-4bc9-9e4d-9005d6d1257d-kube-api-access-khjk4\") pod \"nova-cell1-b001-account-create-lmts2\" (UID: \"2e924c89-5324-4bc9-9e4d-9005d6d1257d\") " pod="openstack/nova-cell1-b001-account-create-lmts2" Oct 09 14:07:50 crc kubenswrapper[4902]: I1009 14:07:50.513984 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-khjk4\" (UniqueName: \"kubernetes.io/projected/2e924c89-5324-4bc9-9e4d-9005d6d1257d-kube-api-access-khjk4\") pod \"nova-cell1-b001-account-create-lmts2\" (UID: \"2e924c89-5324-4bc9-9e4d-9005d6d1257d\") " pod="openstack/nova-cell1-b001-account-create-lmts2" Oct 09 14:07:50 crc kubenswrapper[4902]: I1009 14:07:50.621120 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-b001-account-create-lmts2" Oct 09 14:07:50 crc kubenswrapper[4902]: I1009 14:07:50.630848 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"f5566f62-4d11-4503-ab45-7f2d727bc397","Type":"ContainerDied","Data":"9be983c4db76de927aa30a2f56f9672d6fee758cddafdfe7201d22193ac2c90a"} Oct 09 14:07:50 crc kubenswrapper[4902]: I1009 14:07:50.630924 4902 scope.go:117] "RemoveContainer" containerID="0bbbb0c9fd9dc468b5570c9ed70ada5c8e439f9c046fc62ddedc6b9c97bbd6da" Oct 09 14:07:50 crc kubenswrapper[4902]: I1009 14:07:50.630942 4902 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 09 14:07:50 crc kubenswrapper[4902]: I1009 14:07:50.675922 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 09 14:07:50 crc kubenswrapper[4902]: I1009 14:07:50.682324 4902 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 09 14:07:50 crc kubenswrapper[4902]: I1009 14:07:50.700339 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Oct 09 14:07:50 crc kubenswrapper[4902]: E1009 14:07:50.700819 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5eb5ddd-7b3d-4392-9555-44eaf6e54c51" containerName="neutron-httpd" Oct 09 14:07:50 crc kubenswrapper[4902]: I1009 14:07:50.700840 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5eb5ddd-7b3d-4392-9555-44eaf6e54c51" containerName="neutron-httpd" Oct 09 14:07:50 crc kubenswrapper[4902]: I1009 14:07:50.702174 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 09 14:07:50 crc kubenswrapper[4902]: I1009 14:07:50.706986 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Oct 09 14:07:50 crc kubenswrapper[4902]: I1009 14:07:50.712869 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 09 14:07:50 crc kubenswrapper[4902]: I1009 14:07:50.761069 4902 scope.go:117] "RemoveContainer" containerID="7800ce09e095035c21f98eeca35d2ef33b8dcac7653772319829401c7500f85b" Oct 09 14:07:50 crc kubenswrapper[4902]: I1009 14:07:50.801852 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/676825a2-3e5b-4137-b9fc-337425ff8d09-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"676825a2-3e5b-4137-b9fc-337425ff8d09\") " pod="openstack/cinder-scheduler-0" Oct 09 14:07:50 crc kubenswrapper[4902]: I1009 14:07:50.802086 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/676825a2-3e5b-4137-b9fc-337425ff8d09-config-data\") pod \"cinder-scheduler-0\" (UID: \"676825a2-3e5b-4137-b9fc-337425ff8d09\") " pod="openstack/cinder-scheduler-0" Oct 09 14:07:50 crc kubenswrapper[4902]: I1009 14:07:50.802112 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/676825a2-3e5b-4137-b9fc-337425ff8d09-scripts\") pod \"cinder-scheduler-0\" (UID: \"676825a2-3e5b-4137-b9fc-337425ff8d09\") " pod="openstack/cinder-scheduler-0" Oct 09 14:07:50 crc kubenswrapper[4902]: I1009 14:07:50.802190 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/676825a2-3e5b-4137-b9fc-337425ff8d09-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"676825a2-3e5b-4137-b9fc-337425ff8d09\") " pod="openstack/cinder-scheduler-0" Oct 09 14:07:50 crc kubenswrapper[4902]: I1009 14:07:50.802232 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/676825a2-3e5b-4137-b9fc-337425ff8d09-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"676825a2-3e5b-4137-b9fc-337425ff8d09\") " pod="openstack/cinder-scheduler-0" Oct 09 14:07:50 crc 
kubenswrapper[4902]: I1009 14:07:50.802267 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kh5qh\" (UniqueName: \"kubernetes.io/projected/676825a2-3e5b-4137-b9fc-337425ff8d09-kube-api-access-kh5qh\") pod \"cinder-scheduler-0\" (UID: \"676825a2-3e5b-4137-b9fc-337425ff8d09\") " pod="openstack/cinder-scheduler-0" Oct 09 14:07:50 crc kubenswrapper[4902]: I1009 14:07:50.906566 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/676825a2-3e5b-4137-b9fc-337425ff8d09-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"676825a2-3e5b-4137-b9fc-337425ff8d09\") " pod="openstack/cinder-scheduler-0" Oct 09 14:07:50 crc kubenswrapper[4902]: I1009 14:07:50.907244 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/676825a2-3e5b-4137-b9fc-337425ff8d09-config-data\") pod \"cinder-scheduler-0\" (UID: \"676825a2-3e5b-4137-b9fc-337425ff8d09\") " pod="openstack/cinder-scheduler-0" Oct 09 14:07:50 crc kubenswrapper[4902]: I1009 14:07:50.907297 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/676825a2-3e5b-4137-b9fc-337425ff8d09-scripts\") pod \"cinder-scheduler-0\" (UID: \"676825a2-3e5b-4137-b9fc-337425ff8d09\") " pod="openstack/cinder-scheduler-0" Oct 09 14:07:50 crc kubenswrapper[4902]: I1009 14:07:50.907360 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/676825a2-3e5b-4137-b9fc-337425ff8d09-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"676825a2-3e5b-4137-b9fc-337425ff8d09\") " pod="openstack/cinder-scheduler-0" Oct 09 14:07:50 crc kubenswrapper[4902]: I1009 14:07:50.907435 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/676825a2-3e5b-4137-b9fc-337425ff8d09-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"676825a2-3e5b-4137-b9fc-337425ff8d09\") " pod="openstack/cinder-scheduler-0" Oct 09 14:07:50 crc kubenswrapper[4902]: I1009 14:07:50.907471 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kh5qh\" (UniqueName: \"kubernetes.io/projected/676825a2-3e5b-4137-b9fc-337425ff8d09-kube-api-access-kh5qh\") pod \"cinder-scheduler-0\" (UID: \"676825a2-3e5b-4137-b9fc-337425ff8d09\") " pod="openstack/cinder-scheduler-0" Oct 09 14:07:50 crc kubenswrapper[4902]: I1009 14:07:50.914185 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/676825a2-3e5b-4137-b9fc-337425ff8d09-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"676825a2-3e5b-4137-b9fc-337425ff8d09\") " pod="openstack/cinder-scheduler-0" Oct 09 14:07:50 crc kubenswrapper[4902]: I1009 14:07:50.923509 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/676825a2-3e5b-4137-b9fc-337425ff8d09-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"676825a2-3e5b-4137-b9fc-337425ff8d09\") " pod="openstack/cinder-scheduler-0" Oct 09 14:07:50 crc kubenswrapper[4902]: I1009 14:07:50.923919 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/676825a2-3e5b-4137-b9fc-337425ff8d09-config-data\") pod \"cinder-scheduler-0\" (UID: \"676825a2-3e5b-4137-b9fc-337425ff8d09\") " pod="openstack/cinder-scheduler-0" Oct 09 14:07:50 crc kubenswrapper[4902]: I1009 14:07:50.927583 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/676825a2-3e5b-4137-b9fc-337425ff8d09-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"676825a2-3e5b-4137-b9fc-337425ff8d09\") " pod="openstack/cinder-scheduler-0" Oct 09 14:07:50 crc kubenswrapper[4902]: I1009 14:07:50.930691 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/676825a2-3e5b-4137-b9fc-337425ff8d09-scripts\") pod \"cinder-scheduler-0\" (UID: \"676825a2-3e5b-4137-b9fc-337425ff8d09\") " pod="openstack/cinder-scheduler-0" Oct 09 14:07:50 crc kubenswrapper[4902]: I1009 14:07:50.942998 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kh5qh\" (UniqueName: \"kubernetes.io/projected/676825a2-3e5b-4137-b9fc-337425ff8d09-kube-api-access-kh5qh\") pod \"cinder-scheduler-0\" (UID: \"676825a2-3e5b-4137-b9fc-337425ff8d09\") " pod="openstack/cinder-scheduler-0" Oct 09 14:07:51 crc kubenswrapper[4902]: I1009 14:07:51.136908 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 09 14:07:51 crc kubenswrapper[4902]: I1009 14:07:51.394324 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-b001-account-create-lmts2"] Oct 09 14:07:51 crc kubenswrapper[4902]: I1009 14:07:51.534313 4902 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f5566f62-4d11-4503-ab45-7f2d727bc397" path="/var/lib/kubelet/pods/f5566f62-4d11-4503-ab45-7f2d727bc397/volumes" Oct 09 14:07:51 crc kubenswrapper[4902]: I1009 14:07:51.657985 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-b001-account-create-lmts2" event={"ID":"2e924c89-5324-4bc9-9e4d-9005d6d1257d","Type":"ContainerStarted","Data":"76ad981355de2f265405831f975fa57068fa0c061c6e6ddcd21bc57cbe242e92"} Oct 09 14:07:51 crc kubenswrapper[4902]: I1009 14:07:51.800021 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 09 14:07:51 crc kubenswrapper[4902]: W1009 14:07:51.820338 4902 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod676825a2_3e5b_4137_b9fc_337425ff8d09.slice/crio-945c72aadf0515ec335a90cd3b55e1a48a3e446bc77a7f20e09a75d1c78ddaad WatchSource:0}: Error finding container 945c72aadf0515ec335a90cd3b55e1a48a3e446bc77a7f20e09a75d1c78ddaad: Status 404 returned error can't find the container with id 945c72aadf0515ec335a90cd3b55e1a48a3e446bc77a7f20e09a75d1c78ddaad Oct 09 14:07:51 crc kubenswrapper[4902]: I1009 14:07:51.825704 4902 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-7479d98b6d-qm6k2" podUID="61e61af2-6a2e-4fe8-9e42-57d6303411f8" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.166:9311/healthcheck\": read tcp 10.217.0.2:46442->10.217.0.166:9311: read: connection reset by peer" Oct 09 14:07:51 crc kubenswrapper[4902]: I1009 14:07:51.826330 4902 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-7479d98b6d-qm6k2" podUID="61e61af2-6a2e-4fe8-9e42-57d6303411f8" containerName="barbican-api" probeResult="failure" 
output="Get \"http://10.217.0.166:9311/healthcheck\": read tcp 10.217.0.2:46430->10.217.0.166:9311: read: connection reset by peer" Oct 09 14:07:52 crc kubenswrapper[4902]: I1009 14:07:52.392580 4902 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7479d98b6d-qm6k2" Oct 09 14:07:52 crc kubenswrapper[4902]: I1009 14:07:52.448846 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hw22p\" (UniqueName: \"kubernetes.io/projected/61e61af2-6a2e-4fe8-9e42-57d6303411f8-kube-api-access-hw22p\") pod \"61e61af2-6a2e-4fe8-9e42-57d6303411f8\" (UID: \"61e61af2-6a2e-4fe8-9e42-57d6303411f8\") " Oct 09 14:07:52 crc kubenswrapper[4902]: I1009 14:07:52.448950 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/61e61af2-6a2e-4fe8-9e42-57d6303411f8-logs\") pod \"61e61af2-6a2e-4fe8-9e42-57d6303411f8\" (UID: \"61e61af2-6a2e-4fe8-9e42-57d6303411f8\") " Oct 09 14:07:52 crc kubenswrapper[4902]: I1009 14:07:52.449078 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/61e61af2-6a2e-4fe8-9e42-57d6303411f8-config-data-custom\") pod \"61e61af2-6a2e-4fe8-9e42-57d6303411f8\" (UID: \"61e61af2-6a2e-4fe8-9e42-57d6303411f8\") " Oct 09 14:07:52 crc kubenswrapper[4902]: I1009 14:07:52.449173 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61e61af2-6a2e-4fe8-9e42-57d6303411f8-config-data\") pod \"61e61af2-6a2e-4fe8-9e42-57d6303411f8\" (UID: \"61e61af2-6a2e-4fe8-9e42-57d6303411f8\") " Oct 09 14:07:52 crc kubenswrapper[4902]: I1009 14:07:52.449202 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61e61af2-6a2e-4fe8-9e42-57d6303411f8-combined-ca-bundle\") pod \"61e61af2-6a2e-4fe8-9e42-57d6303411f8\" (UID: \"61e61af2-6a2e-4fe8-9e42-57d6303411f8\") " Oct 09 14:07:52 crc kubenswrapper[4902]: I1009 14:07:52.450078 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/61e61af2-6a2e-4fe8-9e42-57d6303411f8-logs" (OuterVolumeSpecName: "logs") pod "61e61af2-6a2e-4fe8-9e42-57d6303411f8" (UID: "61e61af2-6a2e-4fe8-9e42-57d6303411f8"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 14:07:52 crc kubenswrapper[4902]: I1009 14:07:52.453689 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/61e61af2-6a2e-4fe8-9e42-57d6303411f8-kube-api-access-hw22p" (OuterVolumeSpecName: "kube-api-access-hw22p") pod "61e61af2-6a2e-4fe8-9e42-57d6303411f8" (UID: "61e61af2-6a2e-4fe8-9e42-57d6303411f8"). InnerVolumeSpecName "kube-api-access-hw22p". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 14:07:52 crc kubenswrapper[4902]: I1009 14:07:52.453915 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61e61af2-6a2e-4fe8-9e42-57d6303411f8-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "61e61af2-6a2e-4fe8-9e42-57d6303411f8" (UID: "61e61af2-6a2e-4fe8-9e42-57d6303411f8"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:07:52 crc kubenswrapper[4902]: I1009 14:07:52.489843 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61e61af2-6a2e-4fe8-9e42-57d6303411f8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "61e61af2-6a2e-4fe8-9e42-57d6303411f8" (UID: "61e61af2-6a2e-4fe8-9e42-57d6303411f8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:07:52 crc kubenswrapper[4902]: I1009 14:07:52.508234 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61e61af2-6a2e-4fe8-9e42-57d6303411f8-config-data" (OuterVolumeSpecName: "config-data") pod "61e61af2-6a2e-4fe8-9e42-57d6303411f8" (UID: "61e61af2-6a2e-4fe8-9e42-57d6303411f8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:07:52 crc kubenswrapper[4902]: I1009 14:07:52.550956 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hw22p\" (UniqueName: \"kubernetes.io/projected/61e61af2-6a2e-4fe8-9e42-57d6303411f8-kube-api-access-hw22p\") on node \"crc\" DevicePath \"\"" Oct 09 14:07:52 crc kubenswrapper[4902]: I1009 14:07:52.550994 4902 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/61e61af2-6a2e-4fe8-9e42-57d6303411f8-logs\") on node \"crc\" DevicePath \"\"" Oct 09 14:07:52 crc kubenswrapper[4902]: I1009 14:07:52.551006 4902 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/61e61af2-6a2e-4fe8-9e42-57d6303411f8-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 09 14:07:52 crc kubenswrapper[4902]: I1009 14:07:52.551015 4902 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61e61af2-6a2e-4fe8-9e42-57d6303411f8-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 14:07:52 crc kubenswrapper[4902]: I1009 14:07:52.551023 4902 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61e61af2-6a2e-4fe8-9e42-57d6303411f8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 14:07:52 crc kubenswrapper[4902]: I1009 14:07:52.670872 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"676825a2-3e5b-4137-b9fc-337425ff8d09","Type":"ContainerStarted","Data":"945c72aadf0515ec335a90cd3b55e1a48a3e446bc77a7f20e09a75d1c78ddaad"} Oct 09 14:07:52 crc kubenswrapper[4902]: I1009 14:07:52.674511 4902 generic.go:334] "Generic (PLEG): container finished" podID="61e61af2-6a2e-4fe8-9e42-57d6303411f8" containerID="da56b9079deb3a4c21e8b67e250c135ff222953a8fcab35235d50f2794250b34" exitCode=0 Oct 09 14:07:52 crc kubenswrapper[4902]: I1009 14:07:52.674588 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7479d98b6d-qm6k2" event={"ID":"61e61af2-6a2e-4fe8-9e42-57d6303411f8","Type":"ContainerDied","Data":"da56b9079deb3a4c21e8b67e250c135ff222953a8fcab35235d50f2794250b34"} Oct 09 14:07:52 crc kubenswrapper[4902]: I1009 14:07:52.674618 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7479d98b6d-qm6k2" event={"ID":"61e61af2-6a2e-4fe8-9e42-57d6303411f8","Type":"ContainerDied","Data":"d8c5690a0269bfabdbb1ea9608d3446374b0314787e97a1624041ba062f84084"} Oct 09 14:07:52 crc kubenswrapper[4902]: I1009 14:07:52.674639 4902 scope.go:117] "RemoveContainer" 
containerID="da56b9079deb3a4c21e8b67e250c135ff222953a8fcab35235d50f2794250b34" Oct 09 14:07:52 crc kubenswrapper[4902]: I1009 14:07:52.674791 4902 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7479d98b6d-qm6k2" Oct 09 14:07:52 crc kubenswrapper[4902]: I1009 14:07:52.682600 4902 generic.go:334] "Generic (PLEG): container finished" podID="2e924c89-5324-4bc9-9e4d-9005d6d1257d" containerID="dcf43d2ded8473b0968d7a945e64b57ce1fd50b9d7f3fe4750e499d4f452f884" exitCode=0 Oct 09 14:07:52 crc kubenswrapper[4902]: I1009 14:07:52.682643 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-b001-account-create-lmts2" event={"ID":"2e924c89-5324-4bc9-9e4d-9005d6d1257d","Type":"ContainerDied","Data":"dcf43d2ded8473b0968d7a945e64b57ce1fd50b9d7f3fe4750e499d4f452f884"} Oct 09 14:07:52 crc kubenswrapper[4902]: I1009 14:07:52.730111 4902 scope.go:117] "RemoveContainer" containerID="d1b6552b807f74b205c16b45dd0d7be0bea91431596d9fdbc3e58f1ff1d3155e" Oct 09 14:07:52 crc kubenswrapper[4902]: I1009 14:07:52.750567 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-7479d98b6d-qm6k2"] Oct 09 14:07:52 crc kubenswrapper[4902]: I1009 14:07:52.761309 4902 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-7479d98b6d-qm6k2"] Oct 09 14:07:52 crc kubenswrapper[4902]: I1009 14:07:52.769755 4902 scope.go:117] "RemoveContainer" containerID="da56b9079deb3a4c21e8b67e250c135ff222953a8fcab35235d50f2794250b34" Oct 09 14:07:52 crc kubenswrapper[4902]: E1009 14:07:52.770253 4902 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"da56b9079deb3a4c21e8b67e250c135ff222953a8fcab35235d50f2794250b34\": container with ID starting with da56b9079deb3a4c21e8b67e250c135ff222953a8fcab35235d50f2794250b34 not found: ID does not exist" containerID="da56b9079deb3a4c21e8b67e250c135ff222953a8fcab35235d50f2794250b34" Oct 09 14:07:52 crc kubenswrapper[4902]: I1009 14:07:52.770300 4902 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da56b9079deb3a4c21e8b67e250c135ff222953a8fcab35235d50f2794250b34"} err="failed to get container status \"da56b9079deb3a4c21e8b67e250c135ff222953a8fcab35235d50f2794250b34\": rpc error: code = NotFound desc = could not find container \"da56b9079deb3a4c21e8b67e250c135ff222953a8fcab35235d50f2794250b34\": container with ID starting with da56b9079deb3a4c21e8b67e250c135ff222953a8fcab35235d50f2794250b34 not found: ID does not exist" Oct 09 14:07:52 crc kubenswrapper[4902]: I1009 14:07:52.770332 4902 scope.go:117] "RemoveContainer" containerID="d1b6552b807f74b205c16b45dd0d7be0bea91431596d9fdbc3e58f1ff1d3155e" Oct 09 14:07:52 crc kubenswrapper[4902]: E1009 14:07:52.775778 4902 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d1b6552b807f74b205c16b45dd0d7be0bea91431596d9fdbc3e58f1ff1d3155e\": container with ID starting with d1b6552b807f74b205c16b45dd0d7be0bea91431596d9fdbc3e58f1ff1d3155e not found: ID does not exist" containerID="d1b6552b807f74b205c16b45dd0d7be0bea91431596d9fdbc3e58f1ff1d3155e" Oct 09 14:07:52 crc kubenswrapper[4902]: I1009 14:07:52.776054 4902 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d1b6552b807f74b205c16b45dd0d7be0bea91431596d9fdbc3e58f1ff1d3155e"} err="failed to get container status 
\"d1b6552b807f74b205c16b45dd0d7be0bea91431596d9fdbc3e58f1ff1d3155e\": rpc error: code = NotFound desc = could not find container \"d1b6552b807f74b205c16b45dd0d7be0bea91431596d9fdbc3e58f1ff1d3155e\": container with ID starting with d1b6552b807f74b205c16b45dd0d7be0bea91431596d9fdbc3e58f1ff1d3155e not found: ID does not exist" Oct 09 14:07:53 crc kubenswrapper[4902]: I1009 14:07:53.545731 4902 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="61e61af2-6a2e-4fe8-9e42-57d6303411f8" path="/var/lib/kubelet/pods/61e61af2-6a2e-4fe8-9e42-57d6303411f8/volumes" Oct 09 14:07:53 crc kubenswrapper[4902]: I1009 14:07:53.679718 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Oct 09 14:07:53 crc kubenswrapper[4902]: I1009 14:07:53.752066 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"676825a2-3e5b-4137-b9fc-337425ff8d09","Type":"ContainerStarted","Data":"d3f6079a10e5218a4ade04f6cf254d447b6ef15b9ecf1d4de045b327f489c560"} Oct 09 14:07:53 crc kubenswrapper[4902]: I1009 14:07:53.752149 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"676825a2-3e5b-4137-b9fc-337425ff8d09","Type":"ContainerStarted","Data":"085952635a65c1be7042cb06241ca1b33cd9e7657fa382906ecff76cc8e34dd0"} Oct 09 14:07:53 crc kubenswrapper[4902]: I1009 14:07:53.799459 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.799437166 podStartE2EDuration="3.799437166s" podCreationTimestamp="2025-10-09 14:07:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 14:07:53.779525014 +0000 UTC m=+1020.977384088" watchObservedRunningTime="2025-10-09 14:07:53.799437166 +0000 UTC m=+1020.997296240" Oct 09 14:07:54 crc kubenswrapper[4902]: I1009 14:07:54.303065 4902 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-b001-account-create-lmts2" Oct 09 14:07:54 crc kubenswrapper[4902]: I1009 14:07:54.396158 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-khjk4\" (UniqueName: \"kubernetes.io/projected/2e924c89-5324-4bc9-9e4d-9005d6d1257d-kube-api-access-khjk4\") pod \"2e924c89-5324-4bc9-9e4d-9005d6d1257d\" (UID: \"2e924c89-5324-4bc9-9e4d-9005d6d1257d\") " Oct 09 14:07:54 crc kubenswrapper[4902]: I1009 14:07:54.401588 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e924c89-5324-4bc9-9e4d-9005d6d1257d-kube-api-access-khjk4" (OuterVolumeSpecName: "kube-api-access-khjk4") pod "2e924c89-5324-4bc9-9e4d-9005d6d1257d" (UID: "2e924c89-5324-4bc9-9e4d-9005d6d1257d"). InnerVolumeSpecName "kube-api-access-khjk4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 14:07:54 crc kubenswrapper[4902]: I1009 14:07:54.499165 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-khjk4\" (UniqueName: \"kubernetes.io/projected/2e924c89-5324-4bc9-9e4d-9005d6d1257d-kube-api-access-khjk4\") on node \"crc\" DevicePath \"\"" Oct 09 14:07:54 crc kubenswrapper[4902]: I1009 14:07:54.779612 4902 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-b001-account-create-lmts2" Oct 09 14:07:54 crc kubenswrapper[4902]: I1009 14:07:54.779962 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-b001-account-create-lmts2" event={"ID":"2e924c89-5324-4bc9-9e4d-9005d6d1257d","Type":"ContainerDied","Data":"76ad981355de2f265405831f975fa57068fa0c061c6e6ddcd21bc57cbe242e92"} Oct 09 14:07:54 crc kubenswrapper[4902]: I1009 14:07:54.779994 4902 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="76ad981355de2f265405831f975fa57068fa0c061c6e6ddcd21bc57cbe242e92" Oct 09 14:07:55 crc kubenswrapper[4902]: I1009 14:07:55.015234 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 09 14:07:55 crc kubenswrapper[4902]: I1009 14:07:55.015539 4902 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5a579122-9bea-47ee-aeb5-8e08177b4f70" containerName="ceilometer-central-agent" containerID="cri-o://28ec4b4c503df200a179d5572696eef453ed1285e4b9c09029803be6f285ab36" gracePeriod=30 Oct 09 14:07:55 crc kubenswrapper[4902]: I1009 14:07:55.016175 4902 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5a579122-9bea-47ee-aeb5-8e08177b4f70" containerName="proxy-httpd" containerID="cri-o://795ea1101233e10ee50699498932c4afde2854ac0fcbcdc1d06658d4f00dda3b" gracePeriod=30 Oct 09 14:07:55 crc kubenswrapper[4902]: I1009 14:07:55.016227 4902 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5a579122-9bea-47ee-aeb5-8e08177b4f70" containerName="sg-core" containerID="cri-o://6ed8f0646869dbe8bf1004e112083930c488fbe20b3119a300c1bc66a8b53f0a" gracePeriod=30 Oct 09 14:07:55 crc kubenswrapper[4902]: I1009 14:07:55.016267 4902 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5a579122-9bea-47ee-aeb5-8e08177b4f70" containerName="ceilometer-notification-agent" containerID="cri-o://7d306b7d11b30a0734f0bd03c16c987b6e56780bb4fe088be3d12ee5fc69dbf5" gracePeriod=30 Oct 09 14:07:55 crc kubenswrapper[4902]: I1009 14:07:55.018982 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Oct 09 14:07:55 crc kubenswrapper[4902]: E1009 14:07:55.505281 4902 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5a579122_9bea_47ee_aeb5_8e08177b4f70.slice/crio-conmon-28ec4b4c503df200a179d5572696eef453ed1285e4b9c09029803be6f285ab36.scope\": RecentStats: unable to find data in memory cache]" Oct 09 14:07:55 crc kubenswrapper[4902]: I1009 14:07:55.799128 4902 generic.go:334] "Generic (PLEG): container finished" podID="5a579122-9bea-47ee-aeb5-8e08177b4f70" containerID="795ea1101233e10ee50699498932c4afde2854ac0fcbcdc1d06658d4f00dda3b" exitCode=0 Oct 09 14:07:55 crc kubenswrapper[4902]: I1009 14:07:55.799379 4902 generic.go:334] "Generic (PLEG): container finished" podID="5a579122-9bea-47ee-aeb5-8e08177b4f70" containerID="6ed8f0646869dbe8bf1004e112083930c488fbe20b3119a300c1bc66a8b53f0a" exitCode=2 Oct 09 14:07:55 crc kubenswrapper[4902]: I1009 14:07:55.799391 4902 generic.go:334] "Generic (PLEG): container finished" podID="5a579122-9bea-47ee-aeb5-8e08177b4f70" containerID="28ec4b4c503df200a179d5572696eef453ed1285e4b9c09029803be6f285ab36" exitCode=0 Oct 09 14:07:55 crc 
kubenswrapper[4902]: I1009 14:07:55.799221 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5a579122-9bea-47ee-aeb5-8e08177b4f70","Type":"ContainerDied","Data":"795ea1101233e10ee50699498932c4afde2854ac0fcbcdc1d06658d4f00dda3b"} Oct 09 14:07:55 crc kubenswrapper[4902]: I1009 14:07:55.799448 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5a579122-9bea-47ee-aeb5-8e08177b4f70","Type":"ContainerDied","Data":"6ed8f0646869dbe8bf1004e112083930c488fbe20b3119a300c1bc66a8b53f0a"} Oct 09 14:07:55 crc kubenswrapper[4902]: I1009 14:07:55.799462 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5a579122-9bea-47ee-aeb5-8e08177b4f70","Type":"ContainerDied","Data":"28ec4b4c503df200a179d5572696eef453ed1285e4b9c09029803be6f285ab36"} Oct 09 14:07:56 crc kubenswrapper[4902]: I1009 14:07:56.138574 4902 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Oct 09 14:07:56 crc kubenswrapper[4902]: I1009 14:07:56.813781 4902 generic.go:334] "Generic (PLEG): container finished" podID="5a579122-9bea-47ee-aeb5-8e08177b4f70" containerID="7d306b7d11b30a0734f0bd03c16c987b6e56780bb4fe088be3d12ee5fc69dbf5" exitCode=0 Oct 09 14:07:56 crc kubenswrapper[4902]: I1009 14:07:56.813835 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5a579122-9bea-47ee-aeb5-8e08177b4f70","Type":"ContainerDied","Data":"7d306b7d11b30a0734f0bd03c16c987b6e56780bb4fe088be3d12ee5fc69dbf5"} Oct 09 14:07:59 crc kubenswrapper[4902]: I1009 14:07:59.804139 4902 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 09 14:07:59 crc kubenswrapper[4902]: I1009 14:07:59.865552 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5a579122-9bea-47ee-aeb5-8e08177b4f70","Type":"ContainerDied","Data":"9e9e224dbd8df3257939dfb37599673cfa42a6781a1f4d513fce45fda81c8e58"} Oct 09 14:07:59 crc kubenswrapper[4902]: I1009 14:07:59.865620 4902 scope.go:117] "RemoveContainer" containerID="795ea1101233e10ee50699498932c4afde2854ac0fcbcdc1d06658d4f00dda3b" Oct 09 14:07:59 crc kubenswrapper[4902]: I1009 14:07:59.865803 4902 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 09 14:07:59 crc kubenswrapper[4902]: I1009 14:07:59.879982 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-bs8sb" event={"ID":"cbeddafe-9239-4c02-8f7d-f4a341ee97b1","Type":"ContainerStarted","Data":"4611e4a7d63368eb1993e398373d1774eaf3281d47466a275e578c08d76a0f05"} Oct 09 14:07:59 crc kubenswrapper[4902]: I1009 14:07:59.901503 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-bs8sb" podStartSLOduration=1.90860952 podStartE2EDuration="14.901481066s" podCreationTimestamp="2025-10-09 14:07:45 +0000 UTC" firstStartedPulling="2025-10-09 14:07:46.567951854 +0000 UTC m=+1013.765810928" lastFinishedPulling="2025-10-09 14:07:59.56082341 +0000 UTC m=+1026.758682474" observedRunningTime="2025-10-09 14:07:59.899040624 +0000 UTC m=+1027.096899698" watchObservedRunningTime="2025-10-09 14:07:59.901481066 +0000 UTC m=+1027.099340130" Oct 09 14:07:59 crc kubenswrapper[4902]: I1009 14:07:59.909997 4902 scope.go:117] "RemoveContainer" containerID="6ed8f0646869dbe8bf1004e112083930c488fbe20b3119a300c1bc66a8b53f0a" Oct 09 14:07:59 crc kubenswrapper[4902]: I1009 14:07:59.910489 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5a579122-9bea-47ee-aeb5-8e08177b4f70-scripts\") pod \"5a579122-9bea-47ee-aeb5-8e08177b4f70\" (UID: \"5a579122-9bea-47ee-aeb5-8e08177b4f70\") " Oct 09 14:07:59 crc kubenswrapper[4902]: I1009 14:07:59.910544 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a579122-9bea-47ee-aeb5-8e08177b4f70-config-data\") pod \"5a579122-9bea-47ee-aeb5-8e08177b4f70\" (UID: \"5a579122-9bea-47ee-aeb5-8e08177b4f70\") " Oct 09 14:07:59 crc kubenswrapper[4902]: I1009 14:07:59.910611 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5a579122-9bea-47ee-aeb5-8e08177b4f70-run-httpd\") pod \"5a579122-9bea-47ee-aeb5-8e08177b4f70\" (UID: \"5a579122-9bea-47ee-aeb5-8e08177b4f70\") " Oct 09 14:07:59 crc kubenswrapper[4902]: I1009 14:07:59.910716 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a579122-9bea-47ee-aeb5-8e08177b4f70-combined-ca-bundle\") pod \"5a579122-9bea-47ee-aeb5-8e08177b4f70\" (UID: \"5a579122-9bea-47ee-aeb5-8e08177b4f70\") " Oct 09 14:07:59 crc kubenswrapper[4902]: I1009 14:07:59.910842 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gpqwf\" (UniqueName: \"kubernetes.io/projected/5a579122-9bea-47ee-aeb5-8e08177b4f70-kube-api-access-gpqwf\") pod \"5a579122-9bea-47ee-aeb5-8e08177b4f70\" (UID: \"5a579122-9bea-47ee-aeb5-8e08177b4f70\") " Oct 09 14:07:59 crc kubenswrapper[4902]: I1009 14:07:59.910862 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5a579122-9bea-47ee-aeb5-8e08177b4f70-sg-core-conf-yaml\") pod \"5a579122-9bea-47ee-aeb5-8e08177b4f70\" (UID: \"5a579122-9bea-47ee-aeb5-8e08177b4f70\") " Oct 09 14:07:59 crc kubenswrapper[4902]: I1009 14:07:59.910890 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5a579122-9bea-47ee-aeb5-8e08177b4f70-log-httpd\") pod 
\"5a579122-9bea-47ee-aeb5-8e08177b4f70\" (UID: \"5a579122-9bea-47ee-aeb5-8e08177b4f70\") " Oct 09 14:07:59 crc kubenswrapper[4902]: I1009 14:07:59.911704 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5a579122-9bea-47ee-aeb5-8e08177b4f70-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "5a579122-9bea-47ee-aeb5-8e08177b4f70" (UID: "5a579122-9bea-47ee-aeb5-8e08177b4f70"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 14:07:59 crc kubenswrapper[4902]: I1009 14:07:59.912345 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5a579122-9bea-47ee-aeb5-8e08177b4f70-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "5a579122-9bea-47ee-aeb5-8e08177b4f70" (UID: "5a579122-9bea-47ee-aeb5-8e08177b4f70"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 14:07:59 crc kubenswrapper[4902]: I1009 14:07:59.917491 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a579122-9bea-47ee-aeb5-8e08177b4f70-scripts" (OuterVolumeSpecName: "scripts") pod "5a579122-9bea-47ee-aeb5-8e08177b4f70" (UID: "5a579122-9bea-47ee-aeb5-8e08177b4f70"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:07:59 crc kubenswrapper[4902]: I1009 14:07:59.919003 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a579122-9bea-47ee-aeb5-8e08177b4f70-kube-api-access-gpqwf" (OuterVolumeSpecName: "kube-api-access-gpqwf") pod "5a579122-9bea-47ee-aeb5-8e08177b4f70" (UID: "5a579122-9bea-47ee-aeb5-8e08177b4f70"). InnerVolumeSpecName "kube-api-access-gpqwf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 14:07:59 crc kubenswrapper[4902]: I1009 14:07:59.938802 4902 scope.go:117] "RemoveContainer" containerID="7d306b7d11b30a0734f0bd03c16c987b6e56780bb4fe088be3d12ee5fc69dbf5" Oct 09 14:07:59 crc kubenswrapper[4902]: I1009 14:07:59.947571 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a579122-9bea-47ee-aeb5-8e08177b4f70-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "5a579122-9bea-47ee-aeb5-8e08177b4f70" (UID: "5a579122-9bea-47ee-aeb5-8e08177b4f70"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:07:59 crc kubenswrapper[4902]: I1009 14:07:59.960781 4902 scope.go:117] "RemoveContainer" containerID="28ec4b4c503df200a179d5572696eef453ed1285e4b9c09029803be6f285ab36" Oct 09 14:07:59 crc kubenswrapper[4902]: I1009 14:07:59.996950 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a579122-9bea-47ee-aeb5-8e08177b4f70-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5a579122-9bea-47ee-aeb5-8e08177b4f70" (UID: "5a579122-9bea-47ee-aeb5-8e08177b4f70"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:08:00 crc kubenswrapper[4902]: I1009 14:08:00.013471 4902 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a579122-9bea-47ee-aeb5-8e08177b4f70-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 14:08:00 crc kubenswrapper[4902]: I1009 14:08:00.013508 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gpqwf\" (UniqueName: \"kubernetes.io/projected/5a579122-9bea-47ee-aeb5-8e08177b4f70-kube-api-access-gpqwf\") on node \"crc\" DevicePath \"\"" Oct 09 14:08:00 crc kubenswrapper[4902]: I1009 14:08:00.013523 4902 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5a579122-9bea-47ee-aeb5-8e08177b4f70-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 09 14:08:00 crc kubenswrapper[4902]: I1009 14:08:00.013535 4902 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5a579122-9bea-47ee-aeb5-8e08177b4f70-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 09 14:08:00 crc kubenswrapper[4902]: I1009 14:08:00.013546 4902 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5a579122-9bea-47ee-aeb5-8e08177b4f70-scripts\") on node \"crc\" DevicePath \"\"" Oct 09 14:08:00 crc kubenswrapper[4902]: I1009 14:08:00.013557 4902 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5a579122-9bea-47ee-aeb5-8e08177b4f70-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 09 14:08:00 crc kubenswrapper[4902]: I1009 14:08:00.028608 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a579122-9bea-47ee-aeb5-8e08177b4f70-config-data" (OuterVolumeSpecName: "config-data") pod "5a579122-9bea-47ee-aeb5-8e08177b4f70" (UID: "5a579122-9bea-47ee-aeb5-8e08177b4f70"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:08:00 crc kubenswrapper[4902]: I1009 14:08:00.115038 4902 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a579122-9bea-47ee-aeb5-8e08177b4f70-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 14:08:00 crc kubenswrapper[4902]: I1009 14:08:00.200681 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 09 14:08:00 crc kubenswrapper[4902]: I1009 14:08:00.208671 4902 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 09 14:08:00 crc kubenswrapper[4902]: I1009 14:08:00.221709 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 09 14:08:00 crc kubenswrapper[4902]: E1009 14:08:00.222077 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a579122-9bea-47ee-aeb5-8e08177b4f70" containerName="ceilometer-central-agent" Oct 09 14:08:00 crc kubenswrapper[4902]: I1009 14:08:00.222094 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a579122-9bea-47ee-aeb5-8e08177b4f70" containerName="ceilometer-central-agent" Oct 09 14:08:00 crc kubenswrapper[4902]: E1009 14:08:00.222110 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a579122-9bea-47ee-aeb5-8e08177b4f70" containerName="proxy-httpd" Oct 09 14:08:00 crc kubenswrapper[4902]: I1009 14:08:00.222118 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a579122-9bea-47ee-aeb5-8e08177b4f70" containerName="proxy-httpd" Oct 09 14:08:00 crc kubenswrapper[4902]: E1009 14:08:00.222157 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a579122-9bea-47ee-aeb5-8e08177b4f70" containerName="ceilometer-notification-agent" Oct 09 14:08:00 crc kubenswrapper[4902]: I1009 14:08:00.222163 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a579122-9bea-47ee-aeb5-8e08177b4f70" containerName="ceilometer-notification-agent" Oct 09 14:08:00 crc kubenswrapper[4902]: E1009 14:08:00.222189 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e924c89-5324-4bc9-9e4d-9005d6d1257d" containerName="mariadb-account-create" Oct 09 14:08:00 crc kubenswrapper[4902]: I1009 14:08:00.222197 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e924c89-5324-4bc9-9e4d-9005d6d1257d" containerName="mariadb-account-create" Oct 09 14:08:00 crc kubenswrapper[4902]: E1009 14:08:00.222211 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61e61af2-6a2e-4fe8-9e42-57d6303411f8" containerName="barbican-api-log" Oct 09 14:08:00 crc kubenswrapper[4902]: I1009 14:08:00.222218 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="61e61af2-6a2e-4fe8-9e42-57d6303411f8" containerName="barbican-api-log" Oct 09 14:08:00 crc kubenswrapper[4902]: E1009 14:08:00.222232 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a579122-9bea-47ee-aeb5-8e08177b4f70" containerName="sg-core" Oct 09 14:08:00 crc kubenswrapper[4902]: I1009 14:08:00.222239 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a579122-9bea-47ee-aeb5-8e08177b4f70" containerName="sg-core" Oct 09 14:08:00 crc kubenswrapper[4902]: E1009 14:08:00.222256 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61e61af2-6a2e-4fe8-9e42-57d6303411f8" containerName="barbican-api" Oct 09 14:08:00 crc kubenswrapper[4902]: I1009 14:08:00.222263 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="61e61af2-6a2e-4fe8-9e42-57d6303411f8" 
containerName="barbican-api" Oct 09 14:08:00 crc kubenswrapper[4902]: I1009 14:08:00.222466 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="61e61af2-6a2e-4fe8-9e42-57d6303411f8" containerName="barbican-api" Oct 09 14:08:00 crc kubenswrapper[4902]: I1009 14:08:00.222486 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a579122-9bea-47ee-aeb5-8e08177b4f70" containerName="proxy-httpd" Oct 09 14:08:00 crc kubenswrapper[4902]: I1009 14:08:00.222498 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a579122-9bea-47ee-aeb5-8e08177b4f70" containerName="ceilometer-notification-agent" Oct 09 14:08:00 crc kubenswrapper[4902]: I1009 14:08:00.222509 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a579122-9bea-47ee-aeb5-8e08177b4f70" containerName="sg-core" Oct 09 14:08:00 crc kubenswrapper[4902]: I1009 14:08:00.222518 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e924c89-5324-4bc9-9e4d-9005d6d1257d" containerName="mariadb-account-create" Oct 09 14:08:00 crc kubenswrapper[4902]: I1009 14:08:00.222531 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="61e61af2-6a2e-4fe8-9e42-57d6303411f8" containerName="barbican-api-log" Oct 09 14:08:00 crc kubenswrapper[4902]: I1009 14:08:00.222542 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a579122-9bea-47ee-aeb5-8e08177b4f70" containerName="ceilometer-central-agent" Oct 09 14:08:00 crc kubenswrapper[4902]: I1009 14:08:00.229433 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 09 14:08:00 crc kubenswrapper[4902]: I1009 14:08:00.240242 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 09 14:08:00 crc kubenswrapper[4902]: I1009 14:08:00.240649 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 09 14:08:00 crc kubenswrapper[4902]: I1009 14:08:00.271641 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 09 14:08:00 crc kubenswrapper[4902]: I1009 14:08:00.319159 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/890a478d-3570-48dc-b1ee-60f64b590d8b-config-data\") pod \"ceilometer-0\" (UID: \"890a478d-3570-48dc-b1ee-60f64b590d8b\") " pod="openstack/ceilometer-0" Oct 09 14:08:00 crc kubenswrapper[4902]: I1009 14:08:00.319235 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/890a478d-3570-48dc-b1ee-60f64b590d8b-scripts\") pod \"ceilometer-0\" (UID: \"890a478d-3570-48dc-b1ee-60f64b590d8b\") " pod="openstack/ceilometer-0" Oct 09 14:08:00 crc kubenswrapper[4902]: I1009 14:08:00.319276 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/890a478d-3570-48dc-b1ee-60f64b590d8b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"890a478d-3570-48dc-b1ee-60f64b590d8b\") " pod="openstack/ceilometer-0" Oct 09 14:08:00 crc kubenswrapper[4902]: I1009 14:08:00.319313 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r26lf\" (UniqueName: \"kubernetes.io/projected/890a478d-3570-48dc-b1ee-60f64b590d8b-kube-api-access-r26lf\") pod \"ceilometer-0\" (UID: 
\"890a478d-3570-48dc-b1ee-60f64b590d8b\") " pod="openstack/ceilometer-0" Oct 09 14:08:00 crc kubenswrapper[4902]: I1009 14:08:00.319361 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/890a478d-3570-48dc-b1ee-60f64b590d8b-log-httpd\") pod \"ceilometer-0\" (UID: \"890a478d-3570-48dc-b1ee-60f64b590d8b\") " pod="openstack/ceilometer-0" Oct 09 14:08:00 crc kubenswrapper[4902]: I1009 14:08:00.319397 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/890a478d-3570-48dc-b1ee-60f64b590d8b-run-httpd\") pod \"ceilometer-0\" (UID: \"890a478d-3570-48dc-b1ee-60f64b590d8b\") " pod="openstack/ceilometer-0" Oct 09 14:08:00 crc kubenswrapper[4902]: I1009 14:08:00.319486 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/890a478d-3570-48dc-b1ee-60f64b590d8b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"890a478d-3570-48dc-b1ee-60f64b590d8b\") " pod="openstack/ceilometer-0" Oct 09 14:08:00 crc kubenswrapper[4902]: I1009 14:08:00.421508 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/890a478d-3570-48dc-b1ee-60f64b590d8b-config-data\") pod \"ceilometer-0\" (UID: \"890a478d-3570-48dc-b1ee-60f64b590d8b\") " pod="openstack/ceilometer-0" Oct 09 14:08:00 crc kubenswrapper[4902]: I1009 14:08:00.421581 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/890a478d-3570-48dc-b1ee-60f64b590d8b-scripts\") pod \"ceilometer-0\" (UID: \"890a478d-3570-48dc-b1ee-60f64b590d8b\") " pod="openstack/ceilometer-0" Oct 09 14:08:00 crc kubenswrapper[4902]: I1009 14:08:00.421616 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/890a478d-3570-48dc-b1ee-60f64b590d8b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"890a478d-3570-48dc-b1ee-60f64b590d8b\") " pod="openstack/ceilometer-0" Oct 09 14:08:00 crc kubenswrapper[4902]: I1009 14:08:00.421642 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r26lf\" (UniqueName: \"kubernetes.io/projected/890a478d-3570-48dc-b1ee-60f64b590d8b-kube-api-access-r26lf\") pod \"ceilometer-0\" (UID: \"890a478d-3570-48dc-b1ee-60f64b590d8b\") " pod="openstack/ceilometer-0" Oct 09 14:08:00 crc kubenswrapper[4902]: I1009 14:08:00.421693 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/890a478d-3570-48dc-b1ee-60f64b590d8b-log-httpd\") pod \"ceilometer-0\" (UID: \"890a478d-3570-48dc-b1ee-60f64b590d8b\") " pod="openstack/ceilometer-0" Oct 09 14:08:00 crc kubenswrapper[4902]: I1009 14:08:00.421727 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/890a478d-3570-48dc-b1ee-60f64b590d8b-run-httpd\") pod \"ceilometer-0\" (UID: \"890a478d-3570-48dc-b1ee-60f64b590d8b\") " pod="openstack/ceilometer-0" Oct 09 14:08:00 crc kubenswrapper[4902]: I1009 14:08:00.421793 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/890a478d-3570-48dc-b1ee-60f64b590d8b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"890a478d-3570-48dc-b1ee-60f64b590d8b\") " pod="openstack/ceilometer-0" Oct 09 14:08:00 crc kubenswrapper[4902]: I1009 14:08:00.422370 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/890a478d-3570-48dc-b1ee-60f64b590d8b-log-httpd\") pod \"ceilometer-0\" (UID: \"890a478d-3570-48dc-b1ee-60f64b590d8b\") " pod="openstack/ceilometer-0" Oct 09 14:08:00 crc kubenswrapper[4902]: I1009 14:08:00.422615 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/890a478d-3570-48dc-b1ee-60f64b590d8b-run-httpd\") pod \"ceilometer-0\" (UID: \"890a478d-3570-48dc-b1ee-60f64b590d8b\") " pod="openstack/ceilometer-0" Oct 09 14:08:00 crc kubenswrapper[4902]: I1009 14:08:00.426436 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/890a478d-3570-48dc-b1ee-60f64b590d8b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"890a478d-3570-48dc-b1ee-60f64b590d8b\") " pod="openstack/ceilometer-0" Oct 09 14:08:00 crc kubenswrapper[4902]: I1009 14:08:00.428482 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/890a478d-3570-48dc-b1ee-60f64b590d8b-config-data\") pod \"ceilometer-0\" (UID: \"890a478d-3570-48dc-b1ee-60f64b590d8b\") " pod="openstack/ceilometer-0" Oct 09 14:08:00 crc kubenswrapper[4902]: I1009 14:08:00.432720 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/890a478d-3570-48dc-b1ee-60f64b590d8b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"890a478d-3570-48dc-b1ee-60f64b590d8b\") " pod="openstack/ceilometer-0" Oct 09 14:08:00 crc kubenswrapper[4902]: I1009 14:08:00.432714 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/890a478d-3570-48dc-b1ee-60f64b590d8b-scripts\") pod \"ceilometer-0\" (UID: \"890a478d-3570-48dc-b1ee-60f64b590d8b\") " pod="openstack/ceilometer-0" Oct 09 14:08:00 crc kubenswrapper[4902]: I1009 14:08:00.444939 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r26lf\" (UniqueName: \"kubernetes.io/projected/890a478d-3570-48dc-b1ee-60f64b590d8b-kube-api-access-r26lf\") pod \"ceilometer-0\" (UID: \"890a478d-3570-48dc-b1ee-60f64b590d8b\") " pod="openstack/ceilometer-0" Oct 09 14:08:00 crc kubenswrapper[4902]: I1009 14:08:00.562169 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 09 14:08:01 crc kubenswrapper[4902]: I1009 14:08:01.042144 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 09 14:08:01 crc kubenswrapper[4902]: W1009 14:08:01.045133 4902 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod890a478d_3570_48dc_b1ee_60f64b590d8b.slice/crio-cad7bec08c36c2bf325bcd8304fa67404536b42d0a0681f579eae3764f1dbcd8 WatchSource:0}: Error finding container cad7bec08c36c2bf325bcd8304fa67404536b42d0a0681f579eae3764f1dbcd8: Status 404 returned error can't find the container with id cad7bec08c36c2bf325bcd8304fa67404536b42d0a0681f579eae3764f1dbcd8 Oct 09 14:08:01 crc kubenswrapper[4902]: I1009 14:08:01.369893 4902 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Oct 09 14:08:01 crc kubenswrapper[4902]: I1009 14:08:01.523103 4902 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5a579122-9bea-47ee-aeb5-8e08177b4f70" path="/var/lib/kubelet/pods/5a579122-9bea-47ee-aeb5-8e08177b4f70/volumes" Oct 09 14:08:01 crc kubenswrapper[4902]: I1009 14:08:01.900254 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"890a478d-3570-48dc-b1ee-60f64b590d8b","Type":"ContainerStarted","Data":"cad7bec08c36c2bf325bcd8304fa67404536b42d0a0681f579eae3764f1dbcd8"} Oct 09 14:08:02 crc kubenswrapper[4902]: I1009 14:08:02.471731 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 09 14:08:02 crc kubenswrapper[4902]: I1009 14:08:02.472231 4902 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="9dc74bb1-5a59-4876-83ff-09c1a2cb6042" containerName="kube-state-metrics" containerID="cri-o://3d9139f332643896c9a5dfb4564992a3f14e8a4a98d85e66efe7c26be95cfe1f" gracePeriod=30 Oct 09 14:08:02 crc kubenswrapper[4902]: I1009 14:08:02.913038 4902 generic.go:334] "Generic (PLEG): container finished" podID="9dc74bb1-5a59-4876-83ff-09c1a2cb6042" containerID="3d9139f332643896c9a5dfb4564992a3f14e8a4a98d85e66efe7c26be95cfe1f" exitCode=2 Oct 09 14:08:02 crc kubenswrapper[4902]: I1009 14:08:02.913232 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"9dc74bb1-5a59-4876-83ff-09c1a2cb6042","Type":"ContainerDied","Data":"3d9139f332643896c9a5dfb4564992a3f14e8a4a98d85e66efe7c26be95cfe1f"} Oct 09 14:08:02 crc kubenswrapper[4902]: I1009 14:08:02.913391 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"9dc74bb1-5a59-4876-83ff-09c1a2cb6042","Type":"ContainerDied","Data":"9b6277c3f0e3fe9b0e97ab81a0a7b2a936669693767fae17d4b50dc426089721"} Oct 09 14:08:02 crc kubenswrapper[4902]: I1009 14:08:02.913421 4902 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9b6277c3f0e3fe9b0e97ab81a0a7b2a936669693767fae17d4b50dc426089721" Oct 09 14:08:02 crc kubenswrapper[4902]: I1009 14:08:02.916994 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"890a478d-3570-48dc-b1ee-60f64b590d8b","Type":"ContainerStarted","Data":"cddd78a158b98ddf1f8f2689b6ef9cfe249095014b92df2f42af9e4854150458"} Oct 09 14:08:02 crc kubenswrapper[4902]: I1009 14:08:02.917040 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"890a478d-3570-48dc-b1ee-60f64b590d8b","Type":"ContainerStarted","Data":"8a0daa532a5a126fdad39197e5c05a2d0425c24bc344ea106d36ca8c8fd9e25c"} Oct 09 14:08:02 crc kubenswrapper[4902]: I1009 14:08:02.924967 4902 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 09 14:08:03 crc kubenswrapper[4902]: I1009 14:08:03.067752 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2rjwv\" (UniqueName: \"kubernetes.io/projected/9dc74bb1-5a59-4876-83ff-09c1a2cb6042-kube-api-access-2rjwv\") pod \"9dc74bb1-5a59-4876-83ff-09c1a2cb6042\" (UID: \"9dc74bb1-5a59-4876-83ff-09c1a2cb6042\") " Oct 09 14:08:03 crc kubenswrapper[4902]: I1009 14:08:03.073827 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9dc74bb1-5a59-4876-83ff-09c1a2cb6042-kube-api-access-2rjwv" (OuterVolumeSpecName: "kube-api-access-2rjwv") pod "9dc74bb1-5a59-4876-83ff-09c1a2cb6042" (UID: "9dc74bb1-5a59-4876-83ff-09c1a2cb6042"). InnerVolumeSpecName "kube-api-access-2rjwv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 14:08:03 crc kubenswrapper[4902]: I1009 14:08:03.170762 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2rjwv\" (UniqueName: \"kubernetes.io/projected/9dc74bb1-5a59-4876-83ff-09c1a2cb6042-kube-api-access-2rjwv\") on node \"crc\" DevicePath \"\"" Oct 09 14:08:03 crc kubenswrapper[4902]: I1009 14:08:03.925633 4902 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 09 14:08:03 crc kubenswrapper[4902]: I1009 14:08:03.997132 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 09 14:08:04 crc kubenswrapper[4902]: I1009 14:08:04.009995 4902 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 09 14:08:04 crc kubenswrapper[4902]: I1009 14:08:04.020803 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Oct 09 14:08:04 crc kubenswrapper[4902]: E1009 14:08:04.021326 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9dc74bb1-5a59-4876-83ff-09c1a2cb6042" containerName="kube-state-metrics" Oct 09 14:08:04 crc kubenswrapper[4902]: I1009 14:08:04.021374 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="9dc74bb1-5a59-4876-83ff-09c1a2cb6042" containerName="kube-state-metrics" Oct 09 14:08:04 crc kubenswrapper[4902]: I1009 14:08:04.021936 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="9dc74bb1-5a59-4876-83ff-09c1a2cb6042" containerName="kube-state-metrics" Oct 09 14:08:04 crc kubenswrapper[4902]: I1009 14:08:04.022824 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 09 14:08:04 crc kubenswrapper[4902]: I1009 14:08:04.025686 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Oct 09 14:08:04 crc kubenswrapper[4902]: I1009 14:08:04.025835 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Oct 09 14:08:04 crc kubenswrapper[4902]: I1009 14:08:04.030021 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 09 14:08:04 crc kubenswrapper[4902]: I1009 14:08:04.193261 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zfbvq\" (UniqueName: \"kubernetes.io/projected/dde7a697-a373-4e6a-8535-d7768f569e18-kube-api-access-zfbvq\") pod \"kube-state-metrics-0\" (UID: \"dde7a697-a373-4e6a-8535-d7768f569e18\") " pod="openstack/kube-state-metrics-0" Oct 09 14:08:04 crc kubenswrapper[4902]: I1009 14:08:04.193466 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/dde7a697-a373-4e6a-8535-d7768f569e18-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"dde7a697-a373-4e6a-8535-d7768f569e18\") " pod="openstack/kube-state-metrics-0" Oct 09 14:08:04 crc kubenswrapper[4902]: I1009 14:08:04.193541 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dde7a697-a373-4e6a-8535-d7768f569e18-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"dde7a697-a373-4e6a-8535-d7768f569e18\") " pod="openstack/kube-state-metrics-0" Oct 09 14:08:04 crc kubenswrapper[4902]: I1009 14:08:04.193582 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/dde7a697-a373-4e6a-8535-d7768f569e18-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"dde7a697-a373-4e6a-8535-d7768f569e18\") " pod="openstack/kube-state-metrics-0" Oct 09 14:08:04 crc kubenswrapper[4902]: I1009 14:08:04.295492 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/dde7a697-a373-4e6a-8535-d7768f569e18-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"dde7a697-a373-4e6a-8535-d7768f569e18\") " pod="openstack/kube-state-metrics-0" Oct 09 14:08:04 crc kubenswrapper[4902]: I1009 14:08:04.295552 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dde7a697-a373-4e6a-8535-d7768f569e18-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"dde7a697-a373-4e6a-8535-d7768f569e18\") " pod="openstack/kube-state-metrics-0" Oct 09 14:08:04 crc kubenswrapper[4902]: I1009 14:08:04.295580 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/dde7a697-a373-4e6a-8535-d7768f569e18-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"dde7a697-a373-4e6a-8535-d7768f569e18\") " pod="openstack/kube-state-metrics-0" Oct 09 14:08:04 crc kubenswrapper[4902]: I1009 14:08:04.295668 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zfbvq\" 
(UniqueName: \"kubernetes.io/projected/dde7a697-a373-4e6a-8535-d7768f569e18-kube-api-access-zfbvq\") pod \"kube-state-metrics-0\" (UID: \"dde7a697-a373-4e6a-8535-d7768f569e18\") " pod="openstack/kube-state-metrics-0" Oct 09 14:08:04 crc kubenswrapper[4902]: I1009 14:08:04.301919 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/dde7a697-a373-4e6a-8535-d7768f569e18-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"dde7a697-a373-4e6a-8535-d7768f569e18\") " pod="openstack/kube-state-metrics-0" Oct 09 14:08:04 crc kubenswrapper[4902]: I1009 14:08:04.304140 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dde7a697-a373-4e6a-8535-d7768f569e18-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"dde7a697-a373-4e6a-8535-d7768f569e18\") " pod="openstack/kube-state-metrics-0" Oct 09 14:08:04 crc kubenswrapper[4902]: I1009 14:08:04.307630 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/dde7a697-a373-4e6a-8535-d7768f569e18-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"dde7a697-a373-4e6a-8535-d7768f569e18\") " pod="openstack/kube-state-metrics-0" Oct 09 14:08:04 crc kubenswrapper[4902]: I1009 14:08:04.324807 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zfbvq\" (UniqueName: \"kubernetes.io/projected/dde7a697-a373-4e6a-8535-d7768f569e18-kube-api-access-zfbvq\") pod \"kube-state-metrics-0\" (UID: \"dde7a697-a373-4e6a-8535-d7768f569e18\") " pod="openstack/kube-state-metrics-0" Oct 09 14:08:04 crc kubenswrapper[4902]: I1009 14:08:04.344954 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 09 14:08:04 crc kubenswrapper[4902]: I1009 14:08:04.608924 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 09 14:08:05 crc kubenswrapper[4902]: I1009 14:08:05.530057 4902 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9dc74bb1-5a59-4876-83ff-09c1a2cb6042" path="/var/lib/kubelet/pods/9dc74bb1-5a59-4876-83ff-09c1a2cb6042/volumes" Oct 09 14:08:05 crc kubenswrapper[4902]: I1009 14:08:05.594458 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 09 14:08:05 crc kubenswrapper[4902]: W1009 14:08:05.598113 4902 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddde7a697_a373_4e6a_8535_d7768f569e18.slice/crio-d1d72f379b44759455f5682f7986c6e5f3f66f86d46f72f8f574de9ada1a1ca3 WatchSource:0}: Error finding container d1d72f379b44759455f5682f7986c6e5f3f66f86d46f72f8f574de9ada1a1ca3: Status 404 returned error can't find the container with id d1d72f379b44759455f5682f7986c6e5f3f66f86d46f72f8f574de9ada1a1ca3 Oct 09 14:08:05 crc kubenswrapper[4902]: I1009 14:08:05.945830 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"890a478d-3570-48dc-b1ee-60f64b590d8b","Type":"ContainerStarted","Data":"11a4fe8f6f970158afeb48e05a9dfe9fb16630bf040486123e20fe38dc9cd951"} Oct 09 14:08:05 crc kubenswrapper[4902]: I1009 14:08:05.947614 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"dde7a697-a373-4e6a-8535-d7768f569e18","Type":"ContainerStarted","Data":"d1d72f379b44759455f5682f7986c6e5f3f66f86d46f72f8f574de9ada1a1ca3"} Oct 09 14:08:06 crc kubenswrapper[4902]: I1009 14:08:06.976514 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"dde7a697-a373-4e6a-8535-d7768f569e18","Type":"ContainerStarted","Data":"21b675c0c2276709b77fcb4fbcce7350b1722f65b562cb18d2033d6c5b5238f8"} Oct 09 14:08:06 crc kubenswrapper[4902]: I1009 14:08:06.977018 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Oct 09 14:08:07 crc kubenswrapper[4902]: I1009 14:08:07.002211 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=3.602593073 podStartE2EDuration="4.002191931s" podCreationTimestamp="2025-10-09 14:08:03 +0000 UTC" firstStartedPulling="2025-10-09 14:08:05.603911937 +0000 UTC m=+1032.801771001" lastFinishedPulling="2025-10-09 14:08:06.003510795 +0000 UTC m=+1033.201369859" observedRunningTime="2025-10-09 14:08:06.995809641 +0000 UTC m=+1034.193668715" watchObservedRunningTime="2025-10-09 14:08:07.002191931 +0000 UTC m=+1034.200050995" Oct 09 14:08:07 crc kubenswrapper[4902]: I1009 14:08:07.987349 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"890a478d-3570-48dc-b1ee-60f64b590d8b","Type":"ContainerStarted","Data":"3e1f224c0fc1cf9c09b66b55ac2ac08e10867b6b2cc29871fbcdc2e0f5e0bcb8"} Oct 09 14:08:07 crc kubenswrapper[4902]: I1009 14:08:07.987616 4902 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="890a478d-3570-48dc-b1ee-60f64b590d8b" containerName="ceilometer-central-agent" containerID="cri-o://8a0daa532a5a126fdad39197e5c05a2d0425c24bc344ea106d36ca8c8fd9e25c" gracePeriod=30 Oct 09 14:08:07 crc kubenswrapper[4902]: 
I1009 14:08:07.987629 4902 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="890a478d-3570-48dc-b1ee-60f64b590d8b" containerName="proxy-httpd" containerID="cri-o://3e1f224c0fc1cf9c09b66b55ac2ac08e10867b6b2cc29871fbcdc2e0f5e0bcb8" gracePeriod=30 Oct 09 14:08:07 crc kubenswrapper[4902]: I1009 14:08:07.987669 4902 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="890a478d-3570-48dc-b1ee-60f64b590d8b" containerName="ceilometer-notification-agent" containerID="cri-o://cddd78a158b98ddf1f8f2689b6ef9cfe249095014b92df2f42af9e4854150458" gracePeriod=30 Oct 09 14:08:07 crc kubenswrapper[4902]: I1009 14:08:07.987686 4902 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="890a478d-3570-48dc-b1ee-60f64b590d8b" containerName="sg-core" containerID="cri-o://11a4fe8f6f970158afeb48e05a9dfe9fb16630bf040486123e20fe38dc9cd951" gracePeriod=30 Oct 09 14:08:08 crc kubenswrapper[4902]: I1009 14:08:08.019073 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.028517552 podStartE2EDuration="8.019053697s" podCreationTimestamp="2025-10-09 14:08:00 +0000 UTC" firstStartedPulling="2025-10-09 14:08:01.047525372 +0000 UTC m=+1028.245384436" lastFinishedPulling="2025-10-09 14:08:07.038061517 +0000 UTC m=+1034.235920581" observedRunningTime="2025-10-09 14:08:08.013713838 +0000 UTC m=+1035.211572912" watchObservedRunningTime="2025-10-09 14:08:08.019053697 +0000 UTC m=+1035.216912761" Oct 09 14:08:08 crc kubenswrapper[4902]: I1009 14:08:08.998634 4902 generic.go:334] "Generic (PLEG): container finished" podID="890a478d-3570-48dc-b1ee-60f64b590d8b" containerID="3e1f224c0fc1cf9c09b66b55ac2ac08e10867b6b2cc29871fbcdc2e0f5e0bcb8" exitCode=0 Oct 09 14:08:08 crc kubenswrapper[4902]: I1009 14:08:08.998961 4902 generic.go:334] "Generic (PLEG): container finished" podID="890a478d-3570-48dc-b1ee-60f64b590d8b" containerID="11a4fe8f6f970158afeb48e05a9dfe9fb16630bf040486123e20fe38dc9cd951" exitCode=2 Oct 09 14:08:08 crc kubenswrapper[4902]: I1009 14:08:08.998709 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"890a478d-3570-48dc-b1ee-60f64b590d8b","Type":"ContainerDied","Data":"3e1f224c0fc1cf9c09b66b55ac2ac08e10867b6b2cc29871fbcdc2e0f5e0bcb8"} Oct 09 14:08:08 crc kubenswrapper[4902]: I1009 14:08:08.999016 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"890a478d-3570-48dc-b1ee-60f64b590d8b","Type":"ContainerDied","Data":"11a4fe8f6f970158afeb48e05a9dfe9fb16630bf040486123e20fe38dc9cd951"} Oct 09 14:08:08 crc kubenswrapper[4902]: I1009 14:08:08.999034 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"890a478d-3570-48dc-b1ee-60f64b590d8b","Type":"ContainerDied","Data":"cddd78a158b98ddf1f8f2689b6ef9cfe249095014b92df2f42af9e4854150458"} Oct 09 14:08:08 crc kubenswrapper[4902]: I1009 14:08:08.998977 4902 generic.go:334] "Generic (PLEG): container finished" podID="890a478d-3570-48dc-b1ee-60f64b590d8b" containerID="cddd78a158b98ddf1f8f2689b6ef9cfe249095014b92df2f42af9e4854150458" exitCode=0 Oct 09 14:08:10 crc kubenswrapper[4902]: I1009 14:08:10.012434 4902 generic.go:334] "Generic (PLEG): container finished" podID="890a478d-3570-48dc-b1ee-60f64b590d8b" containerID="8a0daa532a5a126fdad39197e5c05a2d0425c24bc344ea106d36ca8c8fd9e25c" exitCode=0 Oct 09 14:08:10 crc 
kubenswrapper[4902]: I1009 14:08:10.012512 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"890a478d-3570-48dc-b1ee-60f64b590d8b","Type":"ContainerDied","Data":"8a0daa532a5a126fdad39197e5c05a2d0425c24bc344ea106d36ca8c8fd9e25c"} Oct 09 14:08:10 crc kubenswrapper[4902]: I1009 14:08:10.274106 4902 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 09 14:08:10 crc kubenswrapper[4902]: I1009 14:08:10.346548 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/890a478d-3570-48dc-b1ee-60f64b590d8b-config-data\") pod \"890a478d-3570-48dc-b1ee-60f64b590d8b\" (UID: \"890a478d-3570-48dc-b1ee-60f64b590d8b\") " Oct 09 14:08:10 crc kubenswrapper[4902]: I1009 14:08:10.346602 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/890a478d-3570-48dc-b1ee-60f64b590d8b-combined-ca-bundle\") pod \"890a478d-3570-48dc-b1ee-60f64b590d8b\" (UID: \"890a478d-3570-48dc-b1ee-60f64b590d8b\") " Oct 09 14:08:10 crc kubenswrapper[4902]: I1009 14:08:10.346721 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/890a478d-3570-48dc-b1ee-60f64b590d8b-sg-core-conf-yaml\") pod \"890a478d-3570-48dc-b1ee-60f64b590d8b\" (UID: \"890a478d-3570-48dc-b1ee-60f64b590d8b\") " Oct 09 14:08:10 crc kubenswrapper[4902]: I1009 14:08:10.346773 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/890a478d-3570-48dc-b1ee-60f64b590d8b-scripts\") pod \"890a478d-3570-48dc-b1ee-60f64b590d8b\" (UID: \"890a478d-3570-48dc-b1ee-60f64b590d8b\") " Oct 09 14:08:10 crc kubenswrapper[4902]: I1009 14:08:10.346805 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r26lf\" (UniqueName: \"kubernetes.io/projected/890a478d-3570-48dc-b1ee-60f64b590d8b-kube-api-access-r26lf\") pod \"890a478d-3570-48dc-b1ee-60f64b590d8b\" (UID: \"890a478d-3570-48dc-b1ee-60f64b590d8b\") " Oct 09 14:08:10 crc kubenswrapper[4902]: I1009 14:08:10.346838 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/890a478d-3570-48dc-b1ee-60f64b590d8b-log-httpd\") pod \"890a478d-3570-48dc-b1ee-60f64b590d8b\" (UID: \"890a478d-3570-48dc-b1ee-60f64b590d8b\") " Oct 09 14:08:10 crc kubenswrapper[4902]: I1009 14:08:10.346863 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/890a478d-3570-48dc-b1ee-60f64b590d8b-run-httpd\") pod \"890a478d-3570-48dc-b1ee-60f64b590d8b\" (UID: \"890a478d-3570-48dc-b1ee-60f64b590d8b\") " Oct 09 14:08:10 crc kubenswrapper[4902]: I1009 14:08:10.347303 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/890a478d-3570-48dc-b1ee-60f64b590d8b-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "890a478d-3570-48dc-b1ee-60f64b590d8b" (UID: "890a478d-3570-48dc-b1ee-60f64b590d8b"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 14:08:10 crc kubenswrapper[4902]: I1009 14:08:10.347353 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/890a478d-3570-48dc-b1ee-60f64b590d8b-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "890a478d-3570-48dc-b1ee-60f64b590d8b" (UID: "890a478d-3570-48dc-b1ee-60f64b590d8b"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 14:08:10 crc kubenswrapper[4902]: I1009 14:08:10.352757 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/890a478d-3570-48dc-b1ee-60f64b590d8b-scripts" (OuterVolumeSpecName: "scripts") pod "890a478d-3570-48dc-b1ee-60f64b590d8b" (UID: "890a478d-3570-48dc-b1ee-60f64b590d8b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:08:10 crc kubenswrapper[4902]: I1009 14:08:10.353400 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/890a478d-3570-48dc-b1ee-60f64b590d8b-kube-api-access-r26lf" (OuterVolumeSpecName: "kube-api-access-r26lf") pod "890a478d-3570-48dc-b1ee-60f64b590d8b" (UID: "890a478d-3570-48dc-b1ee-60f64b590d8b"). InnerVolumeSpecName "kube-api-access-r26lf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 14:08:10 crc kubenswrapper[4902]: I1009 14:08:10.389572 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/890a478d-3570-48dc-b1ee-60f64b590d8b-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "890a478d-3570-48dc-b1ee-60f64b590d8b" (UID: "890a478d-3570-48dc-b1ee-60f64b590d8b"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:08:10 crc kubenswrapper[4902]: I1009 14:08:10.433784 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/890a478d-3570-48dc-b1ee-60f64b590d8b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "890a478d-3570-48dc-b1ee-60f64b590d8b" (UID: "890a478d-3570-48dc-b1ee-60f64b590d8b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:08:10 crc kubenswrapper[4902]: I1009 14:08:10.448777 4902 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/890a478d-3570-48dc-b1ee-60f64b590d8b-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 09 14:08:10 crc kubenswrapper[4902]: I1009 14:08:10.448815 4902 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/890a478d-3570-48dc-b1ee-60f64b590d8b-scripts\") on node \"crc\" DevicePath \"\"" Oct 09 14:08:10 crc kubenswrapper[4902]: I1009 14:08:10.448825 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r26lf\" (UniqueName: \"kubernetes.io/projected/890a478d-3570-48dc-b1ee-60f64b590d8b-kube-api-access-r26lf\") on node \"crc\" DevicePath \"\"" Oct 09 14:08:10 crc kubenswrapper[4902]: I1009 14:08:10.448835 4902 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/890a478d-3570-48dc-b1ee-60f64b590d8b-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 09 14:08:10 crc kubenswrapper[4902]: I1009 14:08:10.448844 4902 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/890a478d-3570-48dc-b1ee-60f64b590d8b-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 09 14:08:10 crc kubenswrapper[4902]: I1009 14:08:10.448851 4902 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/890a478d-3570-48dc-b1ee-60f64b590d8b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 14:08:10 crc kubenswrapper[4902]: I1009 14:08:10.468202 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/890a478d-3570-48dc-b1ee-60f64b590d8b-config-data" (OuterVolumeSpecName: "config-data") pod "890a478d-3570-48dc-b1ee-60f64b590d8b" (UID: "890a478d-3570-48dc-b1ee-60f64b590d8b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:08:10 crc kubenswrapper[4902]: I1009 14:08:10.550629 4902 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/890a478d-3570-48dc-b1ee-60f64b590d8b-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 14:08:11 crc kubenswrapper[4902]: I1009 14:08:11.027978 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"890a478d-3570-48dc-b1ee-60f64b590d8b","Type":"ContainerDied","Data":"cad7bec08c36c2bf325bcd8304fa67404536b42d0a0681f579eae3764f1dbcd8"} Oct 09 14:08:11 crc kubenswrapper[4902]: I1009 14:08:11.028335 4902 scope.go:117] "RemoveContainer" containerID="3e1f224c0fc1cf9c09b66b55ac2ac08e10867b6b2cc29871fbcdc2e0f5e0bcb8" Oct 09 14:08:11 crc kubenswrapper[4902]: I1009 14:08:11.028152 4902 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 09 14:08:11 crc kubenswrapper[4902]: I1009 14:08:11.090794 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 09 14:08:11 crc kubenswrapper[4902]: I1009 14:08:11.097293 4902 scope.go:117] "RemoveContainer" containerID="11a4fe8f6f970158afeb48e05a9dfe9fb16630bf040486123e20fe38dc9cd951" Oct 09 14:08:11 crc kubenswrapper[4902]: I1009 14:08:11.100443 4902 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 09 14:08:11 crc kubenswrapper[4902]: I1009 14:08:11.122778 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 09 14:08:11 crc kubenswrapper[4902]: E1009 14:08:11.123321 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="890a478d-3570-48dc-b1ee-60f64b590d8b" containerName="ceilometer-central-agent" Oct 09 14:08:11 crc kubenswrapper[4902]: I1009 14:08:11.123344 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="890a478d-3570-48dc-b1ee-60f64b590d8b" containerName="ceilometer-central-agent" Oct 09 14:08:11 crc kubenswrapper[4902]: E1009 14:08:11.123364 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="890a478d-3570-48dc-b1ee-60f64b590d8b" containerName="sg-core" Oct 09 14:08:11 crc kubenswrapper[4902]: I1009 14:08:11.123374 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="890a478d-3570-48dc-b1ee-60f64b590d8b" containerName="sg-core" Oct 09 14:08:11 crc kubenswrapper[4902]: E1009 14:08:11.123394 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="890a478d-3570-48dc-b1ee-60f64b590d8b" containerName="ceilometer-notification-agent" Oct 09 14:08:11 crc kubenswrapper[4902]: I1009 14:08:11.123401 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="890a478d-3570-48dc-b1ee-60f64b590d8b" containerName="ceilometer-notification-agent" Oct 09 14:08:11 crc kubenswrapper[4902]: E1009 14:08:11.123466 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="890a478d-3570-48dc-b1ee-60f64b590d8b" containerName="proxy-httpd" Oct 09 14:08:11 crc kubenswrapper[4902]: I1009 14:08:11.123476 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="890a478d-3570-48dc-b1ee-60f64b590d8b" containerName="proxy-httpd" Oct 09 14:08:11 crc kubenswrapper[4902]: I1009 14:08:11.123970 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="890a478d-3570-48dc-b1ee-60f64b590d8b" containerName="sg-core" Oct 09 14:08:11 crc kubenswrapper[4902]: I1009 14:08:11.124001 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="890a478d-3570-48dc-b1ee-60f64b590d8b" containerName="proxy-httpd" Oct 09 14:08:11 crc kubenswrapper[4902]: I1009 14:08:11.124014 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="890a478d-3570-48dc-b1ee-60f64b590d8b" containerName="ceilometer-notification-agent" Oct 09 14:08:11 crc kubenswrapper[4902]: I1009 14:08:11.124037 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="890a478d-3570-48dc-b1ee-60f64b590d8b" containerName="ceilometer-central-agent" Oct 09 14:08:11 crc kubenswrapper[4902]: I1009 14:08:11.131056 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 09 14:08:11 crc kubenswrapper[4902]: I1009 14:08:11.132549 4902 scope.go:117] "RemoveContainer" containerID="cddd78a158b98ddf1f8f2689b6ef9cfe249095014b92df2f42af9e4854150458" Oct 09 14:08:11 crc kubenswrapper[4902]: I1009 14:08:11.133552 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 09 14:08:11 crc kubenswrapper[4902]: I1009 14:08:11.133735 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 09 14:08:11 crc kubenswrapper[4902]: I1009 14:08:11.135697 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Oct 09 14:08:11 crc kubenswrapper[4902]: I1009 14:08:11.136875 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 09 14:08:11 crc kubenswrapper[4902]: I1009 14:08:11.153333 4902 scope.go:117] "RemoveContainer" containerID="8a0daa532a5a126fdad39197e5c05a2d0425c24bc344ea106d36ca8c8fd9e25c" Oct 09 14:08:11 crc kubenswrapper[4902]: I1009 14:08:11.265617 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pfkhk\" (UniqueName: \"kubernetes.io/projected/82b65d50-0474-454d-92a1-05cb8a785834-kube-api-access-pfkhk\") pod \"ceilometer-0\" (UID: \"82b65d50-0474-454d-92a1-05cb8a785834\") " pod="openstack/ceilometer-0" Oct 09 14:08:11 crc kubenswrapper[4902]: I1009 14:08:11.265904 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/82b65d50-0474-454d-92a1-05cb8a785834-scripts\") pod \"ceilometer-0\" (UID: \"82b65d50-0474-454d-92a1-05cb8a785834\") " pod="openstack/ceilometer-0" Oct 09 14:08:11 crc kubenswrapper[4902]: I1009 14:08:11.266130 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/82b65d50-0474-454d-92a1-05cb8a785834-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"82b65d50-0474-454d-92a1-05cb8a785834\") " pod="openstack/ceilometer-0" Oct 09 14:08:11 crc kubenswrapper[4902]: I1009 14:08:11.266266 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82b65d50-0474-454d-92a1-05cb8a785834-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"82b65d50-0474-454d-92a1-05cb8a785834\") " pod="openstack/ceilometer-0" Oct 09 14:08:11 crc kubenswrapper[4902]: I1009 14:08:11.266310 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82b65d50-0474-454d-92a1-05cb8a785834-config-data\") pod \"ceilometer-0\" (UID: \"82b65d50-0474-454d-92a1-05cb8a785834\") " pod="openstack/ceilometer-0" Oct 09 14:08:11 crc kubenswrapper[4902]: I1009 14:08:11.266333 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/82b65d50-0474-454d-92a1-05cb8a785834-log-httpd\") pod \"ceilometer-0\" (UID: \"82b65d50-0474-454d-92a1-05cb8a785834\") " pod="openstack/ceilometer-0" Oct 09 14:08:11 crc kubenswrapper[4902]: I1009 14:08:11.266596 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/82b65d50-0474-454d-92a1-05cb8a785834-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"82b65d50-0474-454d-92a1-05cb8a785834\") " pod="openstack/ceilometer-0" Oct 09 14:08:11 crc kubenswrapper[4902]: I1009 14:08:11.266656 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/82b65d50-0474-454d-92a1-05cb8a785834-run-httpd\") pod \"ceilometer-0\" (UID: \"82b65d50-0474-454d-92a1-05cb8a785834\") " pod="openstack/ceilometer-0" Oct 09 14:08:11 crc kubenswrapper[4902]: I1009 14:08:11.368016 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/82b65d50-0474-454d-92a1-05cb8a785834-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"82b65d50-0474-454d-92a1-05cb8a785834\") " pod="openstack/ceilometer-0" Oct 09 14:08:11 crc kubenswrapper[4902]: I1009 14:08:11.368106 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/82b65d50-0474-454d-92a1-05cb8a785834-run-httpd\") pod \"ceilometer-0\" (UID: \"82b65d50-0474-454d-92a1-05cb8a785834\") " pod="openstack/ceilometer-0" Oct 09 14:08:11 crc kubenswrapper[4902]: I1009 14:08:11.368162 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pfkhk\" (UniqueName: \"kubernetes.io/projected/82b65d50-0474-454d-92a1-05cb8a785834-kube-api-access-pfkhk\") pod \"ceilometer-0\" (UID: \"82b65d50-0474-454d-92a1-05cb8a785834\") " pod="openstack/ceilometer-0" Oct 09 14:08:11 crc kubenswrapper[4902]: I1009 14:08:11.368222 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/82b65d50-0474-454d-92a1-05cb8a785834-scripts\") pod \"ceilometer-0\" (UID: \"82b65d50-0474-454d-92a1-05cb8a785834\") " pod="openstack/ceilometer-0" Oct 09 14:08:11 crc kubenswrapper[4902]: I1009 14:08:11.368255 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/82b65d50-0474-454d-92a1-05cb8a785834-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"82b65d50-0474-454d-92a1-05cb8a785834\") " pod="openstack/ceilometer-0" Oct 09 14:08:11 crc kubenswrapper[4902]: I1009 14:08:11.368313 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82b65d50-0474-454d-92a1-05cb8a785834-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"82b65d50-0474-454d-92a1-05cb8a785834\") " pod="openstack/ceilometer-0" Oct 09 14:08:11 crc kubenswrapper[4902]: I1009 14:08:11.368343 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82b65d50-0474-454d-92a1-05cb8a785834-config-data\") pod \"ceilometer-0\" (UID: \"82b65d50-0474-454d-92a1-05cb8a785834\") " pod="openstack/ceilometer-0" Oct 09 14:08:11 crc kubenswrapper[4902]: I1009 14:08:11.368367 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/82b65d50-0474-454d-92a1-05cb8a785834-log-httpd\") pod \"ceilometer-0\" (UID: \"82b65d50-0474-454d-92a1-05cb8a785834\") " pod="openstack/ceilometer-0" Oct 09 14:08:11 crc kubenswrapper[4902]: I1009 14:08:11.368585 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/82b65d50-0474-454d-92a1-05cb8a785834-run-httpd\") pod \"ceilometer-0\" (UID: \"82b65d50-0474-454d-92a1-05cb8a785834\") " pod="openstack/ceilometer-0" Oct 09 14:08:11 crc kubenswrapper[4902]: I1009 14:08:11.368816 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/82b65d50-0474-454d-92a1-05cb8a785834-log-httpd\") pod \"ceilometer-0\" (UID: \"82b65d50-0474-454d-92a1-05cb8a785834\") " pod="openstack/ceilometer-0" Oct 09 14:08:11 crc kubenswrapper[4902]: I1009 14:08:11.373956 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/82b65d50-0474-454d-92a1-05cb8a785834-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"82b65d50-0474-454d-92a1-05cb8a785834\") " pod="openstack/ceilometer-0" Oct 09 14:08:11 crc kubenswrapper[4902]: I1009 14:08:11.374081 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/82b65d50-0474-454d-92a1-05cb8a785834-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"82b65d50-0474-454d-92a1-05cb8a785834\") " pod="openstack/ceilometer-0" Oct 09 14:08:11 crc kubenswrapper[4902]: I1009 14:08:11.374392 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/82b65d50-0474-454d-92a1-05cb8a785834-scripts\") pod \"ceilometer-0\" (UID: \"82b65d50-0474-454d-92a1-05cb8a785834\") " pod="openstack/ceilometer-0" Oct 09 14:08:11 crc kubenswrapper[4902]: I1009 14:08:11.374614 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82b65d50-0474-454d-92a1-05cb8a785834-config-data\") pod \"ceilometer-0\" (UID: \"82b65d50-0474-454d-92a1-05cb8a785834\") " pod="openstack/ceilometer-0" Oct 09 14:08:11 crc kubenswrapper[4902]: I1009 14:08:11.381380 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82b65d50-0474-454d-92a1-05cb8a785834-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"82b65d50-0474-454d-92a1-05cb8a785834\") " pod="openstack/ceilometer-0" Oct 09 14:08:11 crc kubenswrapper[4902]: I1009 14:08:11.384089 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pfkhk\" (UniqueName: \"kubernetes.io/projected/82b65d50-0474-454d-92a1-05cb8a785834-kube-api-access-pfkhk\") pod \"ceilometer-0\" (UID: \"82b65d50-0474-454d-92a1-05cb8a785834\") " pod="openstack/ceilometer-0" Oct 09 14:08:11 crc kubenswrapper[4902]: I1009 14:08:11.462594 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 09 14:08:11 crc kubenswrapper[4902]: I1009 14:08:11.524187 4902 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="890a478d-3570-48dc-b1ee-60f64b590d8b" path="/var/lib/kubelet/pods/890a478d-3570-48dc-b1ee-60f64b590d8b/volumes" Oct 09 14:08:11 crc kubenswrapper[4902]: I1009 14:08:11.890769 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 09 14:08:11 crc kubenswrapper[4902]: W1009 14:08:11.891830 4902 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod82b65d50_0474_454d_92a1_05cb8a785834.slice/crio-fbe9120a94e8eed8093675a99b7112e7ea7d369e9e9e0ca14a1fe07972c3b659 WatchSource:0}: Error finding container fbe9120a94e8eed8093675a99b7112e7ea7d369e9e9e0ca14a1fe07972c3b659: Status 404 returned error can't find the container with id fbe9120a94e8eed8093675a99b7112e7ea7d369e9e9e0ca14a1fe07972c3b659 Oct 09 14:08:12 crc kubenswrapper[4902]: I1009 14:08:12.038192 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"82b65d50-0474-454d-92a1-05cb8a785834","Type":"ContainerStarted","Data":"fbe9120a94e8eed8093675a99b7112e7ea7d369e9e9e0ca14a1fe07972c3b659"} Oct 09 14:08:13 crc kubenswrapper[4902]: I1009 14:08:13.051138 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"82b65d50-0474-454d-92a1-05cb8a785834","Type":"ContainerStarted","Data":"4db62169ab8594fd9b15825ea76e7b7c7fc276b8e2c91d77724180e88d9ca247"} Oct 09 14:08:13 crc kubenswrapper[4902]: I1009 14:08:13.053994 4902 generic.go:334] "Generic (PLEG): container finished" podID="cbeddafe-9239-4c02-8f7d-f4a341ee97b1" containerID="4611e4a7d63368eb1993e398373d1774eaf3281d47466a275e578c08d76a0f05" exitCode=0 Oct 09 14:08:13 crc kubenswrapper[4902]: I1009 14:08:13.054046 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-bs8sb" event={"ID":"cbeddafe-9239-4c02-8f7d-f4a341ee97b1","Type":"ContainerDied","Data":"4611e4a7d63368eb1993e398373d1774eaf3281d47466a275e578c08d76a0f05"} Oct 09 14:08:14 crc kubenswrapper[4902]: I1009 14:08:14.064805 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"82b65d50-0474-454d-92a1-05cb8a785834","Type":"ContainerStarted","Data":"2b619d8bd59e67f5a2fe198777c1803365cf6c089dd09fdc5cf55d6b636dd738"} Oct 09 14:08:14 crc kubenswrapper[4902]: I1009 14:08:14.360200 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Oct 09 14:08:14 crc kubenswrapper[4902]: I1009 14:08:14.502366 4902 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-bs8sb" Oct 09 14:08:14 crc kubenswrapper[4902]: I1009 14:08:14.630607 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2zd26\" (UniqueName: \"kubernetes.io/projected/cbeddafe-9239-4c02-8f7d-f4a341ee97b1-kube-api-access-2zd26\") pod \"cbeddafe-9239-4c02-8f7d-f4a341ee97b1\" (UID: \"cbeddafe-9239-4c02-8f7d-f4a341ee97b1\") " Oct 09 14:08:14 crc kubenswrapper[4902]: I1009 14:08:14.631207 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbeddafe-9239-4c02-8f7d-f4a341ee97b1-combined-ca-bundle\") pod \"cbeddafe-9239-4c02-8f7d-f4a341ee97b1\" (UID: \"cbeddafe-9239-4c02-8f7d-f4a341ee97b1\") " Oct 09 14:08:14 crc kubenswrapper[4902]: I1009 14:08:14.631383 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cbeddafe-9239-4c02-8f7d-f4a341ee97b1-config-data\") pod \"cbeddafe-9239-4c02-8f7d-f4a341ee97b1\" (UID: \"cbeddafe-9239-4c02-8f7d-f4a341ee97b1\") " Oct 09 14:08:14 crc kubenswrapper[4902]: I1009 14:08:14.631478 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cbeddafe-9239-4c02-8f7d-f4a341ee97b1-scripts\") pod \"cbeddafe-9239-4c02-8f7d-f4a341ee97b1\" (UID: \"cbeddafe-9239-4c02-8f7d-f4a341ee97b1\") " Oct 09 14:08:14 crc kubenswrapper[4902]: I1009 14:08:14.636493 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cbeddafe-9239-4c02-8f7d-f4a341ee97b1-scripts" (OuterVolumeSpecName: "scripts") pod "cbeddafe-9239-4c02-8f7d-f4a341ee97b1" (UID: "cbeddafe-9239-4c02-8f7d-f4a341ee97b1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:08:14 crc kubenswrapper[4902]: I1009 14:08:14.654404 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cbeddafe-9239-4c02-8f7d-f4a341ee97b1-kube-api-access-2zd26" (OuterVolumeSpecName: "kube-api-access-2zd26") pod "cbeddafe-9239-4c02-8f7d-f4a341ee97b1" (UID: "cbeddafe-9239-4c02-8f7d-f4a341ee97b1"). InnerVolumeSpecName "kube-api-access-2zd26". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 14:08:14 crc kubenswrapper[4902]: I1009 14:08:14.661860 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cbeddafe-9239-4c02-8f7d-f4a341ee97b1-config-data" (OuterVolumeSpecName: "config-data") pod "cbeddafe-9239-4c02-8f7d-f4a341ee97b1" (UID: "cbeddafe-9239-4c02-8f7d-f4a341ee97b1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:08:14 crc kubenswrapper[4902]: I1009 14:08:14.679713 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cbeddafe-9239-4c02-8f7d-f4a341ee97b1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cbeddafe-9239-4c02-8f7d-f4a341ee97b1" (UID: "cbeddafe-9239-4c02-8f7d-f4a341ee97b1"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:08:14 crc kubenswrapper[4902]: I1009 14:08:14.735709 4902 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbeddafe-9239-4c02-8f7d-f4a341ee97b1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 14:08:14 crc kubenswrapper[4902]: I1009 14:08:14.735762 4902 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cbeddafe-9239-4c02-8f7d-f4a341ee97b1-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 14:08:14 crc kubenswrapper[4902]: I1009 14:08:14.735772 4902 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cbeddafe-9239-4c02-8f7d-f4a341ee97b1-scripts\") on node \"crc\" DevicePath \"\"" Oct 09 14:08:14 crc kubenswrapper[4902]: I1009 14:08:14.735783 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2zd26\" (UniqueName: \"kubernetes.io/projected/cbeddafe-9239-4c02-8f7d-f4a341ee97b1-kube-api-access-2zd26\") on node \"crc\" DevicePath \"\"" Oct 09 14:08:15 crc kubenswrapper[4902]: I1009 14:08:15.075647 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-bs8sb" event={"ID":"cbeddafe-9239-4c02-8f7d-f4a341ee97b1","Type":"ContainerDied","Data":"035c59586aac2dd23ef4b765e7e5a0cc823f7c9b8e1fef87b6561389c5ab3c09"} Oct 09 14:08:15 crc kubenswrapper[4902]: I1009 14:08:15.075684 4902 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="035c59586aac2dd23ef4b765e7e5a0cc823f7c9b8e1fef87b6561389c5ab3c09" Oct 09 14:08:15 crc kubenswrapper[4902]: I1009 14:08:15.075699 4902 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-bs8sb" Oct 09 14:08:15 crc kubenswrapper[4902]: I1009 14:08:15.078593 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"82b65d50-0474-454d-92a1-05cb8a785834","Type":"ContainerStarted","Data":"5b0707af8a28730868c82e8cf547457fb5287380401373212932ec2355943506"} Oct 09 14:08:15 crc kubenswrapper[4902]: I1009 14:08:15.176508 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 09 14:08:15 crc kubenswrapper[4902]: E1009 14:08:15.176960 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbeddafe-9239-4c02-8f7d-f4a341ee97b1" containerName="nova-cell0-conductor-db-sync" Oct 09 14:08:15 crc kubenswrapper[4902]: I1009 14:08:15.176980 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbeddafe-9239-4c02-8f7d-f4a341ee97b1" containerName="nova-cell0-conductor-db-sync" Oct 09 14:08:15 crc kubenswrapper[4902]: I1009 14:08:15.177206 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="cbeddafe-9239-4c02-8f7d-f4a341ee97b1" containerName="nova-cell0-conductor-db-sync" Oct 09 14:08:15 crc kubenswrapper[4902]: I1009 14:08:15.177962 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 09 14:08:15 crc kubenswrapper[4902]: I1009 14:08:15.180476 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-qbb4g" Oct 09 14:08:15 crc kubenswrapper[4902]: I1009 14:08:15.181083 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Oct 09 14:08:15 crc kubenswrapper[4902]: I1009 14:08:15.186967 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 09 14:08:15 crc kubenswrapper[4902]: I1009 14:08:15.244571 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61f63f9a-89ab-45fa-b62f-e93f826423a9-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"61f63f9a-89ab-45fa-b62f-e93f826423a9\") " pod="openstack/nova-cell0-conductor-0" Oct 09 14:08:15 crc kubenswrapper[4902]: I1009 14:08:15.245523 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61f63f9a-89ab-45fa-b62f-e93f826423a9-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"61f63f9a-89ab-45fa-b62f-e93f826423a9\") " pod="openstack/nova-cell0-conductor-0" Oct 09 14:08:15 crc kubenswrapper[4902]: I1009 14:08:15.245632 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pkcn9\" (UniqueName: \"kubernetes.io/projected/61f63f9a-89ab-45fa-b62f-e93f826423a9-kube-api-access-pkcn9\") pod \"nova-cell0-conductor-0\" (UID: \"61f63f9a-89ab-45fa-b62f-e93f826423a9\") " pod="openstack/nova-cell0-conductor-0" Oct 09 14:08:15 crc kubenswrapper[4902]: I1009 14:08:15.346789 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61f63f9a-89ab-45fa-b62f-e93f826423a9-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"61f63f9a-89ab-45fa-b62f-e93f826423a9\") " pod="openstack/nova-cell0-conductor-0" Oct 09 14:08:15 crc kubenswrapper[4902]: I1009 14:08:15.347145 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61f63f9a-89ab-45fa-b62f-e93f826423a9-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"61f63f9a-89ab-45fa-b62f-e93f826423a9\") " pod="openstack/nova-cell0-conductor-0" Oct 09 14:08:15 crc kubenswrapper[4902]: I1009 14:08:15.347246 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pkcn9\" (UniqueName: \"kubernetes.io/projected/61f63f9a-89ab-45fa-b62f-e93f826423a9-kube-api-access-pkcn9\") pod \"nova-cell0-conductor-0\" (UID: \"61f63f9a-89ab-45fa-b62f-e93f826423a9\") " pod="openstack/nova-cell0-conductor-0" Oct 09 14:08:15 crc kubenswrapper[4902]: I1009 14:08:15.352119 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61f63f9a-89ab-45fa-b62f-e93f826423a9-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"61f63f9a-89ab-45fa-b62f-e93f826423a9\") " pod="openstack/nova-cell0-conductor-0" Oct 09 14:08:15 crc kubenswrapper[4902]: I1009 14:08:15.352395 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61f63f9a-89ab-45fa-b62f-e93f826423a9-config-data\") pod \"nova-cell0-conductor-0\" 
(UID: \"61f63f9a-89ab-45fa-b62f-e93f826423a9\") " pod="openstack/nova-cell0-conductor-0" Oct 09 14:08:15 crc kubenswrapper[4902]: I1009 14:08:15.366333 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pkcn9\" (UniqueName: \"kubernetes.io/projected/61f63f9a-89ab-45fa-b62f-e93f826423a9-kube-api-access-pkcn9\") pod \"nova-cell0-conductor-0\" (UID: \"61f63f9a-89ab-45fa-b62f-e93f826423a9\") " pod="openstack/nova-cell0-conductor-0" Oct 09 14:08:15 crc kubenswrapper[4902]: I1009 14:08:15.496403 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 09 14:08:15 crc kubenswrapper[4902]: I1009 14:08:15.945703 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 09 14:08:16 crc kubenswrapper[4902]: I1009 14:08:16.060998 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 09 14:08:16 crc kubenswrapper[4902]: I1009 14:08:16.090648 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"82b65d50-0474-454d-92a1-05cb8a785834","Type":"ContainerStarted","Data":"1bc67067cdf4582fdb8ec25709e4b7f0bfa4a2576713203ac0e08e01d9904653"} Oct 09 14:08:16 crc kubenswrapper[4902]: I1009 14:08:16.092828 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"61f63f9a-89ab-45fa-b62f-e93f826423a9","Type":"ContainerStarted","Data":"99388e68db90412870683c4b2dd7abb1a5a2495841b594dd9e62b12e59ab84a1"} Oct 09 14:08:17 crc kubenswrapper[4902]: I1009 14:08:17.102935 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"61f63f9a-89ab-45fa-b62f-e93f826423a9","Type":"ContainerStarted","Data":"872f2137dbf39487d52edb0522e6318077f10c68b83c92e1f3700b1b914430b4"} Oct 09 14:08:17 crc kubenswrapper[4902]: I1009 14:08:17.104741 4902 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="82b65d50-0474-454d-92a1-05cb8a785834" containerName="proxy-httpd" containerID="cri-o://1bc67067cdf4582fdb8ec25709e4b7f0bfa4a2576713203ac0e08e01d9904653" gracePeriod=30 Oct 09 14:08:17 crc kubenswrapper[4902]: I1009 14:08:17.104882 4902 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="82b65d50-0474-454d-92a1-05cb8a785834" containerName="sg-core" containerID="cri-o://5b0707af8a28730868c82e8cf547457fb5287380401373212932ec2355943506" gracePeriod=30 Oct 09 14:08:17 crc kubenswrapper[4902]: I1009 14:08:17.104974 4902 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="82b65d50-0474-454d-92a1-05cb8a785834" containerName="ceilometer-notification-agent" containerID="cri-o://2b619d8bd59e67f5a2fe198777c1803365cf6c089dd09fdc5cf55d6b636dd738" gracePeriod=30 Oct 09 14:08:17 crc kubenswrapper[4902]: I1009 14:08:17.102932 4902 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="82b65d50-0474-454d-92a1-05cb8a785834" containerName="ceilometer-central-agent" containerID="cri-o://4db62169ab8594fd9b15825ea76e7b7c7fc276b8e2c91d77724180e88d9ca247" gracePeriod=30 Oct 09 14:08:17 crc kubenswrapper[4902]: I1009 14:08:17.105684 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 09 14:08:17 crc kubenswrapper[4902]: I1009 14:08:17.105757 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openstack/nova-cell0-conductor-0" Oct 09 14:08:17 crc kubenswrapper[4902]: I1009 14:08:17.144510 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.213190818 podStartE2EDuration="6.144489565s" podCreationTimestamp="2025-10-09 14:08:11 +0000 UTC" firstStartedPulling="2025-10-09 14:08:11.90966823 +0000 UTC m=+1039.107527294" lastFinishedPulling="2025-10-09 14:08:15.840966977 +0000 UTC m=+1043.038826041" observedRunningTime="2025-10-09 14:08:17.135519962 +0000 UTC m=+1044.333379026" watchObservedRunningTime="2025-10-09 14:08:17.144489565 +0000 UTC m=+1044.342348639" Oct 09 14:08:17 crc kubenswrapper[4902]: I1009 14:08:17.158841 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.158823157 podStartE2EDuration="2.158823157s" podCreationTimestamp="2025-10-09 14:08:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 14:08:17.153758398 +0000 UTC m=+1044.351617472" watchObservedRunningTime="2025-10-09 14:08:17.158823157 +0000 UTC m=+1044.356682221" Oct 09 14:08:18 crc kubenswrapper[4902]: I1009 14:08:18.116679 4902 generic.go:334] "Generic (PLEG): container finished" podID="82b65d50-0474-454d-92a1-05cb8a785834" containerID="1bc67067cdf4582fdb8ec25709e4b7f0bfa4a2576713203ac0e08e01d9904653" exitCode=0 Oct 09 14:08:18 crc kubenswrapper[4902]: I1009 14:08:18.117022 4902 generic.go:334] "Generic (PLEG): container finished" podID="82b65d50-0474-454d-92a1-05cb8a785834" containerID="5b0707af8a28730868c82e8cf547457fb5287380401373212932ec2355943506" exitCode=2 Oct 09 14:08:18 crc kubenswrapper[4902]: I1009 14:08:18.117041 4902 generic.go:334] "Generic (PLEG): container finished" podID="82b65d50-0474-454d-92a1-05cb8a785834" containerID="2b619d8bd59e67f5a2fe198777c1803365cf6c089dd09fdc5cf55d6b636dd738" exitCode=0 Oct 09 14:08:18 crc kubenswrapper[4902]: I1009 14:08:18.118199 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"82b65d50-0474-454d-92a1-05cb8a785834","Type":"ContainerDied","Data":"1bc67067cdf4582fdb8ec25709e4b7f0bfa4a2576713203ac0e08e01d9904653"} Oct 09 14:08:18 crc kubenswrapper[4902]: I1009 14:08:18.118226 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"82b65d50-0474-454d-92a1-05cb8a785834","Type":"ContainerDied","Data":"5b0707af8a28730868c82e8cf547457fb5287380401373212932ec2355943506"} Oct 09 14:08:18 crc kubenswrapper[4902]: I1009 14:08:18.118237 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"82b65d50-0474-454d-92a1-05cb8a785834","Type":"ContainerDied","Data":"2b619d8bd59e67f5a2fe198777c1803365cf6c089dd09fdc5cf55d6b636dd738"} Oct 09 14:08:19 crc kubenswrapper[4902]: I1009 14:08:19.660305 4902 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 09 14:08:19 crc kubenswrapper[4902]: I1009 14:08:19.726288 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/82b65d50-0474-454d-92a1-05cb8a785834-ceilometer-tls-certs\") pod \"82b65d50-0474-454d-92a1-05cb8a785834\" (UID: \"82b65d50-0474-454d-92a1-05cb8a785834\") " Oct 09 14:08:19 crc kubenswrapper[4902]: I1009 14:08:19.726362 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/82b65d50-0474-454d-92a1-05cb8a785834-scripts\") pod \"82b65d50-0474-454d-92a1-05cb8a785834\" (UID: \"82b65d50-0474-454d-92a1-05cb8a785834\") " Oct 09 14:08:19 crc kubenswrapper[4902]: I1009 14:08:19.727186 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82b65d50-0474-454d-92a1-05cb8a785834-config-data\") pod \"82b65d50-0474-454d-92a1-05cb8a785834\" (UID: \"82b65d50-0474-454d-92a1-05cb8a785834\") " Oct 09 14:08:19 crc kubenswrapper[4902]: I1009 14:08:19.727272 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/82b65d50-0474-454d-92a1-05cb8a785834-log-httpd\") pod \"82b65d50-0474-454d-92a1-05cb8a785834\" (UID: \"82b65d50-0474-454d-92a1-05cb8a785834\") " Oct 09 14:08:19 crc kubenswrapper[4902]: I1009 14:08:19.727345 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/82b65d50-0474-454d-92a1-05cb8a785834-run-httpd\") pod \"82b65d50-0474-454d-92a1-05cb8a785834\" (UID: \"82b65d50-0474-454d-92a1-05cb8a785834\") " Oct 09 14:08:19 crc kubenswrapper[4902]: I1009 14:08:19.727449 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82b65d50-0474-454d-92a1-05cb8a785834-combined-ca-bundle\") pod \"82b65d50-0474-454d-92a1-05cb8a785834\" (UID: \"82b65d50-0474-454d-92a1-05cb8a785834\") " Oct 09 14:08:19 crc kubenswrapper[4902]: I1009 14:08:19.727516 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pfkhk\" (UniqueName: \"kubernetes.io/projected/82b65d50-0474-454d-92a1-05cb8a785834-kube-api-access-pfkhk\") pod \"82b65d50-0474-454d-92a1-05cb8a785834\" (UID: \"82b65d50-0474-454d-92a1-05cb8a785834\") " Oct 09 14:08:19 crc kubenswrapper[4902]: I1009 14:08:19.727544 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/82b65d50-0474-454d-92a1-05cb8a785834-sg-core-conf-yaml\") pod \"82b65d50-0474-454d-92a1-05cb8a785834\" (UID: \"82b65d50-0474-454d-92a1-05cb8a785834\") " Oct 09 14:08:19 crc kubenswrapper[4902]: I1009 14:08:19.727580 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/82b65d50-0474-454d-92a1-05cb8a785834-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "82b65d50-0474-454d-92a1-05cb8a785834" (UID: "82b65d50-0474-454d-92a1-05cb8a785834"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 14:08:19 crc kubenswrapper[4902]: I1009 14:08:19.727899 4902 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/82b65d50-0474-454d-92a1-05cb8a785834-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 09 14:08:19 crc kubenswrapper[4902]: I1009 14:08:19.728131 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/82b65d50-0474-454d-92a1-05cb8a785834-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "82b65d50-0474-454d-92a1-05cb8a785834" (UID: "82b65d50-0474-454d-92a1-05cb8a785834"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 14:08:19 crc kubenswrapper[4902]: I1009 14:08:19.733005 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82b65d50-0474-454d-92a1-05cb8a785834-kube-api-access-pfkhk" (OuterVolumeSpecName: "kube-api-access-pfkhk") pod "82b65d50-0474-454d-92a1-05cb8a785834" (UID: "82b65d50-0474-454d-92a1-05cb8a785834"). InnerVolumeSpecName "kube-api-access-pfkhk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 14:08:19 crc kubenswrapper[4902]: I1009 14:08:19.748564 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82b65d50-0474-454d-92a1-05cb8a785834-scripts" (OuterVolumeSpecName: "scripts") pod "82b65d50-0474-454d-92a1-05cb8a785834" (UID: "82b65d50-0474-454d-92a1-05cb8a785834"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:08:19 crc kubenswrapper[4902]: I1009 14:08:19.756472 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82b65d50-0474-454d-92a1-05cb8a785834-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "82b65d50-0474-454d-92a1-05cb8a785834" (UID: "82b65d50-0474-454d-92a1-05cb8a785834"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:08:19 crc kubenswrapper[4902]: I1009 14:08:19.783350 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82b65d50-0474-454d-92a1-05cb8a785834-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "82b65d50-0474-454d-92a1-05cb8a785834" (UID: "82b65d50-0474-454d-92a1-05cb8a785834"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:08:19 crc kubenswrapper[4902]: I1009 14:08:19.807338 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82b65d50-0474-454d-92a1-05cb8a785834-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "82b65d50-0474-454d-92a1-05cb8a785834" (UID: "82b65d50-0474-454d-92a1-05cb8a785834"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:08:19 crc kubenswrapper[4902]: I1009 14:08:19.825976 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82b65d50-0474-454d-92a1-05cb8a785834-config-data" (OuterVolumeSpecName: "config-data") pod "82b65d50-0474-454d-92a1-05cb8a785834" (UID: "82b65d50-0474-454d-92a1-05cb8a785834"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:08:19 crc kubenswrapper[4902]: I1009 14:08:19.829610 4902 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/82b65d50-0474-454d-92a1-05cb8a785834-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 09 14:08:19 crc kubenswrapper[4902]: I1009 14:08:19.829655 4902 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82b65d50-0474-454d-92a1-05cb8a785834-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 14:08:19 crc kubenswrapper[4902]: I1009 14:08:19.829697 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pfkhk\" (UniqueName: \"kubernetes.io/projected/82b65d50-0474-454d-92a1-05cb8a785834-kube-api-access-pfkhk\") on node \"crc\" DevicePath \"\"" Oct 09 14:08:19 crc kubenswrapper[4902]: I1009 14:08:19.829711 4902 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/82b65d50-0474-454d-92a1-05cb8a785834-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 09 14:08:19 crc kubenswrapper[4902]: I1009 14:08:19.829772 4902 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/82b65d50-0474-454d-92a1-05cb8a785834-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 09 14:08:19 crc kubenswrapper[4902]: I1009 14:08:19.829827 4902 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/82b65d50-0474-454d-92a1-05cb8a785834-scripts\") on node \"crc\" DevicePath \"\"" Oct 09 14:08:19 crc kubenswrapper[4902]: I1009 14:08:19.829865 4902 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82b65d50-0474-454d-92a1-05cb8a785834-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 14:08:20 crc kubenswrapper[4902]: I1009 14:08:20.142553 4902 generic.go:334] "Generic (PLEG): container finished" podID="82b65d50-0474-454d-92a1-05cb8a785834" containerID="4db62169ab8594fd9b15825ea76e7b7c7fc276b8e2c91d77724180e88d9ca247" exitCode=0 Oct 09 14:08:20 crc kubenswrapper[4902]: I1009 14:08:20.142636 4902 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 09 14:08:20 crc kubenswrapper[4902]: I1009 14:08:20.142636 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"82b65d50-0474-454d-92a1-05cb8a785834","Type":"ContainerDied","Data":"4db62169ab8594fd9b15825ea76e7b7c7fc276b8e2c91d77724180e88d9ca247"} Oct 09 14:08:20 crc kubenswrapper[4902]: I1009 14:08:20.142998 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"82b65d50-0474-454d-92a1-05cb8a785834","Type":"ContainerDied","Data":"fbe9120a94e8eed8093675a99b7112e7ea7d369e9e9e0ca14a1fe07972c3b659"} Oct 09 14:08:20 crc kubenswrapper[4902]: I1009 14:08:20.143021 4902 scope.go:117] "RemoveContainer" containerID="1bc67067cdf4582fdb8ec25709e4b7f0bfa4a2576713203ac0e08e01d9904653" Oct 09 14:08:20 crc kubenswrapper[4902]: I1009 14:08:20.166221 4902 scope.go:117] "RemoveContainer" containerID="5b0707af8a28730868c82e8cf547457fb5287380401373212932ec2355943506" Oct 09 14:08:20 crc kubenswrapper[4902]: I1009 14:08:20.183651 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 09 14:08:20 crc kubenswrapper[4902]: I1009 14:08:20.188973 4902 scope.go:117] "RemoveContainer" containerID="2b619d8bd59e67f5a2fe198777c1803365cf6c089dd09fdc5cf55d6b636dd738" Oct 09 14:08:20 crc kubenswrapper[4902]: I1009 14:08:20.199541 4902 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 09 14:08:20 crc kubenswrapper[4902]: I1009 14:08:20.207182 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 09 14:08:20 crc kubenswrapper[4902]: E1009 14:08:20.207521 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82b65d50-0474-454d-92a1-05cb8a785834" containerName="ceilometer-central-agent" Oct 09 14:08:20 crc kubenswrapper[4902]: I1009 14:08:20.207536 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="82b65d50-0474-454d-92a1-05cb8a785834" containerName="ceilometer-central-agent" Oct 09 14:08:20 crc kubenswrapper[4902]: E1009 14:08:20.207547 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82b65d50-0474-454d-92a1-05cb8a785834" containerName="sg-core" Oct 09 14:08:20 crc kubenswrapper[4902]: I1009 14:08:20.207553 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="82b65d50-0474-454d-92a1-05cb8a785834" containerName="sg-core" Oct 09 14:08:20 crc kubenswrapper[4902]: E1009 14:08:20.207578 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82b65d50-0474-454d-92a1-05cb8a785834" containerName="ceilometer-notification-agent" Oct 09 14:08:20 crc kubenswrapper[4902]: I1009 14:08:20.207583 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="82b65d50-0474-454d-92a1-05cb8a785834" containerName="ceilometer-notification-agent" Oct 09 14:08:20 crc kubenswrapper[4902]: E1009 14:08:20.207607 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82b65d50-0474-454d-92a1-05cb8a785834" containerName="proxy-httpd" Oct 09 14:08:20 crc kubenswrapper[4902]: I1009 14:08:20.207614 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="82b65d50-0474-454d-92a1-05cb8a785834" containerName="proxy-httpd" Oct 09 14:08:20 crc kubenswrapper[4902]: I1009 14:08:20.207765 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="82b65d50-0474-454d-92a1-05cb8a785834" containerName="ceilometer-notification-agent" Oct 09 14:08:20 crc kubenswrapper[4902]: I1009 14:08:20.207785 4902 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="82b65d50-0474-454d-92a1-05cb8a785834" containerName="sg-core" Oct 09 14:08:20 crc kubenswrapper[4902]: I1009 14:08:20.207795 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="82b65d50-0474-454d-92a1-05cb8a785834" containerName="ceilometer-central-agent" Oct 09 14:08:20 crc kubenswrapper[4902]: I1009 14:08:20.207806 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="82b65d50-0474-454d-92a1-05cb8a785834" containerName="proxy-httpd" Oct 09 14:08:20 crc kubenswrapper[4902]: I1009 14:08:20.209303 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 09 14:08:20 crc kubenswrapper[4902]: I1009 14:08:20.211597 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 09 14:08:20 crc kubenswrapper[4902]: I1009 14:08:20.211806 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Oct 09 14:08:20 crc kubenswrapper[4902]: I1009 14:08:20.215897 4902 scope.go:117] "RemoveContainer" containerID="4db62169ab8594fd9b15825ea76e7b7c7fc276b8e2c91d77724180e88d9ca247" Oct 09 14:08:20 crc kubenswrapper[4902]: I1009 14:08:20.218216 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 09 14:08:20 crc kubenswrapper[4902]: I1009 14:08:20.228038 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 09 14:08:20 crc kubenswrapper[4902]: I1009 14:08:20.252577 4902 scope.go:117] "RemoveContainer" containerID="1bc67067cdf4582fdb8ec25709e4b7f0bfa4a2576713203ac0e08e01d9904653" Oct 09 14:08:20 crc kubenswrapper[4902]: E1009 14:08:20.253302 4902 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1bc67067cdf4582fdb8ec25709e4b7f0bfa4a2576713203ac0e08e01d9904653\": container with ID starting with 1bc67067cdf4582fdb8ec25709e4b7f0bfa4a2576713203ac0e08e01d9904653 not found: ID does not exist" containerID="1bc67067cdf4582fdb8ec25709e4b7f0bfa4a2576713203ac0e08e01d9904653" Oct 09 14:08:20 crc kubenswrapper[4902]: I1009 14:08:20.253635 4902 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1bc67067cdf4582fdb8ec25709e4b7f0bfa4a2576713203ac0e08e01d9904653"} err="failed to get container status \"1bc67067cdf4582fdb8ec25709e4b7f0bfa4a2576713203ac0e08e01d9904653\": rpc error: code = NotFound desc = could not find container \"1bc67067cdf4582fdb8ec25709e4b7f0bfa4a2576713203ac0e08e01d9904653\": container with ID starting with 1bc67067cdf4582fdb8ec25709e4b7f0bfa4a2576713203ac0e08e01d9904653 not found: ID does not exist" Oct 09 14:08:20 crc kubenswrapper[4902]: I1009 14:08:20.253666 4902 scope.go:117] "RemoveContainer" containerID="5b0707af8a28730868c82e8cf547457fb5287380401373212932ec2355943506" Oct 09 14:08:20 crc kubenswrapper[4902]: E1009 14:08:20.261181 4902 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5b0707af8a28730868c82e8cf547457fb5287380401373212932ec2355943506\": container with ID starting with 5b0707af8a28730868c82e8cf547457fb5287380401373212932ec2355943506 not found: ID does not exist" containerID="5b0707af8a28730868c82e8cf547457fb5287380401373212932ec2355943506" Oct 09 14:08:20 crc kubenswrapper[4902]: I1009 14:08:20.261252 4902 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"5b0707af8a28730868c82e8cf547457fb5287380401373212932ec2355943506"} err="failed to get container status \"5b0707af8a28730868c82e8cf547457fb5287380401373212932ec2355943506\": rpc error: code = NotFound desc = could not find container \"5b0707af8a28730868c82e8cf547457fb5287380401373212932ec2355943506\": container with ID starting with 5b0707af8a28730868c82e8cf547457fb5287380401373212932ec2355943506 not found: ID does not exist" Oct 09 14:08:20 crc kubenswrapper[4902]: I1009 14:08:20.261281 4902 scope.go:117] "RemoveContainer" containerID="2b619d8bd59e67f5a2fe198777c1803365cf6c089dd09fdc5cf55d6b636dd738" Oct 09 14:08:20 crc kubenswrapper[4902]: E1009 14:08:20.262393 4902 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b619d8bd59e67f5a2fe198777c1803365cf6c089dd09fdc5cf55d6b636dd738\": container with ID starting with 2b619d8bd59e67f5a2fe198777c1803365cf6c089dd09fdc5cf55d6b636dd738 not found: ID does not exist" containerID="2b619d8bd59e67f5a2fe198777c1803365cf6c089dd09fdc5cf55d6b636dd738" Oct 09 14:08:20 crc kubenswrapper[4902]: I1009 14:08:20.262434 4902 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b619d8bd59e67f5a2fe198777c1803365cf6c089dd09fdc5cf55d6b636dd738"} err="failed to get container status \"2b619d8bd59e67f5a2fe198777c1803365cf6c089dd09fdc5cf55d6b636dd738\": rpc error: code = NotFound desc = could not find container \"2b619d8bd59e67f5a2fe198777c1803365cf6c089dd09fdc5cf55d6b636dd738\": container with ID starting with 2b619d8bd59e67f5a2fe198777c1803365cf6c089dd09fdc5cf55d6b636dd738 not found: ID does not exist" Oct 09 14:08:20 crc kubenswrapper[4902]: I1009 14:08:20.262452 4902 scope.go:117] "RemoveContainer" containerID="4db62169ab8594fd9b15825ea76e7b7c7fc276b8e2c91d77724180e88d9ca247" Oct 09 14:08:20 crc kubenswrapper[4902]: E1009 14:08:20.263377 4902 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4db62169ab8594fd9b15825ea76e7b7c7fc276b8e2c91d77724180e88d9ca247\": container with ID starting with 4db62169ab8594fd9b15825ea76e7b7c7fc276b8e2c91d77724180e88d9ca247 not found: ID does not exist" containerID="4db62169ab8594fd9b15825ea76e7b7c7fc276b8e2c91d77724180e88d9ca247" Oct 09 14:08:20 crc kubenswrapper[4902]: I1009 14:08:20.263541 4902 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4db62169ab8594fd9b15825ea76e7b7c7fc276b8e2c91d77724180e88d9ca247"} err="failed to get container status \"4db62169ab8594fd9b15825ea76e7b7c7fc276b8e2c91d77724180e88d9ca247\": rpc error: code = NotFound desc = could not find container \"4db62169ab8594fd9b15825ea76e7b7c7fc276b8e2c91d77724180e88d9ca247\": container with ID starting with 4db62169ab8594fd9b15825ea76e7b7c7fc276b8e2c91d77724180e88d9ca247 not found: ID does not exist" Oct 09 14:08:20 crc kubenswrapper[4902]: I1009 14:08:20.352441 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/01e06434-8159-49be-bef4-3b0646427c20-run-httpd\") pod \"ceilometer-0\" (UID: \"01e06434-8159-49be-bef4-3b0646427c20\") " pod="openstack/ceilometer-0" Oct 09 14:08:20 crc kubenswrapper[4902]: I1009 14:08:20.352496 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/01e06434-8159-49be-bef4-3b0646427c20-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"01e06434-8159-49be-bef4-3b0646427c20\") " pod="openstack/ceilometer-0" Oct 09 14:08:20 crc kubenswrapper[4902]: I1009 14:08:20.352525 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01e06434-8159-49be-bef4-3b0646427c20-config-data\") pod \"ceilometer-0\" (UID: \"01e06434-8159-49be-bef4-3b0646427c20\") " pod="openstack/ceilometer-0" Oct 09 14:08:20 crc kubenswrapper[4902]: I1009 14:08:20.352549 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/01e06434-8159-49be-bef4-3b0646427c20-log-httpd\") pod \"ceilometer-0\" (UID: \"01e06434-8159-49be-bef4-3b0646427c20\") " pod="openstack/ceilometer-0" Oct 09 14:08:20 crc kubenswrapper[4902]: I1009 14:08:20.352576 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/01e06434-8159-49be-bef4-3b0646427c20-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"01e06434-8159-49be-bef4-3b0646427c20\") " pod="openstack/ceilometer-0" Oct 09 14:08:20 crc kubenswrapper[4902]: I1009 14:08:20.352666 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d94kl\" (UniqueName: \"kubernetes.io/projected/01e06434-8159-49be-bef4-3b0646427c20-kube-api-access-d94kl\") pod \"ceilometer-0\" (UID: \"01e06434-8159-49be-bef4-3b0646427c20\") " pod="openstack/ceilometer-0" Oct 09 14:08:20 crc kubenswrapper[4902]: I1009 14:08:20.352715 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01e06434-8159-49be-bef4-3b0646427c20-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"01e06434-8159-49be-bef4-3b0646427c20\") " pod="openstack/ceilometer-0" Oct 09 14:08:20 crc kubenswrapper[4902]: I1009 14:08:20.352735 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/01e06434-8159-49be-bef4-3b0646427c20-scripts\") pod \"ceilometer-0\" (UID: \"01e06434-8159-49be-bef4-3b0646427c20\") " pod="openstack/ceilometer-0" Oct 09 14:08:20 crc kubenswrapper[4902]: I1009 14:08:20.455113 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01e06434-8159-49be-bef4-3b0646427c20-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"01e06434-8159-49be-bef4-3b0646427c20\") " pod="openstack/ceilometer-0" Oct 09 14:08:20 crc kubenswrapper[4902]: I1009 14:08:20.455189 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/01e06434-8159-49be-bef4-3b0646427c20-scripts\") pod \"ceilometer-0\" (UID: \"01e06434-8159-49be-bef4-3b0646427c20\") " pod="openstack/ceilometer-0" Oct 09 14:08:20 crc kubenswrapper[4902]: I1009 14:08:20.455210 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/01e06434-8159-49be-bef4-3b0646427c20-run-httpd\") pod \"ceilometer-0\" (UID: \"01e06434-8159-49be-bef4-3b0646427c20\") " pod="openstack/ceilometer-0" Oct 09 14:08:20 crc kubenswrapper[4902]: I1009 14:08:20.455233 4902 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/01e06434-8159-49be-bef4-3b0646427c20-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"01e06434-8159-49be-bef4-3b0646427c20\") " pod="openstack/ceilometer-0" Oct 09 14:08:20 crc kubenswrapper[4902]: I1009 14:08:20.455284 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01e06434-8159-49be-bef4-3b0646427c20-config-data\") pod \"ceilometer-0\" (UID: \"01e06434-8159-49be-bef4-3b0646427c20\") " pod="openstack/ceilometer-0" Oct 09 14:08:20 crc kubenswrapper[4902]: I1009 14:08:20.455311 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/01e06434-8159-49be-bef4-3b0646427c20-log-httpd\") pod \"ceilometer-0\" (UID: \"01e06434-8159-49be-bef4-3b0646427c20\") " pod="openstack/ceilometer-0" Oct 09 14:08:20 crc kubenswrapper[4902]: I1009 14:08:20.455361 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/01e06434-8159-49be-bef4-3b0646427c20-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"01e06434-8159-49be-bef4-3b0646427c20\") " pod="openstack/ceilometer-0" Oct 09 14:08:20 crc kubenswrapper[4902]: I1009 14:08:20.455526 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d94kl\" (UniqueName: \"kubernetes.io/projected/01e06434-8159-49be-bef4-3b0646427c20-kube-api-access-d94kl\") pod \"ceilometer-0\" (UID: \"01e06434-8159-49be-bef4-3b0646427c20\") " pod="openstack/ceilometer-0" Oct 09 14:08:20 crc kubenswrapper[4902]: I1009 14:08:20.456119 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/01e06434-8159-49be-bef4-3b0646427c20-run-httpd\") pod \"ceilometer-0\" (UID: \"01e06434-8159-49be-bef4-3b0646427c20\") " pod="openstack/ceilometer-0" Oct 09 14:08:20 crc kubenswrapper[4902]: I1009 14:08:20.456374 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/01e06434-8159-49be-bef4-3b0646427c20-log-httpd\") pod \"ceilometer-0\" (UID: \"01e06434-8159-49be-bef4-3b0646427c20\") " pod="openstack/ceilometer-0" Oct 09 14:08:20 crc kubenswrapper[4902]: I1009 14:08:20.460568 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/01e06434-8159-49be-bef4-3b0646427c20-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"01e06434-8159-49be-bef4-3b0646427c20\") " pod="openstack/ceilometer-0" Oct 09 14:08:20 crc kubenswrapper[4902]: I1009 14:08:20.461590 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01e06434-8159-49be-bef4-3b0646427c20-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"01e06434-8159-49be-bef4-3b0646427c20\") " pod="openstack/ceilometer-0" Oct 09 14:08:20 crc kubenswrapper[4902]: I1009 14:08:20.462194 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/01e06434-8159-49be-bef4-3b0646427c20-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"01e06434-8159-49be-bef4-3b0646427c20\") " pod="openstack/ceilometer-0" Oct 09 14:08:20 crc kubenswrapper[4902]: I1009 14:08:20.462532 4902 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/01e06434-8159-49be-bef4-3b0646427c20-scripts\") pod \"ceilometer-0\" (UID: \"01e06434-8159-49be-bef4-3b0646427c20\") " pod="openstack/ceilometer-0" Oct 09 14:08:20 crc kubenswrapper[4902]: I1009 14:08:20.464469 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01e06434-8159-49be-bef4-3b0646427c20-config-data\") pod \"ceilometer-0\" (UID: \"01e06434-8159-49be-bef4-3b0646427c20\") " pod="openstack/ceilometer-0" Oct 09 14:08:20 crc kubenswrapper[4902]: I1009 14:08:20.473472 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d94kl\" (UniqueName: \"kubernetes.io/projected/01e06434-8159-49be-bef4-3b0646427c20-kube-api-access-d94kl\") pod \"ceilometer-0\" (UID: \"01e06434-8159-49be-bef4-3b0646427c20\") " pod="openstack/ceilometer-0" Oct 09 14:08:20 crc kubenswrapper[4902]: I1009 14:08:20.530128 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 09 14:08:20 crc kubenswrapper[4902]: I1009 14:08:20.958016 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 09 14:08:20 crc kubenswrapper[4902]: W1009 14:08:20.960366 4902 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod01e06434_8159_49be_bef4_3b0646427c20.slice/crio-3f27778818b73ebfd7cc3d19b81a9098120a035308bb1bcdfef4b9fbb41bfce7 WatchSource:0}: Error finding container 3f27778818b73ebfd7cc3d19b81a9098120a035308bb1bcdfef4b9fbb41bfce7: Status 404 returned error can't find the container with id 3f27778818b73ebfd7cc3d19b81a9098120a035308bb1bcdfef4b9fbb41bfce7 Oct 09 14:08:21 crc kubenswrapper[4902]: I1009 14:08:21.156361 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"01e06434-8159-49be-bef4-3b0646427c20","Type":"ContainerStarted","Data":"3f27778818b73ebfd7cc3d19b81a9098120a035308bb1bcdfef4b9fbb41bfce7"} Oct 09 14:08:21 crc kubenswrapper[4902]: I1009 14:08:21.525923 4902 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="82b65d50-0474-454d-92a1-05cb8a785834" path="/var/lib/kubelet/pods/82b65d50-0474-454d-92a1-05cb8a785834/volumes" Oct 09 14:08:23 crc kubenswrapper[4902]: I1009 14:08:23.173740 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"01e06434-8159-49be-bef4-3b0646427c20","Type":"ContainerStarted","Data":"c146e4602fb8421300703a64d218dbcbfdfbb8f0e23e6af90d555570933911b4"} Oct 09 14:08:24 crc kubenswrapper[4902]: I1009 14:08:24.192321 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"01e06434-8159-49be-bef4-3b0646427c20","Type":"ContainerStarted","Data":"b055b9f8d94f743e063474f0284c434bdbd97b2f18260dab590a849198b2f5a0"} Oct 09 14:08:24 crc kubenswrapper[4902]: I1009 14:08:24.192646 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"01e06434-8159-49be-bef4-3b0646427c20","Type":"ContainerStarted","Data":"1a347593ce5266cd36f313cdf1caf92dd776ef3cced53130eac9e662504e9855"} Oct 09 14:08:25 crc kubenswrapper[4902]: I1009 14:08:25.528630 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Oct 09 14:08:26 crc kubenswrapper[4902]: I1009 14:08:26.024915 4902 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack/nova-cell0-cell-mapping-qc4zz"] Oct 09 14:08:26 crc kubenswrapper[4902]: I1009 14:08:26.027036 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-qc4zz" Oct 09 14:08:26 crc kubenswrapper[4902]: I1009 14:08:26.032101 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Oct 09 14:08:26 crc kubenswrapper[4902]: I1009 14:08:26.032376 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Oct 09 14:08:26 crc kubenswrapper[4902]: I1009 14:08:26.038661 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-qc4zz"] Oct 09 14:08:26 crc kubenswrapper[4902]: I1009 14:08:26.168597 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1e9b526e-bd2b-4710-9225-669035576d7c-scripts\") pod \"nova-cell0-cell-mapping-qc4zz\" (UID: \"1e9b526e-bd2b-4710-9225-669035576d7c\") " pod="openstack/nova-cell0-cell-mapping-qc4zz" Oct 09 14:08:26 crc kubenswrapper[4902]: I1009 14:08:26.168665 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e9b526e-bd2b-4710-9225-669035576d7c-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-qc4zz\" (UID: \"1e9b526e-bd2b-4710-9225-669035576d7c\") " pod="openstack/nova-cell0-cell-mapping-qc4zz" Oct 09 14:08:26 crc kubenswrapper[4902]: I1009 14:08:26.168700 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z8hkp\" (UniqueName: \"kubernetes.io/projected/1e9b526e-bd2b-4710-9225-669035576d7c-kube-api-access-z8hkp\") pod \"nova-cell0-cell-mapping-qc4zz\" (UID: \"1e9b526e-bd2b-4710-9225-669035576d7c\") " pod="openstack/nova-cell0-cell-mapping-qc4zz" Oct 09 14:08:26 crc kubenswrapper[4902]: I1009 14:08:26.168836 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e9b526e-bd2b-4710-9225-669035576d7c-config-data\") pod \"nova-cell0-cell-mapping-qc4zz\" (UID: \"1e9b526e-bd2b-4710-9225-669035576d7c\") " pod="openstack/nova-cell0-cell-mapping-qc4zz" Oct 09 14:08:26 crc kubenswrapper[4902]: I1009 14:08:26.234927 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"01e06434-8159-49be-bef4-3b0646427c20","Type":"ContainerStarted","Data":"f70d91d8df2d3dc2c301a9d70852011dde73d796af26c2a9605b2344192cfe83"} Oct 09 14:08:26 crc kubenswrapper[4902]: I1009 14:08:26.235591 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 09 14:08:26 crc kubenswrapper[4902]: I1009 14:08:26.271678 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e9b526e-bd2b-4710-9225-669035576d7c-config-data\") pod \"nova-cell0-cell-mapping-qc4zz\" (UID: \"1e9b526e-bd2b-4710-9225-669035576d7c\") " pod="openstack/nova-cell0-cell-mapping-qc4zz" Oct 09 14:08:26 crc kubenswrapper[4902]: I1009 14:08:26.271823 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1e9b526e-bd2b-4710-9225-669035576d7c-scripts\") pod \"nova-cell0-cell-mapping-qc4zz\" (UID: 
\"1e9b526e-bd2b-4710-9225-669035576d7c\") " pod="openstack/nova-cell0-cell-mapping-qc4zz" Oct 09 14:08:26 crc kubenswrapper[4902]: I1009 14:08:26.271864 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e9b526e-bd2b-4710-9225-669035576d7c-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-qc4zz\" (UID: \"1e9b526e-bd2b-4710-9225-669035576d7c\") " pod="openstack/nova-cell0-cell-mapping-qc4zz" Oct 09 14:08:26 crc kubenswrapper[4902]: I1009 14:08:26.271898 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z8hkp\" (UniqueName: \"kubernetes.io/projected/1e9b526e-bd2b-4710-9225-669035576d7c-kube-api-access-z8hkp\") pod \"nova-cell0-cell-mapping-qc4zz\" (UID: \"1e9b526e-bd2b-4710-9225-669035576d7c\") " pod="openstack/nova-cell0-cell-mapping-qc4zz" Oct 09 14:08:26 crc kubenswrapper[4902]: I1009 14:08:26.281265 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e9b526e-bd2b-4710-9225-669035576d7c-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-qc4zz\" (UID: \"1e9b526e-bd2b-4710-9225-669035576d7c\") " pod="openstack/nova-cell0-cell-mapping-qc4zz" Oct 09 14:08:26 crc kubenswrapper[4902]: I1009 14:08:26.293181 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e9b526e-bd2b-4710-9225-669035576d7c-config-data\") pod \"nova-cell0-cell-mapping-qc4zz\" (UID: \"1e9b526e-bd2b-4710-9225-669035576d7c\") " pod="openstack/nova-cell0-cell-mapping-qc4zz" Oct 09 14:08:26 crc kubenswrapper[4902]: I1009 14:08:26.305981 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1e9b526e-bd2b-4710-9225-669035576d7c-scripts\") pod \"nova-cell0-cell-mapping-qc4zz\" (UID: \"1e9b526e-bd2b-4710-9225-669035576d7c\") " pod="openstack/nova-cell0-cell-mapping-qc4zz" Oct 09 14:08:26 crc kubenswrapper[4902]: I1009 14:08:26.308380 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 09 14:08:26 crc kubenswrapper[4902]: I1009 14:08:26.320458 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 09 14:08:26 crc kubenswrapper[4902]: I1009 14:08:26.337789 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 09 14:08:26 crc kubenswrapper[4902]: I1009 14:08:26.358151 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z8hkp\" (UniqueName: \"kubernetes.io/projected/1e9b526e-bd2b-4710-9225-669035576d7c-kube-api-access-z8hkp\") pod \"nova-cell0-cell-mapping-qc4zz\" (UID: \"1e9b526e-bd2b-4710-9225-669035576d7c\") " pod="openstack/nova-cell0-cell-mapping-qc4zz" Oct 09 14:08:26 crc kubenswrapper[4902]: I1009 14:08:26.366519 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.086639471 podStartE2EDuration="6.366499052s" podCreationTimestamp="2025-10-09 14:08:20 +0000 UTC" firstStartedPulling="2025-10-09 14:08:20.963092361 +0000 UTC m=+1048.160951425" lastFinishedPulling="2025-10-09 14:08:25.242951942 +0000 UTC m=+1052.440811006" observedRunningTime="2025-10-09 14:08:26.363522165 +0000 UTC m=+1053.561381239" watchObservedRunningTime="2025-10-09 14:08:26.366499052 +0000 UTC m=+1053.564358136" Oct 09 14:08:26 crc kubenswrapper[4902]: I1009 14:08:26.370848 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-qc4zz" Oct 09 14:08:26 crc kubenswrapper[4902]: I1009 14:08:26.373157 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2cd73e9-ca3d-4916-b5f5-06e4599ebe27-config-data\") pod \"nova-api-0\" (UID: \"c2cd73e9-ca3d-4916-b5f5-06e4599ebe27\") " pod="openstack/nova-api-0" Oct 09 14:08:26 crc kubenswrapper[4902]: I1009 14:08:26.373247 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jwh7j\" (UniqueName: \"kubernetes.io/projected/c2cd73e9-ca3d-4916-b5f5-06e4599ebe27-kube-api-access-jwh7j\") pod \"nova-api-0\" (UID: \"c2cd73e9-ca3d-4916-b5f5-06e4599ebe27\") " pod="openstack/nova-api-0" Oct 09 14:08:26 crc kubenswrapper[4902]: I1009 14:08:26.373280 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2cd73e9-ca3d-4916-b5f5-06e4599ebe27-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"c2cd73e9-ca3d-4916-b5f5-06e4599ebe27\") " pod="openstack/nova-api-0" Oct 09 14:08:26 crc kubenswrapper[4902]: I1009 14:08:26.373447 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c2cd73e9-ca3d-4916-b5f5-06e4599ebe27-logs\") pod \"nova-api-0\" (UID: \"c2cd73e9-ca3d-4916-b5f5-06e4599ebe27\") " pod="openstack/nova-api-0" Oct 09 14:08:26 crc kubenswrapper[4902]: I1009 14:08:26.394159 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 09 14:08:26 crc kubenswrapper[4902]: I1009 14:08:26.395676 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 09 14:08:26 crc kubenswrapper[4902]: I1009 14:08:26.415167 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Oct 09 14:08:26 crc kubenswrapper[4902]: I1009 14:08:26.466146 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 09 14:08:26 crc kubenswrapper[4902]: I1009 14:08:26.475096 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c2cd73e9-ca3d-4916-b5f5-06e4599ebe27-logs\") pod \"nova-api-0\" (UID: \"c2cd73e9-ca3d-4916-b5f5-06e4599ebe27\") " pod="openstack/nova-api-0" Oct 09 14:08:26 crc kubenswrapper[4902]: I1009 14:08:26.475183 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5860efa-fa25-4ab5-92e0-c6bb5a9b1e75-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"e5860efa-fa25-4ab5-92e0-c6bb5a9b1e75\") " pod="openstack/nova-cell1-novncproxy-0" Oct 09 14:08:26 crc kubenswrapper[4902]: I1009 14:08:26.475217 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2cd73e9-ca3d-4916-b5f5-06e4599ebe27-config-data\") pod \"nova-api-0\" (UID: \"c2cd73e9-ca3d-4916-b5f5-06e4599ebe27\") " pod="openstack/nova-api-0" Oct 09 14:08:26 crc kubenswrapper[4902]: I1009 14:08:26.475278 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nc9q7\" (UniqueName: \"kubernetes.io/projected/e5860efa-fa25-4ab5-92e0-c6bb5a9b1e75-kube-api-access-nc9q7\") pod \"nova-cell1-novncproxy-0\" (UID: \"e5860efa-fa25-4ab5-92e0-c6bb5a9b1e75\") " pod="openstack/nova-cell1-novncproxy-0" Oct 09 14:08:26 crc kubenswrapper[4902]: I1009 14:08:26.475314 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jwh7j\" (UniqueName: \"kubernetes.io/projected/c2cd73e9-ca3d-4916-b5f5-06e4599ebe27-kube-api-access-jwh7j\") pod \"nova-api-0\" (UID: \"c2cd73e9-ca3d-4916-b5f5-06e4599ebe27\") " pod="openstack/nova-api-0" Oct 09 14:08:26 crc kubenswrapper[4902]: I1009 14:08:26.475395 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2cd73e9-ca3d-4916-b5f5-06e4599ebe27-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"c2cd73e9-ca3d-4916-b5f5-06e4599ebe27\") " pod="openstack/nova-api-0" Oct 09 14:08:26 crc kubenswrapper[4902]: I1009 14:08:26.475517 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5860efa-fa25-4ab5-92e0-c6bb5a9b1e75-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"e5860efa-fa25-4ab5-92e0-c6bb5a9b1e75\") " pod="openstack/nova-cell1-novncproxy-0" Oct 09 14:08:26 crc kubenswrapper[4902]: I1009 14:08:26.476645 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c2cd73e9-ca3d-4916-b5f5-06e4599ebe27-logs\") pod \"nova-api-0\" (UID: \"c2cd73e9-ca3d-4916-b5f5-06e4599ebe27\") " pod="openstack/nova-api-0" Oct 09 14:08:26 crc kubenswrapper[4902]: I1009 14:08:26.487272 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/c2cd73e9-ca3d-4916-b5f5-06e4599ebe27-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"c2cd73e9-ca3d-4916-b5f5-06e4599ebe27\") " pod="openstack/nova-api-0" Oct 09 14:08:26 crc kubenswrapper[4902]: I1009 14:08:26.490195 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2cd73e9-ca3d-4916-b5f5-06e4599ebe27-config-data\") pod \"nova-api-0\" (UID: \"c2cd73e9-ca3d-4916-b5f5-06e4599ebe27\") " pod="openstack/nova-api-0" Oct 09 14:08:26 crc kubenswrapper[4902]: I1009 14:08:26.508250 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 09 14:08:26 crc kubenswrapper[4902]: I1009 14:08:26.517131 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jwh7j\" (UniqueName: \"kubernetes.io/projected/c2cd73e9-ca3d-4916-b5f5-06e4599ebe27-kube-api-access-jwh7j\") pod \"nova-api-0\" (UID: \"c2cd73e9-ca3d-4916-b5f5-06e4599ebe27\") " pod="openstack/nova-api-0" Oct 09 14:08:26 crc kubenswrapper[4902]: I1009 14:08:26.540432 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 09 14:08:26 crc kubenswrapper[4902]: I1009 14:08:26.541945 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 09 14:08:26 crc kubenswrapper[4902]: I1009 14:08:26.545493 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 09 14:08:26 crc kubenswrapper[4902]: I1009 14:08:26.578458 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5860efa-fa25-4ab5-92e0-c6bb5a9b1e75-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"e5860efa-fa25-4ab5-92e0-c6bb5a9b1e75\") " pod="openstack/nova-cell1-novncproxy-0" Oct 09 14:08:26 crc kubenswrapper[4902]: I1009 14:08:26.578553 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nc9q7\" (UniqueName: \"kubernetes.io/projected/e5860efa-fa25-4ab5-92e0-c6bb5a9b1e75-kube-api-access-nc9q7\") pod \"nova-cell1-novncproxy-0\" (UID: \"e5860efa-fa25-4ab5-92e0-c6bb5a9b1e75\") " pod="openstack/nova-cell1-novncproxy-0" Oct 09 14:08:26 crc kubenswrapper[4902]: I1009 14:08:26.578685 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5860efa-fa25-4ab5-92e0-c6bb5a9b1e75-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"e5860efa-fa25-4ab5-92e0-c6bb5a9b1e75\") " pod="openstack/nova-cell1-novncproxy-0" Oct 09 14:08:26 crc kubenswrapper[4902]: I1009 14:08:26.589546 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5860efa-fa25-4ab5-92e0-c6bb5a9b1e75-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"e5860efa-fa25-4ab5-92e0-c6bb5a9b1e75\") " pod="openstack/nova-cell1-novncproxy-0" Oct 09 14:08:26 crc kubenswrapper[4902]: I1009 14:08:26.598282 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 09 14:08:26 crc kubenswrapper[4902]: I1009 14:08:26.604107 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5860efa-fa25-4ab5-92e0-c6bb5a9b1e75-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"e5860efa-fa25-4ab5-92e0-c6bb5a9b1e75\") " 
pod="openstack/nova-cell1-novncproxy-0" Oct 09 14:08:26 crc kubenswrapper[4902]: I1009 14:08:26.604180 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nc9q7\" (UniqueName: \"kubernetes.io/projected/e5860efa-fa25-4ab5-92e0-c6bb5a9b1e75-kube-api-access-nc9q7\") pod \"nova-cell1-novncproxy-0\" (UID: \"e5860efa-fa25-4ab5-92e0-c6bb5a9b1e75\") " pod="openstack/nova-cell1-novncproxy-0" Oct 09 14:08:26 crc kubenswrapper[4902]: I1009 14:08:26.680039 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b42ec27-52a6-459c-a266-7a6dba2acfab-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"8b42ec27-52a6-459c-a266-7a6dba2acfab\") " pod="openstack/nova-scheduler-0" Oct 09 14:08:26 crc kubenswrapper[4902]: I1009 14:08:26.680555 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b2d79\" (UniqueName: \"kubernetes.io/projected/8b42ec27-52a6-459c-a266-7a6dba2acfab-kube-api-access-b2d79\") pod \"nova-scheduler-0\" (UID: \"8b42ec27-52a6-459c-a266-7a6dba2acfab\") " pod="openstack/nova-scheduler-0" Oct 09 14:08:26 crc kubenswrapper[4902]: I1009 14:08:26.687492 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b42ec27-52a6-459c-a266-7a6dba2acfab-config-data\") pod \"nova-scheduler-0\" (UID: \"8b42ec27-52a6-459c-a266-7a6dba2acfab\") " pod="openstack/nova-scheduler-0" Oct 09 14:08:26 crc kubenswrapper[4902]: I1009 14:08:26.700977 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 09 14:08:26 crc kubenswrapper[4902]: I1009 14:08:26.702857 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 09 14:08:26 crc kubenswrapper[4902]: I1009 14:08:26.707272 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 09 14:08:26 crc kubenswrapper[4902]: I1009 14:08:26.733738 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 09 14:08:26 crc kubenswrapper[4902]: I1009 14:08:26.813499 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-845d6d6f59-425tz"] Oct 09 14:08:26 crc kubenswrapper[4902]: I1009 14:08:26.839939 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b42ec27-52a6-459c-a266-7a6dba2acfab-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"8b42ec27-52a6-459c-a266-7a6dba2acfab\") " pod="openstack/nova-scheduler-0" Oct 09 14:08:26 crc kubenswrapper[4902]: I1009 14:08:26.840039 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b2d79\" (UniqueName: \"kubernetes.io/projected/8b42ec27-52a6-459c-a266-7a6dba2acfab-kube-api-access-b2d79\") pod \"nova-scheduler-0\" (UID: \"8b42ec27-52a6-459c-a266-7a6dba2acfab\") " pod="openstack/nova-scheduler-0" Oct 09 14:08:26 crc kubenswrapper[4902]: I1009 14:08:26.840119 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c04f9588-77ae-4927-b59d-55db8d7764ae-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c04f9588-77ae-4927-b59d-55db8d7764ae\") " pod="openstack/nova-metadata-0" Oct 09 14:08:26 crc kubenswrapper[4902]: I1009 14:08:26.840209 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c04f9588-77ae-4927-b59d-55db8d7764ae-config-data\") pod \"nova-metadata-0\" (UID: \"c04f9588-77ae-4927-b59d-55db8d7764ae\") " pod="openstack/nova-metadata-0" Oct 09 14:08:26 crc kubenswrapper[4902]: I1009 14:08:26.840309 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b42ec27-52a6-459c-a266-7a6dba2acfab-config-data\") pod \"nova-scheduler-0\" (UID: \"8b42ec27-52a6-459c-a266-7a6dba2acfab\") " pod="openstack/nova-scheduler-0" Oct 09 14:08:26 crc kubenswrapper[4902]: I1009 14:08:26.840393 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cfddw\" (UniqueName: \"kubernetes.io/projected/c04f9588-77ae-4927-b59d-55db8d7764ae-kube-api-access-cfddw\") pod \"nova-metadata-0\" (UID: \"c04f9588-77ae-4927-b59d-55db8d7764ae\") " pod="openstack/nova-metadata-0" Oct 09 14:08:26 crc kubenswrapper[4902]: I1009 14:08:26.840517 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c04f9588-77ae-4927-b59d-55db8d7764ae-logs\") pod \"nova-metadata-0\" (UID: \"c04f9588-77ae-4927-b59d-55db8d7764ae\") " pod="openstack/nova-metadata-0" Oct 09 14:08:26 crc kubenswrapper[4902]: I1009 14:08:26.840769 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 09 14:08:26 crc kubenswrapper[4902]: I1009 14:08:26.844294 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-845d6d6f59-425tz" Oct 09 14:08:26 crc kubenswrapper[4902]: I1009 14:08:26.846930 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 09 14:08:26 crc kubenswrapper[4902]: I1009 14:08:26.849473 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b42ec27-52a6-459c-a266-7a6dba2acfab-config-data\") pod \"nova-scheduler-0\" (UID: \"8b42ec27-52a6-459c-a266-7a6dba2acfab\") " pod="openstack/nova-scheduler-0" Oct 09 14:08:26 crc kubenswrapper[4902]: I1009 14:08:26.856213 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b42ec27-52a6-459c-a266-7a6dba2acfab-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"8b42ec27-52a6-459c-a266-7a6dba2acfab\") " pod="openstack/nova-scheduler-0" Oct 09 14:08:26 crc kubenswrapper[4902]: I1009 14:08:26.869861 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b2d79\" (UniqueName: \"kubernetes.io/projected/8b42ec27-52a6-459c-a266-7a6dba2acfab-kube-api-access-b2d79\") pod \"nova-scheduler-0\" (UID: \"8b42ec27-52a6-459c-a266-7a6dba2acfab\") " pod="openstack/nova-scheduler-0" Oct 09 14:08:26 crc kubenswrapper[4902]: I1009 14:08:26.893040 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 09 14:08:26 crc kubenswrapper[4902]: I1009 14:08:26.903062 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-845d6d6f59-425tz"] Oct 09 14:08:26 crc kubenswrapper[4902]: I1009 14:08:26.949462 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c04f9588-77ae-4927-b59d-55db8d7764ae-logs\") pod \"nova-metadata-0\" (UID: \"c04f9588-77ae-4927-b59d-55db8d7764ae\") " pod="openstack/nova-metadata-0" Oct 09 14:08:26 crc kubenswrapper[4902]: I1009 14:08:26.949509 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8b7bf8d6-8d86-4b2d-8a31-268090d3e11a-ovsdbserver-sb\") pod \"dnsmasq-dns-845d6d6f59-425tz\" (UID: \"8b7bf8d6-8d86-4b2d-8a31-268090d3e11a\") " pod="openstack/dnsmasq-dns-845d6d6f59-425tz" Oct 09 14:08:26 crc kubenswrapper[4902]: I1009 14:08:26.949648 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c04f9588-77ae-4927-b59d-55db8d7764ae-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c04f9588-77ae-4927-b59d-55db8d7764ae\") " pod="openstack/nova-metadata-0" Oct 09 14:08:26 crc kubenswrapper[4902]: I1009 14:08:26.949680 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8gd2x\" (UniqueName: \"kubernetes.io/projected/8b7bf8d6-8d86-4b2d-8a31-268090d3e11a-kube-api-access-8gd2x\") pod \"dnsmasq-dns-845d6d6f59-425tz\" (UID: \"8b7bf8d6-8d86-4b2d-8a31-268090d3e11a\") " pod="openstack/dnsmasq-dns-845d6d6f59-425tz" Oct 09 14:08:26 crc kubenswrapper[4902]: I1009 14:08:26.949702 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c04f9588-77ae-4927-b59d-55db8d7764ae-config-data\") pod \"nova-metadata-0\" (UID: \"c04f9588-77ae-4927-b59d-55db8d7764ae\") " pod="openstack/nova-metadata-0" Oct 09 
14:08:26 crc kubenswrapper[4902]: I1009 14:08:26.949726 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b7bf8d6-8d86-4b2d-8a31-268090d3e11a-config\") pod \"dnsmasq-dns-845d6d6f59-425tz\" (UID: \"8b7bf8d6-8d86-4b2d-8a31-268090d3e11a\") " pod="openstack/dnsmasq-dns-845d6d6f59-425tz" Oct 09 14:08:26 crc kubenswrapper[4902]: I1009 14:08:26.949759 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8b7bf8d6-8d86-4b2d-8a31-268090d3e11a-dns-svc\") pod \"dnsmasq-dns-845d6d6f59-425tz\" (UID: \"8b7bf8d6-8d86-4b2d-8a31-268090d3e11a\") " pod="openstack/dnsmasq-dns-845d6d6f59-425tz" Oct 09 14:08:26 crc kubenswrapper[4902]: I1009 14:08:26.950170 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c04f9588-77ae-4927-b59d-55db8d7764ae-logs\") pod \"nova-metadata-0\" (UID: \"c04f9588-77ae-4927-b59d-55db8d7764ae\") " pod="openstack/nova-metadata-0" Oct 09 14:08:26 crc kubenswrapper[4902]: I1009 14:08:26.949783 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8b7bf8d6-8d86-4b2d-8a31-268090d3e11a-ovsdbserver-nb\") pod \"dnsmasq-dns-845d6d6f59-425tz\" (UID: \"8b7bf8d6-8d86-4b2d-8a31-268090d3e11a\") " pod="openstack/dnsmasq-dns-845d6d6f59-425tz" Oct 09 14:08:26 crc kubenswrapper[4902]: I1009 14:08:26.950338 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cfddw\" (UniqueName: \"kubernetes.io/projected/c04f9588-77ae-4927-b59d-55db8d7764ae-kube-api-access-cfddw\") pod \"nova-metadata-0\" (UID: \"c04f9588-77ae-4927-b59d-55db8d7764ae\") " pod="openstack/nova-metadata-0" Oct 09 14:08:26 crc kubenswrapper[4902]: I1009 14:08:26.950439 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8b7bf8d6-8d86-4b2d-8a31-268090d3e11a-dns-swift-storage-0\") pod \"dnsmasq-dns-845d6d6f59-425tz\" (UID: \"8b7bf8d6-8d86-4b2d-8a31-268090d3e11a\") " pod="openstack/dnsmasq-dns-845d6d6f59-425tz" Oct 09 14:08:26 crc kubenswrapper[4902]: I1009 14:08:26.954550 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c04f9588-77ae-4927-b59d-55db8d7764ae-config-data\") pod \"nova-metadata-0\" (UID: \"c04f9588-77ae-4927-b59d-55db8d7764ae\") " pod="openstack/nova-metadata-0" Oct 09 14:08:26 crc kubenswrapper[4902]: I1009 14:08:26.958534 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c04f9588-77ae-4927-b59d-55db8d7764ae-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c04f9588-77ae-4927-b59d-55db8d7764ae\") " pod="openstack/nova-metadata-0" Oct 09 14:08:26 crc kubenswrapper[4902]: I1009 14:08:26.971741 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cfddw\" (UniqueName: \"kubernetes.io/projected/c04f9588-77ae-4927-b59d-55db8d7764ae-kube-api-access-cfddw\") pod \"nova-metadata-0\" (UID: \"c04f9588-77ae-4927-b59d-55db8d7764ae\") " pod="openstack/nova-metadata-0" Oct 09 14:08:27 crc kubenswrapper[4902]: I1009 14:08:27.062092 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-8gd2x\" (UniqueName: \"kubernetes.io/projected/8b7bf8d6-8d86-4b2d-8a31-268090d3e11a-kube-api-access-8gd2x\") pod \"dnsmasq-dns-845d6d6f59-425tz\" (UID: \"8b7bf8d6-8d86-4b2d-8a31-268090d3e11a\") " pod="openstack/dnsmasq-dns-845d6d6f59-425tz" Oct 09 14:08:27 crc kubenswrapper[4902]: I1009 14:08:27.062140 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b7bf8d6-8d86-4b2d-8a31-268090d3e11a-config\") pod \"dnsmasq-dns-845d6d6f59-425tz\" (UID: \"8b7bf8d6-8d86-4b2d-8a31-268090d3e11a\") " pod="openstack/dnsmasq-dns-845d6d6f59-425tz" Oct 09 14:08:27 crc kubenswrapper[4902]: I1009 14:08:27.062174 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8b7bf8d6-8d86-4b2d-8a31-268090d3e11a-dns-svc\") pod \"dnsmasq-dns-845d6d6f59-425tz\" (UID: \"8b7bf8d6-8d86-4b2d-8a31-268090d3e11a\") " pod="openstack/dnsmasq-dns-845d6d6f59-425tz" Oct 09 14:08:27 crc kubenswrapper[4902]: I1009 14:08:27.062197 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8b7bf8d6-8d86-4b2d-8a31-268090d3e11a-ovsdbserver-nb\") pod \"dnsmasq-dns-845d6d6f59-425tz\" (UID: \"8b7bf8d6-8d86-4b2d-8a31-268090d3e11a\") " pod="openstack/dnsmasq-dns-845d6d6f59-425tz" Oct 09 14:08:27 crc kubenswrapper[4902]: I1009 14:08:27.062228 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8b7bf8d6-8d86-4b2d-8a31-268090d3e11a-dns-swift-storage-0\") pod \"dnsmasq-dns-845d6d6f59-425tz\" (UID: \"8b7bf8d6-8d86-4b2d-8a31-268090d3e11a\") " pod="openstack/dnsmasq-dns-845d6d6f59-425tz" Oct 09 14:08:27 crc kubenswrapper[4902]: I1009 14:08:27.062263 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8b7bf8d6-8d86-4b2d-8a31-268090d3e11a-ovsdbserver-sb\") pod \"dnsmasq-dns-845d6d6f59-425tz\" (UID: \"8b7bf8d6-8d86-4b2d-8a31-268090d3e11a\") " pod="openstack/dnsmasq-dns-845d6d6f59-425tz" Oct 09 14:08:27 crc kubenswrapper[4902]: I1009 14:08:27.063429 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8b7bf8d6-8d86-4b2d-8a31-268090d3e11a-ovsdbserver-nb\") pod \"dnsmasq-dns-845d6d6f59-425tz\" (UID: \"8b7bf8d6-8d86-4b2d-8a31-268090d3e11a\") " pod="openstack/dnsmasq-dns-845d6d6f59-425tz" Oct 09 14:08:27 crc kubenswrapper[4902]: I1009 14:08:27.063757 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8b7bf8d6-8d86-4b2d-8a31-268090d3e11a-dns-svc\") pod \"dnsmasq-dns-845d6d6f59-425tz\" (UID: \"8b7bf8d6-8d86-4b2d-8a31-268090d3e11a\") " pod="openstack/dnsmasq-dns-845d6d6f59-425tz" Oct 09 14:08:27 crc kubenswrapper[4902]: I1009 14:08:27.063872 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8b7bf8d6-8d86-4b2d-8a31-268090d3e11a-dns-swift-storage-0\") pod \"dnsmasq-dns-845d6d6f59-425tz\" (UID: \"8b7bf8d6-8d86-4b2d-8a31-268090d3e11a\") " pod="openstack/dnsmasq-dns-845d6d6f59-425tz" Oct 09 14:08:27 crc kubenswrapper[4902]: I1009 14:08:27.064073 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/8b7bf8d6-8d86-4b2d-8a31-268090d3e11a-config\") pod \"dnsmasq-dns-845d6d6f59-425tz\" (UID: \"8b7bf8d6-8d86-4b2d-8a31-268090d3e11a\") " pod="openstack/dnsmasq-dns-845d6d6f59-425tz" Oct 09 14:08:27 crc kubenswrapper[4902]: I1009 14:08:27.064325 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8b7bf8d6-8d86-4b2d-8a31-268090d3e11a-ovsdbserver-sb\") pod \"dnsmasq-dns-845d6d6f59-425tz\" (UID: \"8b7bf8d6-8d86-4b2d-8a31-268090d3e11a\") " pod="openstack/dnsmasq-dns-845d6d6f59-425tz" Oct 09 14:08:27 crc kubenswrapper[4902]: I1009 14:08:27.066873 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 09 14:08:27 crc kubenswrapper[4902]: I1009 14:08:27.087390 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8gd2x\" (UniqueName: \"kubernetes.io/projected/8b7bf8d6-8d86-4b2d-8a31-268090d3e11a-kube-api-access-8gd2x\") pod \"dnsmasq-dns-845d6d6f59-425tz\" (UID: \"8b7bf8d6-8d86-4b2d-8a31-268090d3e11a\") " pod="openstack/dnsmasq-dns-845d6d6f59-425tz" Oct 09 14:08:27 crc kubenswrapper[4902]: I1009 14:08:27.137660 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-qc4zz"] Oct 09 14:08:27 crc kubenswrapper[4902]: W1009 14:08:27.167459 4902 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1e9b526e_bd2b_4710_9225_669035576d7c.slice/crio-197695de3acd8e4cc832ddfff4c6580e1170dee810303ed25ca8ac20bdf88d68 WatchSource:0}: Error finding container 197695de3acd8e4cc832ddfff4c6580e1170dee810303ed25ca8ac20bdf88d68: Status 404 returned error can't find the container with id 197695de3acd8e4cc832ddfff4c6580e1170dee810303ed25ca8ac20bdf88d68 Oct 09 14:08:27 crc kubenswrapper[4902]: I1009 14:08:27.185028 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-845d6d6f59-425tz" Oct 09 14:08:27 crc kubenswrapper[4902]: I1009 14:08:27.297155 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-qc4zz" event={"ID":"1e9b526e-bd2b-4710-9225-669035576d7c","Type":"ContainerStarted","Data":"197695de3acd8e4cc832ddfff4c6580e1170dee810303ed25ca8ac20bdf88d68"} Oct 09 14:08:27 crc kubenswrapper[4902]: I1009 14:08:27.301713 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-ssjrs"] Oct 09 14:08:27 crc kubenswrapper[4902]: I1009 14:08:27.302831 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-ssjrs" Oct 09 14:08:27 crc kubenswrapper[4902]: I1009 14:08:27.311649 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Oct 09 14:08:27 crc kubenswrapper[4902]: I1009 14:08:27.312003 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Oct 09 14:08:27 crc kubenswrapper[4902]: I1009 14:08:27.326615 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-ssjrs"] Oct 09 14:08:27 crc kubenswrapper[4902]: I1009 14:08:27.385591 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc21b931-be3c-442f-b886-3f245f88e079-config-data\") pod \"nova-cell1-conductor-db-sync-ssjrs\" (UID: \"cc21b931-be3c-442f-b886-3f245f88e079\") " pod="openstack/nova-cell1-conductor-db-sync-ssjrs" Oct 09 14:08:27 crc kubenswrapper[4902]: I1009 14:08:27.385640 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cc21b931-be3c-442f-b886-3f245f88e079-scripts\") pod \"nova-cell1-conductor-db-sync-ssjrs\" (UID: \"cc21b931-be3c-442f-b886-3f245f88e079\") " pod="openstack/nova-cell1-conductor-db-sync-ssjrs" Oct 09 14:08:27 crc kubenswrapper[4902]: I1009 14:08:27.385665 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bbcfn\" (UniqueName: \"kubernetes.io/projected/cc21b931-be3c-442f-b886-3f245f88e079-kube-api-access-bbcfn\") pod \"nova-cell1-conductor-db-sync-ssjrs\" (UID: \"cc21b931-be3c-442f-b886-3f245f88e079\") " pod="openstack/nova-cell1-conductor-db-sync-ssjrs" Oct 09 14:08:27 crc kubenswrapper[4902]: I1009 14:08:27.385728 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc21b931-be3c-442f-b886-3f245f88e079-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-ssjrs\" (UID: \"cc21b931-be3c-442f-b886-3f245f88e079\") " pod="openstack/nova-cell1-conductor-db-sync-ssjrs" Oct 09 14:08:27 crc kubenswrapper[4902]: I1009 14:08:27.469131 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 09 14:08:27 crc kubenswrapper[4902]: I1009 14:08:27.490303 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc21b931-be3c-442f-b886-3f245f88e079-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-ssjrs\" (UID: \"cc21b931-be3c-442f-b886-3f245f88e079\") " pod="openstack/nova-cell1-conductor-db-sync-ssjrs" Oct 09 14:08:27 crc kubenswrapper[4902]: I1009 14:08:27.490457 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc21b931-be3c-442f-b886-3f245f88e079-config-data\") pod \"nova-cell1-conductor-db-sync-ssjrs\" (UID: \"cc21b931-be3c-442f-b886-3f245f88e079\") " pod="openstack/nova-cell1-conductor-db-sync-ssjrs" Oct 09 14:08:27 crc kubenswrapper[4902]: I1009 14:08:27.490480 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cc21b931-be3c-442f-b886-3f245f88e079-scripts\") pod \"nova-cell1-conductor-db-sync-ssjrs\" (UID: 
\"cc21b931-be3c-442f-b886-3f245f88e079\") " pod="openstack/nova-cell1-conductor-db-sync-ssjrs" Oct 09 14:08:27 crc kubenswrapper[4902]: I1009 14:08:27.490502 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bbcfn\" (UniqueName: \"kubernetes.io/projected/cc21b931-be3c-442f-b886-3f245f88e079-kube-api-access-bbcfn\") pod \"nova-cell1-conductor-db-sync-ssjrs\" (UID: \"cc21b931-be3c-442f-b886-3f245f88e079\") " pod="openstack/nova-cell1-conductor-db-sync-ssjrs" Oct 09 14:08:27 crc kubenswrapper[4902]: I1009 14:08:27.499173 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cc21b931-be3c-442f-b886-3f245f88e079-scripts\") pod \"nova-cell1-conductor-db-sync-ssjrs\" (UID: \"cc21b931-be3c-442f-b886-3f245f88e079\") " pod="openstack/nova-cell1-conductor-db-sync-ssjrs" Oct 09 14:08:27 crc kubenswrapper[4902]: I1009 14:08:27.500121 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc21b931-be3c-442f-b886-3f245f88e079-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-ssjrs\" (UID: \"cc21b931-be3c-442f-b886-3f245f88e079\") " pod="openstack/nova-cell1-conductor-db-sync-ssjrs" Oct 09 14:08:27 crc kubenswrapper[4902]: I1009 14:08:27.502178 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc21b931-be3c-442f-b886-3f245f88e079-config-data\") pod \"nova-cell1-conductor-db-sync-ssjrs\" (UID: \"cc21b931-be3c-442f-b886-3f245f88e079\") " pod="openstack/nova-cell1-conductor-db-sync-ssjrs" Oct 09 14:08:27 crc kubenswrapper[4902]: I1009 14:08:27.519023 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bbcfn\" (UniqueName: \"kubernetes.io/projected/cc21b931-be3c-442f-b886-3f245f88e079-kube-api-access-bbcfn\") pod \"nova-cell1-conductor-db-sync-ssjrs\" (UID: \"cc21b931-be3c-442f-b886-3f245f88e079\") " pod="openstack/nova-cell1-conductor-db-sync-ssjrs" Oct 09 14:08:27 crc kubenswrapper[4902]: I1009 14:08:27.584478 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 09 14:08:27 crc kubenswrapper[4902]: I1009 14:08:27.596079 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 09 14:08:27 crc kubenswrapper[4902]: I1009 14:08:27.680877 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-ssjrs" Oct 09 14:08:27 crc kubenswrapper[4902]: I1009 14:08:27.832131 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 09 14:08:27 crc kubenswrapper[4902]: I1009 14:08:27.845008 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-845d6d6f59-425tz"] Oct 09 14:08:27 crc kubenswrapper[4902]: W1009 14:08:27.846574 4902 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc04f9588_77ae_4927_b59d_55db8d7764ae.slice/crio-3cdf2e728bba544469810c5bf924e1ea4b2df5a0e0f94ba9f5904fd60451e3b3 WatchSource:0}: Error finding container 3cdf2e728bba544469810c5bf924e1ea4b2df5a0e0f94ba9f5904fd60451e3b3: Status 404 returned error can't find the container with id 3cdf2e728bba544469810c5bf924e1ea4b2df5a0e0f94ba9f5904fd60451e3b3 Oct 09 14:08:27 crc kubenswrapper[4902]: W1009 14:08:27.849833 4902 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8b7bf8d6_8d86_4b2d_8a31_268090d3e11a.slice/crio-5f28514b0980442e544342767198e25677af92740d4a02b45d0f03a1a228fc2f WatchSource:0}: Error finding container 5f28514b0980442e544342767198e25677af92740d4a02b45d0f03a1a228fc2f: Status 404 returned error can't find the container with id 5f28514b0980442e544342767198e25677af92740d4a02b45d0f03a1a228fc2f Oct 09 14:08:28 crc kubenswrapper[4902]: I1009 14:08:28.144688 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-ssjrs"] Oct 09 14:08:28 crc kubenswrapper[4902]: W1009 14:08:28.219798 4902 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcc21b931_be3c_442f_b886_3f245f88e079.slice/crio-93e2d9accdaf257fc7106dc11f01a138659844fcf518d17d7402487982879669 WatchSource:0}: Error finding container 93e2d9accdaf257fc7106dc11f01a138659844fcf518d17d7402487982879669: Status 404 returned error can't find the container with id 93e2d9accdaf257fc7106dc11f01a138659844fcf518d17d7402487982879669 Oct 09 14:08:28 crc kubenswrapper[4902]: I1009 14:08:28.315613 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"8b42ec27-52a6-459c-a266-7a6dba2acfab","Type":"ContainerStarted","Data":"fef3c1693a25564277ec5da1f63d050c9b6b7d4265f389a6cc55821d21bf8901"} Oct 09 14:08:28 crc kubenswrapper[4902]: I1009 14:08:28.317612 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c04f9588-77ae-4927-b59d-55db8d7764ae","Type":"ContainerStarted","Data":"3cdf2e728bba544469810c5bf924e1ea4b2df5a0e0f94ba9f5904fd60451e3b3"} Oct 09 14:08:28 crc kubenswrapper[4902]: I1009 14:08:28.320200 4902 generic.go:334] "Generic (PLEG): container finished" podID="8b7bf8d6-8d86-4b2d-8a31-268090d3e11a" containerID="a243c737804d7ac5228fbba62f60616b0aceb57e80070c119efd5e9023735c2e" exitCode=0 Oct 09 14:08:28 crc kubenswrapper[4902]: I1009 14:08:28.320338 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-845d6d6f59-425tz" event={"ID":"8b7bf8d6-8d86-4b2d-8a31-268090d3e11a","Type":"ContainerDied","Data":"a243c737804d7ac5228fbba62f60616b0aceb57e80070c119efd5e9023735c2e"} Oct 09 14:08:28 crc kubenswrapper[4902]: I1009 14:08:28.320382 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-845d6d6f59-425tz" 
event={"ID":"8b7bf8d6-8d86-4b2d-8a31-268090d3e11a","Type":"ContainerStarted","Data":"5f28514b0980442e544342767198e25677af92740d4a02b45d0f03a1a228fc2f"} Oct 09 14:08:28 crc kubenswrapper[4902]: I1009 14:08:28.323331 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-ssjrs" event={"ID":"cc21b931-be3c-442f-b886-3f245f88e079","Type":"ContainerStarted","Data":"93e2d9accdaf257fc7106dc11f01a138659844fcf518d17d7402487982879669"} Oct 09 14:08:28 crc kubenswrapper[4902]: I1009 14:08:28.327684 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c2cd73e9-ca3d-4916-b5f5-06e4599ebe27","Type":"ContainerStarted","Data":"dee094ebc32ad29b90aa09dfecbd343d40228ef897e93765112b07dde6439eef"} Oct 09 14:08:28 crc kubenswrapper[4902]: I1009 14:08:28.332777 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"e5860efa-fa25-4ab5-92e0-c6bb5a9b1e75","Type":"ContainerStarted","Data":"d37eaeaaf8dfd0b02823e76e80c809208681edf731dbd9b5fdc106eaae22eb8a"} Oct 09 14:08:28 crc kubenswrapper[4902]: I1009 14:08:28.336847 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-qc4zz" event={"ID":"1e9b526e-bd2b-4710-9225-669035576d7c","Type":"ContainerStarted","Data":"aac8618970479222f2878823905a1d2d396933c89907b7f32ea6296a3c8232a4"} Oct 09 14:08:28 crc kubenswrapper[4902]: I1009 14:08:28.386582 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-qc4zz" podStartSLOduration=3.386559089 podStartE2EDuration="3.386559089s" podCreationTimestamp="2025-10-09 14:08:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 14:08:28.366886851 +0000 UTC m=+1055.564745915" watchObservedRunningTime="2025-10-09 14:08:28.386559089 +0000 UTC m=+1055.584418153" Oct 09 14:08:29 crc kubenswrapper[4902]: I1009 14:08:29.355861 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-ssjrs" event={"ID":"cc21b931-be3c-442f-b886-3f245f88e079","Type":"ContainerStarted","Data":"ab0c57d28e288cef2fb78a7e8287cd91048ac50617bc07e6add91f401c5e73cb"} Oct 09 14:08:29 crc kubenswrapper[4902]: I1009 14:08:29.363383 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-845d6d6f59-425tz" event={"ID":"8b7bf8d6-8d86-4b2d-8a31-268090d3e11a","Type":"ContainerStarted","Data":"25be80322b10ec09e7df7ccfe8b41efe33db5305aea9af40873cfc9725d576a0"} Oct 09 14:08:29 crc kubenswrapper[4902]: I1009 14:08:29.363526 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-845d6d6f59-425tz" Oct 09 14:08:29 crc kubenswrapper[4902]: I1009 14:08:29.398922 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-ssjrs" podStartSLOduration=2.398899351 podStartE2EDuration="2.398899351s" podCreationTimestamp="2025-10-09 14:08:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 14:08:29.37031043 +0000 UTC m=+1056.568169524" watchObservedRunningTime="2025-10-09 14:08:29.398899351 +0000 UTC m=+1056.596758455" Oct 09 14:08:29 crc kubenswrapper[4902]: I1009 14:08:29.412178 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-845d6d6f59-425tz" 
podStartSLOduration=3.41215284 podStartE2EDuration="3.41215284s" podCreationTimestamp="2025-10-09 14:08:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 14:08:29.392142012 +0000 UTC m=+1056.590001086" watchObservedRunningTime="2025-10-09 14:08:29.41215284 +0000 UTC m=+1056.610011894" Oct 09 14:08:30 crc kubenswrapper[4902]: I1009 14:08:30.247798 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 09 14:08:30 crc kubenswrapper[4902]: I1009 14:08:30.260185 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 09 14:08:32 crc kubenswrapper[4902]: I1009 14:08:32.411333 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c2cd73e9-ca3d-4916-b5f5-06e4599ebe27","Type":"ContainerStarted","Data":"870853645e5414b62c8f3f24ea5bff10fbbf90cdbf43eb25db723578c4bc4c2e"} Oct 09 14:08:32 crc kubenswrapper[4902]: I1009 14:08:32.411641 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c2cd73e9-ca3d-4916-b5f5-06e4599ebe27","Type":"ContainerStarted","Data":"85f2795ee28fe87103b924cc9ae2b4f2f844306074987dc7fdd1bc7cfcd3e3cb"} Oct 09 14:08:32 crc kubenswrapper[4902]: I1009 14:08:32.413700 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"e5860efa-fa25-4ab5-92e0-c6bb5a9b1e75","Type":"ContainerStarted","Data":"73223dc891fcb549c3e774588c1807e08b0523ac825c56d90381543ba1bfcdbe"} Oct 09 14:08:32 crc kubenswrapper[4902]: I1009 14:08:32.413894 4902 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="e5860efa-fa25-4ab5-92e0-c6bb5a9b1e75" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://73223dc891fcb549c3e774588c1807e08b0523ac825c56d90381543ba1bfcdbe" gracePeriod=30 Oct 09 14:08:32 crc kubenswrapper[4902]: I1009 14:08:32.417820 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"8b42ec27-52a6-459c-a266-7a6dba2acfab","Type":"ContainerStarted","Data":"56dd803ec4adc0c384f83e0033d424c96bbaf99dfb40425f5af56e635b40bc62"} Oct 09 14:08:32 crc kubenswrapper[4902]: I1009 14:08:32.423900 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c04f9588-77ae-4927-b59d-55db8d7764ae","Type":"ContainerStarted","Data":"02b624e93d7596fa4a5b94d8cedda4588d76a24e8a5f51b77fbfc5997e4f1ad7"} Oct 09 14:08:32 crc kubenswrapper[4902]: I1009 14:08:32.424096 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c04f9588-77ae-4927-b59d-55db8d7764ae","Type":"ContainerStarted","Data":"92c11974c898844fa85f419c9ad5fa809b6eeb94ea9195f91a464917e182213d"} Oct 09 14:08:32 crc kubenswrapper[4902]: I1009 14:08:32.424092 4902 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="c04f9588-77ae-4927-b59d-55db8d7764ae" containerName="nova-metadata-log" containerID="cri-o://92c11974c898844fa85f419c9ad5fa809b6eeb94ea9195f91a464917e182213d" gracePeriod=30 Oct 09 14:08:32 crc kubenswrapper[4902]: I1009 14:08:32.424201 4902 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="c04f9588-77ae-4927-b59d-55db8d7764ae" containerName="nova-metadata-metadata" 
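The pod_startup_latency_tracker entries in this excerpt carry two durations per pod: podStartE2EDuration, which matches watchObservedRunningTime minus podCreationTimestamp, and podStartSLOduration, which in these records equals the E2E duration minus the image-pull window (lastFinishedPulling minus firstStartedPulling). A small sketch using ceilometer-0's timestamps from earlier in the log reproduces both logged values; the layout string is simply Go's default time.Time formatting.

package main

import (
	"fmt"
	"time"
)

func main() {
	// Go's default time.Time String() layout, matching the timestamps printed above.
	const layout = "2006-01-02 15:04:05.999999999 -0700 MST"
	parse := func(s string) time.Time {
		t, err := time.Parse(layout, s)
		if err != nil {
			panic(err)
		}
		return t
	}

	// Values copied from the ceilometer-0 startup-latency entry.
	created := parse("2025-10-09 14:08:20 +0000 UTC")             // podCreationTimestamp
	firstPull := parse("2025-10-09 14:08:20.963092361 +0000 UTC") // firstStartedPulling
	lastPull := parse("2025-10-09 14:08:25.242951942 +0000 UTC")  // lastFinishedPulling
	observed := parse("2025-10-09 14:08:26.366499052 +0000 UTC")  // watchObservedRunningTime

	e2e := observed.Sub(created)         // 6.366499052s, the logged podStartE2EDuration
	slo := e2e - lastPull.Sub(firstPull) // 2.086639471s, the logged podStartSLOduration
	fmt.Println("E2E:", e2e, "SLO:", slo)
}
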
containerID="cri-o://02b624e93d7596fa4a5b94d8cedda4588d76a24e8a5f51b77fbfc5997e4f1ad7" gracePeriod=30 Oct 09 14:08:32 crc kubenswrapper[4902]: I1009 14:08:32.442847 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.683415514 podStartE2EDuration="6.442829138s" podCreationTimestamp="2025-10-09 14:08:26 +0000 UTC" firstStartedPulling="2025-10-09 14:08:27.614152359 +0000 UTC m=+1054.812011423" lastFinishedPulling="2025-10-09 14:08:31.373565983 +0000 UTC m=+1058.571425047" observedRunningTime="2025-10-09 14:08:32.441298023 +0000 UTC m=+1059.639157117" watchObservedRunningTime="2025-10-09 14:08:32.442829138 +0000 UTC m=+1059.640688202" Oct 09 14:08:32 crc kubenswrapper[4902]: I1009 14:08:32.468358 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.555968998 podStartE2EDuration="6.468328227s" podCreationTimestamp="2025-10-09 14:08:26 +0000 UTC" firstStartedPulling="2025-10-09 14:08:27.489274089 +0000 UTC m=+1054.687133153" lastFinishedPulling="2025-10-09 14:08:31.401633318 +0000 UTC m=+1058.599492382" observedRunningTime="2025-10-09 14:08:32.462051993 +0000 UTC m=+1059.659911067" watchObservedRunningTime="2025-10-09 14:08:32.468328227 +0000 UTC m=+1059.666187291" Oct 09 14:08:32 crc kubenswrapper[4902]: I1009 14:08:32.486627 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.968480662 podStartE2EDuration="6.486607685s" podCreationTimestamp="2025-10-09 14:08:26 +0000 UTC" firstStartedPulling="2025-10-09 14:08:27.856615085 +0000 UTC m=+1055.054474149" lastFinishedPulling="2025-10-09 14:08:31.374742108 +0000 UTC m=+1058.572601172" observedRunningTime="2025-10-09 14:08:32.4796403 +0000 UTC m=+1059.677499394" watchObservedRunningTime="2025-10-09 14:08:32.486607685 +0000 UTC m=+1059.684466749" Oct 09 14:08:32 crc kubenswrapper[4902]: I1009 14:08:32.508970 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.74761635 podStartE2EDuration="6.508947451s" podCreationTimestamp="2025-10-09 14:08:26 +0000 UTC" firstStartedPulling="2025-10-09 14:08:27.61248649 +0000 UTC m=+1054.810345554" lastFinishedPulling="2025-10-09 14:08:31.373817601 +0000 UTC m=+1058.571676655" observedRunningTime="2025-10-09 14:08:32.503087769 +0000 UTC m=+1059.700946863" watchObservedRunningTime="2025-10-09 14:08:32.508947451 +0000 UTC m=+1059.706806515" Oct 09 14:08:33 crc kubenswrapper[4902]: I1009 14:08:33.140302 4902 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 09 14:08:33 crc kubenswrapper[4902]: I1009 14:08:33.322919 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c04f9588-77ae-4927-b59d-55db8d7764ae-combined-ca-bundle\") pod \"c04f9588-77ae-4927-b59d-55db8d7764ae\" (UID: \"c04f9588-77ae-4927-b59d-55db8d7764ae\") " Oct 09 14:08:33 crc kubenswrapper[4902]: I1009 14:08:33.323002 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfddw\" (UniqueName: \"kubernetes.io/projected/c04f9588-77ae-4927-b59d-55db8d7764ae-kube-api-access-cfddw\") pod \"c04f9588-77ae-4927-b59d-55db8d7764ae\" (UID: \"c04f9588-77ae-4927-b59d-55db8d7764ae\") " Oct 09 14:08:33 crc kubenswrapper[4902]: I1009 14:08:33.323047 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c04f9588-77ae-4927-b59d-55db8d7764ae-config-data\") pod \"c04f9588-77ae-4927-b59d-55db8d7764ae\" (UID: \"c04f9588-77ae-4927-b59d-55db8d7764ae\") " Oct 09 14:08:33 crc kubenswrapper[4902]: I1009 14:08:33.323137 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c04f9588-77ae-4927-b59d-55db8d7764ae-logs\") pod \"c04f9588-77ae-4927-b59d-55db8d7764ae\" (UID: \"c04f9588-77ae-4927-b59d-55db8d7764ae\") " Oct 09 14:08:33 crc kubenswrapper[4902]: I1009 14:08:33.323761 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c04f9588-77ae-4927-b59d-55db8d7764ae-logs" (OuterVolumeSpecName: "logs") pod "c04f9588-77ae-4927-b59d-55db8d7764ae" (UID: "c04f9588-77ae-4927-b59d-55db8d7764ae"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 14:08:33 crc kubenswrapper[4902]: I1009 14:08:33.324322 4902 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c04f9588-77ae-4927-b59d-55db8d7764ae-logs\") on node \"crc\" DevicePath \"\"" Oct 09 14:08:33 crc kubenswrapper[4902]: I1009 14:08:33.333653 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c04f9588-77ae-4927-b59d-55db8d7764ae-kube-api-access-cfddw" (OuterVolumeSpecName: "kube-api-access-cfddw") pod "c04f9588-77ae-4927-b59d-55db8d7764ae" (UID: "c04f9588-77ae-4927-b59d-55db8d7764ae"). InnerVolumeSpecName "kube-api-access-cfddw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 14:08:33 crc kubenswrapper[4902]: I1009 14:08:33.359023 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c04f9588-77ae-4927-b59d-55db8d7764ae-config-data" (OuterVolumeSpecName: "config-data") pod "c04f9588-77ae-4927-b59d-55db8d7764ae" (UID: "c04f9588-77ae-4927-b59d-55db8d7764ae"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:08:33 crc kubenswrapper[4902]: I1009 14:08:33.375685 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c04f9588-77ae-4927-b59d-55db8d7764ae-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c04f9588-77ae-4927-b59d-55db8d7764ae" (UID: "c04f9588-77ae-4927-b59d-55db8d7764ae"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:08:33 crc kubenswrapper[4902]: I1009 14:08:33.425981 4902 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c04f9588-77ae-4927-b59d-55db8d7764ae-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 14:08:33 crc kubenswrapper[4902]: I1009 14:08:33.426047 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfddw\" (UniqueName: \"kubernetes.io/projected/c04f9588-77ae-4927-b59d-55db8d7764ae-kube-api-access-cfddw\") on node \"crc\" DevicePath \"\"" Oct 09 14:08:33 crc kubenswrapper[4902]: I1009 14:08:33.426065 4902 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c04f9588-77ae-4927-b59d-55db8d7764ae-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 14:08:33 crc kubenswrapper[4902]: I1009 14:08:33.437234 4902 generic.go:334] "Generic (PLEG): container finished" podID="c04f9588-77ae-4927-b59d-55db8d7764ae" containerID="02b624e93d7596fa4a5b94d8cedda4588d76a24e8a5f51b77fbfc5997e4f1ad7" exitCode=0 Oct 09 14:08:33 crc kubenswrapper[4902]: I1009 14:08:33.437476 4902 generic.go:334] "Generic (PLEG): container finished" podID="c04f9588-77ae-4927-b59d-55db8d7764ae" containerID="92c11974c898844fa85f419c9ad5fa809b6eeb94ea9195f91a464917e182213d" exitCode=143 Oct 09 14:08:33 crc kubenswrapper[4902]: I1009 14:08:33.437307 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c04f9588-77ae-4927-b59d-55db8d7764ae","Type":"ContainerDied","Data":"02b624e93d7596fa4a5b94d8cedda4588d76a24e8a5f51b77fbfc5997e4f1ad7"} Oct 09 14:08:33 crc kubenswrapper[4902]: I1009 14:08:33.437670 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c04f9588-77ae-4927-b59d-55db8d7764ae","Type":"ContainerDied","Data":"92c11974c898844fa85f419c9ad5fa809b6eeb94ea9195f91a464917e182213d"} Oct 09 14:08:33 crc kubenswrapper[4902]: I1009 14:08:33.437686 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c04f9588-77ae-4927-b59d-55db8d7764ae","Type":"ContainerDied","Data":"3cdf2e728bba544469810c5bf924e1ea4b2df5a0e0f94ba9f5904fd60451e3b3"} Oct 09 14:08:33 crc kubenswrapper[4902]: I1009 14:08:33.437703 4902 scope.go:117] "RemoveContainer" containerID="02b624e93d7596fa4a5b94d8cedda4588d76a24e8a5f51b77fbfc5997e4f1ad7" Oct 09 14:08:33 crc kubenswrapper[4902]: I1009 14:08:33.437289 4902 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 09 14:08:33 crc kubenswrapper[4902]: I1009 14:08:33.471388 4902 scope.go:117] "RemoveContainer" containerID="92c11974c898844fa85f419c9ad5fa809b6eeb94ea9195f91a464917e182213d" Oct 09 14:08:33 crc kubenswrapper[4902]: I1009 14:08:33.490852 4902 scope.go:117] "RemoveContainer" containerID="02b624e93d7596fa4a5b94d8cedda4588d76a24e8a5f51b77fbfc5997e4f1ad7" Oct 09 14:08:33 crc kubenswrapper[4902]: E1009 14:08:33.491189 4902 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"02b624e93d7596fa4a5b94d8cedda4588d76a24e8a5f51b77fbfc5997e4f1ad7\": container with ID starting with 02b624e93d7596fa4a5b94d8cedda4588d76a24e8a5f51b77fbfc5997e4f1ad7 not found: ID does not exist" containerID="02b624e93d7596fa4a5b94d8cedda4588d76a24e8a5f51b77fbfc5997e4f1ad7" Oct 09 14:08:33 crc kubenswrapper[4902]: I1009 14:08:33.491224 4902 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02b624e93d7596fa4a5b94d8cedda4588d76a24e8a5f51b77fbfc5997e4f1ad7"} err="failed to get container status \"02b624e93d7596fa4a5b94d8cedda4588d76a24e8a5f51b77fbfc5997e4f1ad7\": rpc error: code = NotFound desc = could not find container \"02b624e93d7596fa4a5b94d8cedda4588d76a24e8a5f51b77fbfc5997e4f1ad7\": container with ID starting with 02b624e93d7596fa4a5b94d8cedda4588d76a24e8a5f51b77fbfc5997e4f1ad7 not found: ID does not exist" Oct 09 14:08:33 crc kubenswrapper[4902]: I1009 14:08:33.491246 4902 scope.go:117] "RemoveContainer" containerID="92c11974c898844fa85f419c9ad5fa809b6eeb94ea9195f91a464917e182213d" Oct 09 14:08:33 crc kubenswrapper[4902]: E1009 14:08:33.491492 4902 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"92c11974c898844fa85f419c9ad5fa809b6eeb94ea9195f91a464917e182213d\": container with ID starting with 92c11974c898844fa85f419c9ad5fa809b6eeb94ea9195f91a464917e182213d not found: ID does not exist" containerID="92c11974c898844fa85f419c9ad5fa809b6eeb94ea9195f91a464917e182213d" Oct 09 14:08:33 crc kubenswrapper[4902]: I1009 14:08:33.491518 4902 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"92c11974c898844fa85f419c9ad5fa809b6eeb94ea9195f91a464917e182213d"} err="failed to get container status \"92c11974c898844fa85f419c9ad5fa809b6eeb94ea9195f91a464917e182213d\": rpc error: code = NotFound desc = could not find container \"92c11974c898844fa85f419c9ad5fa809b6eeb94ea9195f91a464917e182213d\": container with ID starting with 92c11974c898844fa85f419c9ad5fa809b6eeb94ea9195f91a464917e182213d not found: ID does not exist" Oct 09 14:08:33 crc kubenswrapper[4902]: I1009 14:08:33.491534 4902 scope.go:117] "RemoveContainer" containerID="02b624e93d7596fa4a5b94d8cedda4588d76a24e8a5f51b77fbfc5997e4f1ad7" Oct 09 14:08:33 crc kubenswrapper[4902]: I1009 14:08:33.491915 4902 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02b624e93d7596fa4a5b94d8cedda4588d76a24e8a5f51b77fbfc5997e4f1ad7"} err="failed to get container status \"02b624e93d7596fa4a5b94d8cedda4588d76a24e8a5f51b77fbfc5997e4f1ad7\": rpc error: code = NotFound desc = could not find container \"02b624e93d7596fa4a5b94d8cedda4588d76a24e8a5f51b77fbfc5997e4f1ad7\": container with ID starting with 02b624e93d7596fa4a5b94d8cedda4588d76a24e8a5f51b77fbfc5997e4f1ad7 not found: ID does not exist" Oct 09 14:08:33 crc kubenswrapper[4902]: I1009 14:08:33.491971 4902 
scope.go:117] "RemoveContainer" containerID="92c11974c898844fa85f419c9ad5fa809b6eeb94ea9195f91a464917e182213d" Oct 09 14:08:33 crc kubenswrapper[4902]: I1009 14:08:33.492255 4902 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"92c11974c898844fa85f419c9ad5fa809b6eeb94ea9195f91a464917e182213d"} err="failed to get container status \"92c11974c898844fa85f419c9ad5fa809b6eeb94ea9195f91a464917e182213d\": rpc error: code = NotFound desc = could not find container \"92c11974c898844fa85f419c9ad5fa809b6eeb94ea9195f91a464917e182213d\": container with ID starting with 92c11974c898844fa85f419c9ad5fa809b6eeb94ea9195f91a464917e182213d not found: ID does not exist" Oct 09 14:08:33 crc kubenswrapper[4902]: I1009 14:08:33.495788 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 09 14:08:33 crc kubenswrapper[4902]: I1009 14:08:33.535262 4902 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 09 14:08:33 crc kubenswrapper[4902]: I1009 14:08:33.544351 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 09 14:08:33 crc kubenswrapper[4902]: E1009 14:08:33.544914 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c04f9588-77ae-4927-b59d-55db8d7764ae" containerName="nova-metadata-metadata" Oct 09 14:08:33 crc kubenswrapper[4902]: I1009 14:08:33.544942 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="c04f9588-77ae-4927-b59d-55db8d7764ae" containerName="nova-metadata-metadata" Oct 09 14:08:33 crc kubenswrapper[4902]: E1009 14:08:33.544967 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c04f9588-77ae-4927-b59d-55db8d7764ae" containerName="nova-metadata-log" Oct 09 14:08:33 crc kubenswrapper[4902]: I1009 14:08:33.544977 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="c04f9588-77ae-4927-b59d-55db8d7764ae" containerName="nova-metadata-log" Oct 09 14:08:33 crc kubenswrapper[4902]: I1009 14:08:33.545285 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="c04f9588-77ae-4927-b59d-55db8d7764ae" containerName="nova-metadata-log" Oct 09 14:08:33 crc kubenswrapper[4902]: I1009 14:08:33.545313 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="c04f9588-77ae-4927-b59d-55db8d7764ae" containerName="nova-metadata-metadata" Oct 09 14:08:33 crc kubenswrapper[4902]: I1009 14:08:33.546658 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 09 14:08:33 crc kubenswrapper[4902]: I1009 14:08:33.548710 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 09 14:08:33 crc kubenswrapper[4902]: I1009 14:08:33.548918 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Oct 09 14:08:33 crc kubenswrapper[4902]: I1009 14:08:33.559970 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 09 14:08:33 crc kubenswrapper[4902]: I1009 14:08:33.731689 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b3459d33-1059-42ea-b9d2-fc268a58c7cf-logs\") pod \"nova-metadata-0\" (UID: \"b3459d33-1059-42ea-b9d2-fc268a58c7cf\") " pod="openstack/nova-metadata-0" Oct 09 14:08:33 crc kubenswrapper[4902]: I1009 14:08:33.731761 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3459d33-1059-42ea-b9d2-fc268a58c7cf-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"b3459d33-1059-42ea-b9d2-fc268a58c7cf\") " pod="openstack/nova-metadata-0" Oct 09 14:08:33 crc kubenswrapper[4902]: I1009 14:08:33.731861 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3459d33-1059-42ea-b9d2-fc268a58c7cf-config-data\") pod \"nova-metadata-0\" (UID: \"b3459d33-1059-42ea-b9d2-fc268a58c7cf\") " pod="openstack/nova-metadata-0" Oct 09 14:08:33 crc kubenswrapper[4902]: I1009 14:08:33.731903 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b3459d33-1059-42ea-b9d2-fc268a58c7cf-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"b3459d33-1059-42ea-b9d2-fc268a58c7cf\") " pod="openstack/nova-metadata-0" Oct 09 14:08:33 crc kubenswrapper[4902]: I1009 14:08:33.732021 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gwp5j\" (UniqueName: \"kubernetes.io/projected/b3459d33-1059-42ea-b9d2-fc268a58c7cf-kube-api-access-gwp5j\") pod \"nova-metadata-0\" (UID: \"b3459d33-1059-42ea-b9d2-fc268a58c7cf\") " pod="openstack/nova-metadata-0" Oct 09 14:08:33 crc kubenswrapper[4902]: I1009 14:08:33.833865 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b3459d33-1059-42ea-b9d2-fc268a58c7cf-logs\") pod \"nova-metadata-0\" (UID: \"b3459d33-1059-42ea-b9d2-fc268a58c7cf\") " pod="openstack/nova-metadata-0" Oct 09 14:08:33 crc kubenswrapper[4902]: I1009 14:08:33.833950 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3459d33-1059-42ea-b9d2-fc268a58c7cf-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"b3459d33-1059-42ea-b9d2-fc268a58c7cf\") " pod="openstack/nova-metadata-0" Oct 09 14:08:33 crc kubenswrapper[4902]: I1009 14:08:33.834038 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3459d33-1059-42ea-b9d2-fc268a58c7cf-config-data\") pod \"nova-metadata-0\" (UID: \"b3459d33-1059-42ea-b9d2-fc268a58c7cf\") " pod="openstack/nova-metadata-0" Oct 09 14:08:33 crc 
kubenswrapper[4902]: I1009 14:08:33.834084 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b3459d33-1059-42ea-b9d2-fc268a58c7cf-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"b3459d33-1059-42ea-b9d2-fc268a58c7cf\") " pod="openstack/nova-metadata-0" Oct 09 14:08:33 crc kubenswrapper[4902]: I1009 14:08:33.834150 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gwp5j\" (UniqueName: \"kubernetes.io/projected/b3459d33-1059-42ea-b9d2-fc268a58c7cf-kube-api-access-gwp5j\") pod \"nova-metadata-0\" (UID: \"b3459d33-1059-42ea-b9d2-fc268a58c7cf\") " pod="openstack/nova-metadata-0" Oct 09 14:08:33 crc kubenswrapper[4902]: I1009 14:08:33.835060 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b3459d33-1059-42ea-b9d2-fc268a58c7cf-logs\") pod \"nova-metadata-0\" (UID: \"b3459d33-1059-42ea-b9d2-fc268a58c7cf\") " pod="openstack/nova-metadata-0" Oct 09 14:08:33 crc kubenswrapper[4902]: I1009 14:08:33.839063 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3459d33-1059-42ea-b9d2-fc268a58c7cf-config-data\") pod \"nova-metadata-0\" (UID: \"b3459d33-1059-42ea-b9d2-fc268a58c7cf\") " pod="openstack/nova-metadata-0" Oct 09 14:08:33 crc kubenswrapper[4902]: I1009 14:08:33.846992 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3459d33-1059-42ea-b9d2-fc268a58c7cf-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"b3459d33-1059-42ea-b9d2-fc268a58c7cf\") " pod="openstack/nova-metadata-0" Oct 09 14:08:33 crc kubenswrapper[4902]: I1009 14:08:33.847538 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b3459d33-1059-42ea-b9d2-fc268a58c7cf-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"b3459d33-1059-42ea-b9d2-fc268a58c7cf\") " pod="openstack/nova-metadata-0" Oct 09 14:08:33 crc kubenswrapper[4902]: I1009 14:08:33.850337 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gwp5j\" (UniqueName: \"kubernetes.io/projected/b3459d33-1059-42ea-b9d2-fc268a58c7cf-kube-api-access-gwp5j\") pod \"nova-metadata-0\" (UID: \"b3459d33-1059-42ea-b9d2-fc268a58c7cf\") " pod="openstack/nova-metadata-0" Oct 09 14:08:33 crc kubenswrapper[4902]: I1009 14:08:33.863334 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 09 14:08:34 crc kubenswrapper[4902]: I1009 14:08:34.302180 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 09 14:08:34 crc kubenswrapper[4902]: I1009 14:08:34.450994 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b3459d33-1059-42ea-b9d2-fc268a58c7cf","Type":"ContainerStarted","Data":"9160abff1063e0aa42233df1cd9a10fd98de3e41272f9a7c9456e9822917563f"} Oct 09 14:08:35 crc kubenswrapper[4902]: I1009 14:08:35.460849 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b3459d33-1059-42ea-b9d2-fc268a58c7cf","Type":"ContainerStarted","Data":"5cbf066b375c2664054f3769fa00750340adf4f0cb1c5d7effcd635352a62ef4"} Oct 09 14:08:35 crc kubenswrapper[4902]: I1009 14:08:35.461183 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b3459d33-1059-42ea-b9d2-fc268a58c7cf","Type":"ContainerStarted","Data":"2d5c861f8da5b070c0e4b3d6c335c3aba68891aa4a007f0da761b62b7e0a3316"} Oct 09 14:08:35 crc kubenswrapper[4902]: I1009 14:08:35.493170 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.493147414 podStartE2EDuration="2.493147414s" podCreationTimestamp="2025-10-09 14:08:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 14:08:35.486665703 +0000 UTC m=+1062.684524817" watchObservedRunningTime="2025-10-09 14:08:35.493147414 +0000 UTC m=+1062.691006498" Oct 09 14:08:35 crc kubenswrapper[4902]: I1009 14:08:35.524770 4902 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c04f9588-77ae-4927-b59d-55db8d7764ae" path="/var/lib/kubelet/pods/c04f9588-77ae-4927-b59d-55db8d7764ae/volumes" Oct 09 14:08:36 crc kubenswrapper[4902]: I1009 14:08:36.471879 4902 generic.go:334] "Generic (PLEG): container finished" podID="1e9b526e-bd2b-4710-9225-669035576d7c" containerID="aac8618970479222f2878823905a1d2d396933c89907b7f32ea6296a3c8232a4" exitCode=0 Oct 09 14:08:36 crc kubenswrapper[4902]: I1009 14:08:36.471940 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-qc4zz" event={"ID":"1e9b526e-bd2b-4710-9225-669035576d7c","Type":"ContainerDied","Data":"aac8618970479222f2878823905a1d2d396933c89907b7f32ea6296a3c8232a4"} Oct 09 14:08:36 crc kubenswrapper[4902]: I1009 14:08:36.846939 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Oct 09 14:08:36 crc kubenswrapper[4902]: I1009 14:08:36.848157 4902 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 09 14:08:36 crc kubenswrapper[4902]: I1009 14:08:36.848230 4902 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 09 14:08:36 crc kubenswrapper[4902]: I1009 14:08:36.894001 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 09 14:08:36 crc kubenswrapper[4902]: I1009 14:08:36.894908 4902 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Oct 09 14:08:36 crc kubenswrapper[4902]: I1009 14:08:36.959432 4902 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Oct 09 14:08:37 crc kubenswrapper[4902]: I1009 
14:08:37.187948 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-845d6d6f59-425tz" Oct 09 14:08:37 crc kubenswrapper[4902]: I1009 14:08:37.259129 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-s2w6w"] Oct 09 14:08:37 crc kubenswrapper[4902]: I1009 14:08:37.259364 4902 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5784cf869f-s2w6w" podUID="abbbf9c3-be8f-4199-9f68-626d18441730" containerName="dnsmasq-dns" containerID="cri-o://86c8206a3f8db76cd10e719402b512a670e6bb9d936d05ad3295efba0946ae66" gracePeriod=10 Oct 09 14:08:37 crc kubenswrapper[4902]: I1009 14:08:37.489126 4902 generic.go:334] "Generic (PLEG): container finished" podID="abbbf9c3-be8f-4199-9f68-626d18441730" containerID="86c8206a3f8db76cd10e719402b512a670e6bb9d936d05ad3295efba0946ae66" exitCode=0 Oct 09 14:08:37 crc kubenswrapper[4902]: I1009 14:08:37.489447 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5784cf869f-s2w6w" event={"ID":"abbbf9c3-be8f-4199-9f68-626d18441730","Type":"ContainerDied","Data":"86c8206a3f8db76cd10e719402b512a670e6bb9d936d05ad3295efba0946ae66"} Oct 09 14:08:37 crc kubenswrapper[4902]: I1009 14:08:37.567967 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Oct 09 14:08:37 crc kubenswrapper[4902]: I1009 14:08:37.879915 4902 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5784cf869f-s2w6w" Oct 09 14:08:37 crc kubenswrapper[4902]: I1009 14:08:37.930625 4902 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="c2cd73e9-ca3d-4916-b5f5-06e4599ebe27" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.189:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 09 14:08:37 crc kubenswrapper[4902]: I1009 14:08:37.930939 4902 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="c2cd73e9-ca3d-4916-b5f5-06e4599ebe27" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.189:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 09 14:08:38 crc kubenswrapper[4902]: I1009 14:08:38.016551 4902 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-qc4zz" Oct 09 14:08:38 crc kubenswrapper[4902]: I1009 14:08:38.017317 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/abbbf9c3-be8f-4199-9f68-626d18441730-dns-svc\") pod \"abbbf9c3-be8f-4199-9f68-626d18441730\" (UID: \"abbbf9c3-be8f-4199-9f68-626d18441730\") " Oct 09 14:08:38 crc kubenswrapper[4902]: I1009 14:08:38.018132 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/abbbf9c3-be8f-4199-9f68-626d18441730-dns-swift-storage-0\") pod \"abbbf9c3-be8f-4199-9f68-626d18441730\" (UID: \"abbbf9c3-be8f-4199-9f68-626d18441730\") " Oct 09 14:08:38 crc kubenswrapper[4902]: I1009 14:08:38.018230 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/abbbf9c3-be8f-4199-9f68-626d18441730-config\") pod \"abbbf9c3-be8f-4199-9f68-626d18441730\" (UID: \"abbbf9c3-be8f-4199-9f68-626d18441730\") " Oct 09 14:08:38 crc kubenswrapper[4902]: I1009 14:08:38.018344 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wqg42\" (UniqueName: \"kubernetes.io/projected/abbbf9c3-be8f-4199-9f68-626d18441730-kube-api-access-wqg42\") pod \"abbbf9c3-be8f-4199-9f68-626d18441730\" (UID: \"abbbf9c3-be8f-4199-9f68-626d18441730\") " Oct 09 14:08:38 crc kubenswrapper[4902]: I1009 14:08:38.018877 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/abbbf9c3-be8f-4199-9f68-626d18441730-ovsdbserver-sb\") pod \"abbbf9c3-be8f-4199-9f68-626d18441730\" (UID: \"abbbf9c3-be8f-4199-9f68-626d18441730\") " Oct 09 14:08:38 crc kubenswrapper[4902]: I1009 14:08:38.018932 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/abbbf9c3-be8f-4199-9f68-626d18441730-ovsdbserver-nb\") pod \"abbbf9c3-be8f-4199-9f68-626d18441730\" (UID: \"abbbf9c3-be8f-4199-9f68-626d18441730\") " Oct 09 14:08:38 crc kubenswrapper[4902]: I1009 14:08:38.025975 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/abbbf9c3-be8f-4199-9f68-626d18441730-kube-api-access-wqg42" (OuterVolumeSpecName: "kube-api-access-wqg42") pod "abbbf9c3-be8f-4199-9f68-626d18441730" (UID: "abbbf9c3-be8f-4199-9f68-626d18441730"). InnerVolumeSpecName "kube-api-access-wqg42". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 14:08:38 crc kubenswrapper[4902]: I1009 14:08:38.088568 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/abbbf9c3-be8f-4199-9f68-626d18441730-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "abbbf9c3-be8f-4199-9f68-626d18441730" (UID: "abbbf9c3-be8f-4199-9f68-626d18441730"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 14:08:38 crc kubenswrapper[4902]: I1009 14:08:38.097584 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/abbbf9c3-be8f-4199-9f68-626d18441730-config" (OuterVolumeSpecName: "config") pod "abbbf9c3-be8f-4199-9f68-626d18441730" (UID: "abbbf9c3-be8f-4199-9f68-626d18441730"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 14:08:38 crc kubenswrapper[4902]: I1009 14:08:38.103032 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/abbbf9c3-be8f-4199-9f68-626d18441730-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "abbbf9c3-be8f-4199-9f68-626d18441730" (UID: "abbbf9c3-be8f-4199-9f68-626d18441730"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 14:08:38 crc kubenswrapper[4902]: I1009 14:08:38.112455 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/abbbf9c3-be8f-4199-9f68-626d18441730-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "abbbf9c3-be8f-4199-9f68-626d18441730" (UID: "abbbf9c3-be8f-4199-9f68-626d18441730"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 14:08:38 crc kubenswrapper[4902]: I1009 14:08:38.114311 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/abbbf9c3-be8f-4199-9f68-626d18441730-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "abbbf9c3-be8f-4199-9f68-626d18441730" (UID: "abbbf9c3-be8f-4199-9f68-626d18441730"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 14:08:38 crc kubenswrapper[4902]: I1009 14:08:38.120343 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e9b526e-bd2b-4710-9225-669035576d7c-config-data\") pod \"1e9b526e-bd2b-4710-9225-669035576d7c\" (UID: \"1e9b526e-bd2b-4710-9225-669035576d7c\") " Oct 09 14:08:38 crc kubenswrapper[4902]: I1009 14:08:38.120453 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z8hkp\" (UniqueName: \"kubernetes.io/projected/1e9b526e-bd2b-4710-9225-669035576d7c-kube-api-access-z8hkp\") pod \"1e9b526e-bd2b-4710-9225-669035576d7c\" (UID: \"1e9b526e-bd2b-4710-9225-669035576d7c\") " Oct 09 14:08:38 crc kubenswrapper[4902]: I1009 14:08:38.120645 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e9b526e-bd2b-4710-9225-669035576d7c-combined-ca-bundle\") pod \"1e9b526e-bd2b-4710-9225-669035576d7c\" (UID: \"1e9b526e-bd2b-4710-9225-669035576d7c\") " Oct 09 14:08:38 crc kubenswrapper[4902]: I1009 14:08:38.120744 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1e9b526e-bd2b-4710-9225-669035576d7c-scripts\") pod \"1e9b526e-bd2b-4710-9225-669035576d7c\" (UID: \"1e9b526e-bd2b-4710-9225-669035576d7c\") " Oct 09 14:08:38 crc kubenswrapper[4902]: I1009 14:08:38.121152 4902 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/abbbf9c3-be8f-4199-9f68-626d18441730-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 09 14:08:38 crc kubenswrapper[4902]: I1009 14:08:38.121172 4902 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/abbbf9c3-be8f-4199-9f68-626d18441730-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 09 14:08:38 crc kubenswrapper[4902]: I1009 14:08:38.121183 4902 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/abbbf9c3-be8f-4199-9f68-626d18441730-dns-svc\") on node \"crc\" DevicePath 
\"\"" Oct 09 14:08:38 crc kubenswrapper[4902]: I1009 14:08:38.121193 4902 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/abbbf9c3-be8f-4199-9f68-626d18441730-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 09 14:08:38 crc kubenswrapper[4902]: I1009 14:08:38.121201 4902 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/abbbf9c3-be8f-4199-9f68-626d18441730-config\") on node \"crc\" DevicePath \"\"" Oct 09 14:08:38 crc kubenswrapper[4902]: I1009 14:08:38.121210 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wqg42\" (UniqueName: \"kubernetes.io/projected/abbbf9c3-be8f-4199-9f68-626d18441730-kube-api-access-wqg42\") on node \"crc\" DevicePath \"\"" Oct 09 14:08:38 crc kubenswrapper[4902]: I1009 14:08:38.124629 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e9b526e-bd2b-4710-9225-669035576d7c-scripts" (OuterVolumeSpecName: "scripts") pod "1e9b526e-bd2b-4710-9225-669035576d7c" (UID: "1e9b526e-bd2b-4710-9225-669035576d7c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:08:38 crc kubenswrapper[4902]: I1009 14:08:38.128397 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e9b526e-bd2b-4710-9225-669035576d7c-kube-api-access-z8hkp" (OuterVolumeSpecName: "kube-api-access-z8hkp") pod "1e9b526e-bd2b-4710-9225-669035576d7c" (UID: "1e9b526e-bd2b-4710-9225-669035576d7c"). InnerVolumeSpecName "kube-api-access-z8hkp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 14:08:38 crc kubenswrapper[4902]: I1009 14:08:38.158592 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e9b526e-bd2b-4710-9225-669035576d7c-config-data" (OuterVolumeSpecName: "config-data") pod "1e9b526e-bd2b-4710-9225-669035576d7c" (UID: "1e9b526e-bd2b-4710-9225-669035576d7c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:08:38 crc kubenswrapper[4902]: I1009 14:08:38.175925 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e9b526e-bd2b-4710-9225-669035576d7c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1e9b526e-bd2b-4710-9225-669035576d7c" (UID: "1e9b526e-bd2b-4710-9225-669035576d7c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:08:38 crc kubenswrapper[4902]: I1009 14:08:38.222479 4902 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e9b526e-bd2b-4710-9225-669035576d7c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 14:08:38 crc kubenswrapper[4902]: I1009 14:08:38.222520 4902 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1e9b526e-bd2b-4710-9225-669035576d7c-scripts\") on node \"crc\" DevicePath \"\"" Oct 09 14:08:38 crc kubenswrapper[4902]: I1009 14:08:38.222533 4902 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e9b526e-bd2b-4710-9225-669035576d7c-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 14:08:38 crc kubenswrapper[4902]: I1009 14:08:38.222547 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z8hkp\" (UniqueName: \"kubernetes.io/projected/1e9b526e-bd2b-4710-9225-669035576d7c-kube-api-access-z8hkp\") on node \"crc\" DevicePath \"\"" Oct 09 14:08:38 crc kubenswrapper[4902]: I1009 14:08:38.500324 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5784cf869f-s2w6w" event={"ID":"abbbf9c3-be8f-4199-9f68-626d18441730","Type":"ContainerDied","Data":"5cae291e575774553e628f5ac8f9b9cd8b9664b8661e54dbf41487e0f878874a"} Oct 09 14:08:38 crc kubenswrapper[4902]: I1009 14:08:38.500382 4902 scope.go:117] "RemoveContainer" containerID="86c8206a3f8db76cd10e719402b512a670e6bb9d936d05ad3295efba0946ae66" Oct 09 14:08:38 crc kubenswrapper[4902]: I1009 14:08:38.501563 4902 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5784cf869f-s2w6w" Oct 09 14:08:38 crc kubenswrapper[4902]: I1009 14:08:38.502311 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-qc4zz" event={"ID":"1e9b526e-bd2b-4710-9225-669035576d7c","Type":"ContainerDied","Data":"197695de3acd8e4cc832ddfff4c6580e1170dee810303ed25ca8ac20bdf88d68"} Oct 09 14:08:38 crc kubenswrapper[4902]: I1009 14:08:38.502330 4902 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="197695de3acd8e4cc832ddfff4c6580e1170dee810303ed25ca8ac20bdf88d68" Oct 09 14:08:38 crc kubenswrapper[4902]: I1009 14:08:38.502377 4902 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-qc4zz" Oct 09 14:08:38 crc kubenswrapper[4902]: I1009 14:08:38.519382 4902 generic.go:334] "Generic (PLEG): container finished" podID="cc21b931-be3c-442f-b886-3f245f88e079" containerID="ab0c57d28e288cef2fb78a7e8287cd91048ac50617bc07e6add91f401c5e73cb" exitCode=0 Oct 09 14:08:38 crc kubenswrapper[4902]: I1009 14:08:38.519955 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-ssjrs" event={"ID":"cc21b931-be3c-442f-b886-3f245f88e079","Type":"ContainerDied","Data":"ab0c57d28e288cef2fb78a7e8287cd91048ac50617bc07e6add91f401c5e73cb"} Oct 09 14:08:38 crc kubenswrapper[4902]: I1009 14:08:38.541599 4902 scope.go:117] "RemoveContainer" containerID="68aa74132c2dde6f0a9ce6a6a8259747f1959feff16647c4238cd18b4a621c2c" Oct 09 14:08:38 crc kubenswrapper[4902]: I1009 14:08:38.597523 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-s2w6w"] Oct 09 14:08:38 crc kubenswrapper[4902]: I1009 14:08:38.607268 4902 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-s2w6w"] Oct 09 14:08:38 crc kubenswrapper[4902]: I1009 14:08:38.654250 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 09 14:08:38 crc kubenswrapper[4902]: I1009 14:08:38.654690 4902 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="c2cd73e9-ca3d-4916-b5f5-06e4599ebe27" containerName="nova-api-log" containerID="cri-o://85f2795ee28fe87103b924cc9ae2b4f2f844306074987dc7fdd1bc7cfcd3e3cb" gracePeriod=30 Oct 09 14:08:38 crc kubenswrapper[4902]: I1009 14:08:38.655140 4902 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="c2cd73e9-ca3d-4916-b5f5-06e4599ebe27" containerName="nova-api-api" containerID="cri-o://870853645e5414b62c8f3f24ea5bff10fbbf90cdbf43eb25db723578c4bc4c2e" gracePeriod=30 Oct 09 14:08:38 crc kubenswrapper[4902]: I1009 14:08:38.675957 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 09 14:08:38 crc kubenswrapper[4902]: I1009 14:08:38.697013 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 09 14:08:38 crc kubenswrapper[4902]: I1009 14:08:38.697493 4902 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="b3459d33-1059-42ea-b9d2-fc268a58c7cf" containerName="nova-metadata-log" containerID="cri-o://2d5c861f8da5b070c0e4b3d6c335c3aba68891aa4a007f0da761b62b7e0a3316" gracePeriod=30 Oct 09 14:08:38 crc kubenswrapper[4902]: I1009 14:08:38.698106 4902 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="b3459d33-1059-42ea-b9d2-fc268a58c7cf" containerName="nova-metadata-metadata" containerID="cri-o://5cbf066b375c2664054f3769fa00750340adf4f0cb1c5d7effcd635352a62ef4" gracePeriod=30 Oct 09 14:08:38 crc kubenswrapper[4902]: I1009 14:08:38.863639 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 09 14:08:38 crc kubenswrapper[4902]: I1009 14:08:38.863790 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 09 14:08:39 crc kubenswrapper[4902]: I1009 14:08:39.249686 4902 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 09 14:08:39 crc kubenswrapper[4902]: I1009 14:08:39.348181 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3459d33-1059-42ea-b9d2-fc268a58c7cf-combined-ca-bundle\") pod \"b3459d33-1059-42ea-b9d2-fc268a58c7cf\" (UID: \"b3459d33-1059-42ea-b9d2-fc268a58c7cf\") " Oct 09 14:08:39 crc kubenswrapper[4902]: I1009 14:08:39.348248 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gwp5j\" (UniqueName: \"kubernetes.io/projected/b3459d33-1059-42ea-b9d2-fc268a58c7cf-kube-api-access-gwp5j\") pod \"b3459d33-1059-42ea-b9d2-fc268a58c7cf\" (UID: \"b3459d33-1059-42ea-b9d2-fc268a58c7cf\") " Oct 09 14:08:39 crc kubenswrapper[4902]: I1009 14:08:39.348283 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b3459d33-1059-42ea-b9d2-fc268a58c7cf-logs\") pod \"b3459d33-1059-42ea-b9d2-fc268a58c7cf\" (UID: \"b3459d33-1059-42ea-b9d2-fc268a58c7cf\") " Oct 09 14:08:39 crc kubenswrapper[4902]: I1009 14:08:39.348324 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3459d33-1059-42ea-b9d2-fc268a58c7cf-config-data\") pod \"b3459d33-1059-42ea-b9d2-fc268a58c7cf\" (UID: \"b3459d33-1059-42ea-b9d2-fc268a58c7cf\") " Oct 09 14:08:39 crc kubenswrapper[4902]: I1009 14:08:39.348344 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b3459d33-1059-42ea-b9d2-fc268a58c7cf-nova-metadata-tls-certs\") pod \"b3459d33-1059-42ea-b9d2-fc268a58c7cf\" (UID: \"b3459d33-1059-42ea-b9d2-fc268a58c7cf\") " Oct 09 14:08:39 crc kubenswrapper[4902]: I1009 14:08:39.349249 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b3459d33-1059-42ea-b9d2-fc268a58c7cf-logs" (OuterVolumeSpecName: "logs") pod "b3459d33-1059-42ea-b9d2-fc268a58c7cf" (UID: "b3459d33-1059-42ea-b9d2-fc268a58c7cf"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 14:08:39 crc kubenswrapper[4902]: I1009 14:08:39.356129 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b3459d33-1059-42ea-b9d2-fc268a58c7cf-kube-api-access-gwp5j" (OuterVolumeSpecName: "kube-api-access-gwp5j") pod "b3459d33-1059-42ea-b9d2-fc268a58c7cf" (UID: "b3459d33-1059-42ea-b9d2-fc268a58c7cf"). InnerVolumeSpecName "kube-api-access-gwp5j". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 14:08:39 crc kubenswrapper[4902]: I1009 14:08:39.395998 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b3459d33-1059-42ea-b9d2-fc268a58c7cf-config-data" (OuterVolumeSpecName: "config-data") pod "b3459d33-1059-42ea-b9d2-fc268a58c7cf" (UID: "b3459d33-1059-42ea-b9d2-fc268a58c7cf"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:08:39 crc kubenswrapper[4902]: I1009 14:08:39.401138 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b3459d33-1059-42ea-b9d2-fc268a58c7cf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b3459d33-1059-42ea-b9d2-fc268a58c7cf" (UID: "b3459d33-1059-42ea-b9d2-fc268a58c7cf"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:08:39 crc kubenswrapper[4902]: I1009 14:08:39.407290 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b3459d33-1059-42ea-b9d2-fc268a58c7cf-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "b3459d33-1059-42ea-b9d2-fc268a58c7cf" (UID: "b3459d33-1059-42ea-b9d2-fc268a58c7cf"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:08:39 crc kubenswrapper[4902]: I1009 14:08:39.450262 4902 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3459d33-1059-42ea-b9d2-fc268a58c7cf-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 14:08:39 crc kubenswrapper[4902]: I1009 14:08:39.450298 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gwp5j\" (UniqueName: \"kubernetes.io/projected/b3459d33-1059-42ea-b9d2-fc268a58c7cf-kube-api-access-gwp5j\") on node \"crc\" DevicePath \"\"" Oct 09 14:08:39 crc kubenswrapper[4902]: I1009 14:08:39.450307 4902 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b3459d33-1059-42ea-b9d2-fc268a58c7cf-logs\") on node \"crc\" DevicePath \"\"" Oct 09 14:08:39 crc kubenswrapper[4902]: I1009 14:08:39.450316 4902 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3459d33-1059-42ea-b9d2-fc268a58c7cf-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 14:08:39 crc kubenswrapper[4902]: I1009 14:08:39.450324 4902 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b3459d33-1059-42ea-b9d2-fc268a58c7cf-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 09 14:08:39 crc kubenswrapper[4902]: I1009 14:08:39.525250 4902 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="abbbf9c3-be8f-4199-9f68-626d18441730" path="/var/lib/kubelet/pods/abbbf9c3-be8f-4199-9f68-626d18441730/volumes" Oct 09 14:08:39 crc kubenswrapper[4902]: I1009 14:08:39.531156 4902 generic.go:334] "Generic (PLEG): container finished" podID="b3459d33-1059-42ea-b9d2-fc268a58c7cf" containerID="5cbf066b375c2664054f3769fa00750340adf4f0cb1c5d7effcd635352a62ef4" exitCode=0 Oct 09 14:08:39 crc kubenswrapper[4902]: I1009 14:08:39.531194 4902 generic.go:334] "Generic (PLEG): container finished" podID="b3459d33-1059-42ea-b9d2-fc268a58c7cf" containerID="2d5c861f8da5b070c0e4b3d6c335c3aba68891aa4a007f0da761b62b7e0a3316" exitCode=143 Oct 09 14:08:39 crc kubenswrapper[4902]: I1009 14:08:39.531223 4902 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 09 14:08:39 crc kubenswrapper[4902]: I1009 14:08:39.531246 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b3459d33-1059-42ea-b9d2-fc268a58c7cf","Type":"ContainerDied","Data":"5cbf066b375c2664054f3769fa00750340adf4f0cb1c5d7effcd635352a62ef4"} Oct 09 14:08:39 crc kubenswrapper[4902]: I1009 14:08:39.531274 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b3459d33-1059-42ea-b9d2-fc268a58c7cf","Type":"ContainerDied","Data":"2d5c861f8da5b070c0e4b3d6c335c3aba68891aa4a007f0da761b62b7e0a3316"} Oct 09 14:08:39 crc kubenswrapper[4902]: I1009 14:08:39.531287 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b3459d33-1059-42ea-b9d2-fc268a58c7cf","Type":"ContainerDied","Data":"9160abff1063e0aa42233df1cd9a10fd98de3e41272f9a7c9456e9822917563f"} Oct 09 14:08:39 crc kubenswrapper[4902]: I1009 14:08:39.531305 4902 scope.go:117] "RemoveContainer" containerID="5cbf066b375c2664054f3769fa00750340adf4f0cb1c5d7effcd635352a62ef4" Oct 09 14:08:39 crc kubenswrapper[4902]: I1009 14:08:39.533537 4902 generic.go:334] "Generic (PLEG): container finished" podID="c2cd73e9-ca3d-4916-b5f5-06e4599ebe27" containerID="85f2795ee28fe87103b924cc9ae2b4f2f844306074987dc7fdd1bc7cfcd3e3cb" exitCode=143 Oct 09 14:08:39 crc kubenswrapper[4902]: I1009 14:08:39.533619 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c2cd73e9-ca3d-4916-b5f5-06e4599ebe27","Type":"ContainerDied","Data":"85f2795ee28fe87103b924cc9ae2b4f2f844306074987dc7fdd1bc7cfcd3e3cb"} Oct 09 14:08:39 crc kubenswrapper[4902]: I1009 14:08:39.572242 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 09 14:08:39 crc kubenswrapper[4902]: I1009 14:08:39.575265 4902 scope.go:117] "RemoveContainer" containerID="2d5c861f8da5b070c0e4b3d6c335c3aba68891aa4a007f0da761b62b7e0a3316" Oct 09 14:08:39 crc kubenswrapper[4902]: I1009 14:08:39.607047 4902 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 09 14:08:39 crc kubenswrapper[4902]: I1009 14:08:39.616578 4902 scope.go:117] "RemoveContainer" containerID="5cbf066b375c2664054f3769fa00750340adf4f0cb1c5d7effcd635352a62ef4" Oct 09 14:08:39 crc kubenswrapper[4902]: E1009 14:08:39.620512 4902 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5cbf066b375c2664054f3769fa00750340adf4f0cb1c5d7effcd635352a62ef4\": container with ID starting with 5cbf066b375c2664054f3769fa00750340adf4f0cb1c5d7effcd635352a62ef4 not found: ID does not exist" containerID="5cbf066b375c2664054f3769fa00750340adf4f0cb1c5d7effcd635352a62ef4" Oct 09 14:08:39 crc kubenswrapper[4902]: I1009 14:08:39.620551 4902 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5cbf066b375c2664054f3769fa00750340adf4f0cb1c5d7effcd635352a62ef4"} err="failed to get container status \"5cbf066b375c2664054f3769fa00750340adf4f0cb1c5d7effcd635352a62ef4\": rpc error: code = NotFound desc = could not find container \"5cbf066b375c2664054f3769fa00750340adf4f0cb1c5d7effcd635352a62ef4\": container with ID starting with 5cbf066b375c2664054f3769fa00750340adf4f0cb1c5d7effcd635352a62ef4 not found: ID does not exist" Oct 09 14:08:39 crc kubenswrapper[4902]: I1009 14:08:39.620574 4902 scope.go:117] "RemoveContainer" 
containerID="2d5c861f8da5b070c0e4b3d6c335c3aba68891aa4a007f0da761b62b7e0a3316" Oct 09 14:08:39 crc kubenswrapper[4902]: E1009 14:08:39.624506 4902 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2d5c861f8da5b070c0e4b3d6c335c3aba68891aa4a007f0da761b62b7e0a3316\": container with ID starting with 2d5c861f8da5b070c0e4b3d6c335c3aba68891aa4a007f0da761b62b7e0a3316 not found: ID does not exist" containerID="2d5c861f8da5b070c0e4b3d6c335c3aba68891aa4a007f0da761b62b7e0a3316" Oct 09 14:08:39 crc kubenswrapper[4902]: I1009 14:08:39.624539 4902 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d5c861f8da5b070c0e4b3d6c335c3aba68891aa4a007f0da761b62b7e0a3316"} err="failed to get container status \"2d5c861f8da5b070c0e4b3d6c335c3aba68891aa4a007f0da761b62b7e0a3316\": rpc error: code = NotFound desc = could not find container \"2d5c861f8da5b070c0e4b3d6c335c3aba68891aa4a007f0da761b62b7e0a3316\": container with ID starting with 2d5c861f8da5b070c0e4b3d6c335c3aba68891aa4a007f0da761b62b7e0a3316 not found: ID does not exist" Oct 09 14:08:39 crc kubenswrapper[4902]: I1009 14:08:39.624561 4902 scope.go:117] "RemoveContainer" containerID="5cbf066b375c2664054f3769fa00750340adf4f0cb1c5d7effcd635352a62ef4" Oct 09 14:08:39 crc kubenswrapper[4902]: I1009 14:08:39.627518 4902 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5cbf066b375c2664054f3769fa00750340adf4f0cb1c5d7effcd635352a62ef4"} err="failed to get container status \"5cbf066b375c2664054f3769fa00750340adf4f0cb1c5d7effcd635352a62ef4\": rpc error: code = NotFound desc = could not find container \"5cbf066b375c2664054f3769fa00750340adf4f0cb1c5d7effcd635352a62ef4\": container with ID starting with 5cbf066b375c2664054f3769fa00750340adf4f0cb1c5d7effcd635352a62ef4 not found: ID does not exist" Oct 09 14:08:39 crc kubenswrapper[4902]: I1009 14:08:39.627544 4902 scope.go:117] "RemoveContainer" containerID="2d5c861f8da5b070c0e4b3d6c335c3aba68891aa4a007f0da761b62b7e0a3316" Oct 09 14:08:39 crc kubenswrapper[4902]: I1009 14:08:39.631107 4902 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d5c861f8da5b070c0e4b3d6c335c3aba68891aa4a007f0da761b62b7e0a3316"} err="failed to get container status \"2d5c861f8da5b070c0e4b3d6c335c3aba68891aa4a007f0da761b62b7e0a3316\": rpc error: code = NotFound desc = could not find container \"2d5c861f8da5b070c0e4b3d6c335c3aba68891aa4a007f0da761b62b7e0a3316\": container with ID starting with 2d5c861f8da5b070c0e4b3d6c335c3aba68891aa4a007f0da761b62b7e0a3316 not found: ID does not exist" Oct 09 14:08:39 crc kubenswrapper[4902]: I1009 14:08:39.633245 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 09 14:08:39 crc kubenswrapper[4902]: E1009 14:08:39.633755 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="abbbf9c3-be8f-4199-9f68-626d18441730" containerName="dnsmasq-dns" Oct 09 14:08:39 crc kubenswrapper[4902]: I1009 14:08:39.633778 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="abbbf9c3-be8f-4199-9f68-626d18441730" containerName="dnsmasq-dns" Oct 09 14:08:39 crc kubenswrapper[4902]: E1009 14:08:39.633795 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="abbbf9c3-be8f-4199-9f68-626d18441730" containerName="init" Oct 09 14:08:39 crc kubenswrapper[4902]: I1009 14:08:39.633803 4902 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="abbbf9c3-be8f-4199-9f68-626d18441730" containerName="init" Oct 09 14:08:39 crc kubenswrapper[4902]: E1009 14:08:39.633826 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3459d33-1059-42ea-b9d2-fc268a58c7cf" containerName="nova-metadata-log" Oct 09 14:08:39 crc kubenswrapper[4902]: I1009 14:08:39.633834 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3459d33-1059-42ea-b9d2-fc268a58c7cf" containerName="nova-metadata-log" Oct 09 14:08:39 crc kubenswrapper[4902]: E1009 14:08:39.633846 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e9b526e-bd2b-4710-9225-669035576d7c" containerName="nova-manage" Oct 09 14:08:39 crc kubenswrapper[4902]: I1009 14:08:39.633853 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e9b526e-bd2b-4710-9225-669035576d7c" containerName="nova-manage" Oct 09 14:08:39 crc kubenswrapper[4902]: E1009 14:08:39.633884 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3459d33-1059-42ea-b9d2-fc268a58c7cf" containerName="nova-metadata-metadata" Oct 09 14:08:39 crc kubenswrapper[4902]: I1009 14:08:39.633892 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3459d33-1059-42ea-b9d2-fc268a58c7cf" containerName="nova-metadata-metadata" Oct 09 14:08:39 crc kubenswrapper[4902]: I1009 14:08:39.634137 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="b3459d33-1059-42ea-b9d2-fc268a58c7cf" containerName="nova-metadata-log" Oct 09 14:08:39 crc kubenswrapper[4902]: I1009 14:08:39.634159 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="b3459d33-1059-42ea-b9d2-fc268a58c7cf" containerName="nova-metadata-metadata" Oct 09 14:08:39 crc kubenswrapper[4902]: I1009 14:08:39.634176 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="abbbf9c3-be8f-4199-9f68-626d18441730" containerName="dnsmasq-dns" Oct 09 14:08:39 crc kubenswrapper[4902]: I1009 14:08:39.634195 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e9b526e-bd2b-4710-9225-669035576d7c" containerName="nova-manage" Oct 09 14:08:39 crc kubenswrapper[4902]: I1009 14:08:39.635436 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 09 14:08:39 crc kubenswrapper[4902]: I1009 14:08:39.639793 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Oct 09 14:08:39 crc kubenswrapper[4902]: I1009 14:08:39.640070 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 09 14:08:39 crc kubenswrapper[4902]: I1009 14:08:39.644344 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 09 14:08:39 crc kubenswrapper[4902]: I1009 14:08:39.758912 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d52f0e92-a10a-4111-98fb-f83da483d1f1-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"d52f0e92-a10a-4111-98fb-f83da483d1f1\") " pod="openstack/nova-metadata-0" Oct 09 14:08:39 crc kubenswrapper[4902]: I1009 14:08:39.758983 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d52f0e92-a10a-4111-98fb-f83da483d1f1-logs\") pod \"nova-metadata-0\" (UID: \"d52f0e92-a10a-4111-98fb-f83da483d1f1\") " pod="openstack/nova-metadata-0" Oct 09 14:08:39 crc kubenswrapper[4902]: I1009 14:08:39.759020 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d52f0e92-a10a-4111-98fb-f83da483d1f1-config-data\") pod \"nova-metadata-0\" (UID: \"d52f0e92-a10a-4111-98fb-f83da483d1f1\") " pod="openstack/nova-metadata-0" Oct 09 14:08:39 crc kubenswrapper[4902]: I1009 14:08:39.759155 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2vxxz\" (UniqueName: \"kubernetes.io/projected/d52f0e92-a10a-4111-98fb-f83da483d1f1-kube-api-access-2vxxz\") pod \"nova-metadata-0\" (UID: \"d52f0e92-a10a-4111-98fb-f83da483d1f1\") " pod="openstack/nova-metadata-0" Oct 09 14:08:39 crc kubenswrapper[4902]: I1009 14:08:39.759217 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/d52f0e92-a10a-4111-98fb-f83da483d1f1-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"d52f0e92-a10a-4111-98fb-f83da483d1f1\") " pod="openstack/nova-metadata-0" Oct 09 14:08:39 crc kubenswrapper[4902]: I1009 14:08:39.881573 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2vxxz\" (UniqueName: \"kubernetes.io/projected/d52f0e92-a10a-4111-98fb-f83da483d1f1-kube-api-access-2vxxz\") pod \"nova-metadata-0\" (UID: \"d52f0e92-a10a-4111-98fb-f83da483d1f1\") " pod="openstack/nova-metadata-0" Oct 09 14:08:39 crc kubenswrapper[4902]: I1009 14:08:39.881707 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/d52f0e92-a10a-4111-98fb-f83da483d1f1-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"d52f0e92-a10a-4111-98fb-f83da483d1f1\") " pod="openstack/nova-metadata-0" Oct 09 14:08:39 crc kubenswrapper[4902]: I1009 14:08:39.881761 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d52f0e92-a10a-4111-98fb-f83da483d1f1-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: 
\"d52f0e92-a10a-4111-98fb-f83da483d1f1\") " pod="openstack/nova-metadata-0" Oct 09 14:08:39 crc kubenswrapper[4902]: I1009 14:08:39.881789 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d52f0e92-a10a-4111-98fb-f83da483d1f1-logs\") pod \"nova-metadata-0\" (UID: \"d52f0e92-a10a-4111-98fb-f83da483d1f1\") " pod="openstack/nova-metadata-0" Oct 09 14:08:39 crc kubenswrapper[4902]: I1009 14:08:39.881819 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d52f0e92-a10a-4111-98fb-f83da483d1f1-config-data\") pod \"nova-metadata-0\" (UID: \"d52f0e92-a10a-4111-98fb-f83da483d1f1\") " pod="openstack/nova-metadata-0" Oct 09 14:08:39 crc kubenswrapper[4902]: I1009 14:08:39.890311 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d52f0e92-a10a-4111-98fb-f83da483d1f1-logs\") pod \"nova-metadata-0\" (UID: \"d52f0e92-a10a-4111-98fb-f83da483d1f1\") " pod="openstack/nova-metadata-0" Oct 09 14:08:39 crc kubenswrapper[4902]: I1009 14:08:39.912220 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d52f0e92-a10a-4111-98fb-f83da483d1f1-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"d52f0e92-a10a-4111-98fb-f83da483d1f1\") " pod="openstack/nova-metadata-0" Oct 09 14:08:39 crc kubenswrapper[4902]: I1009 14:08:39.921182 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d52f0e92-a10a-4111-98fb-f83da483d1f1-config-data\") pod \"nova-metadata-0\" (UID: \"d52f0e92-a10a-4111-98fb-f83da483d1f1\") " pod="openstack/nova-metadata-0" Oct 09 14:08:39 crc kubenswrapper[4902]: I1009 14:08:39.921696 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/d52f0e92-a10a-4111-98fb-f83da483d1f1-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"d52f0e92-a10a-4111-98fb-f83da483d1f1\") " pod="openstack/nova-metadata-0" Oct 09 14:08:39 crc kubenswrapper[4902]: I1009 14:08:39.948010 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2vxxz\" (UniqueName: \"kubernetes.io/projected/d52f0e92-a10a-4111-98fb-f83da483d1f1-kube-api-access-2vxxz\") pod \"nova-metadata-0\" (UID: \"d52f0e92-a10a-4111-98fb-f83da483d1f1\") " pod="openstack/nova-metadata-0" Oct 09 14:08:39 crc kubenswrapper[4902]: I1009 14:08:39.972836 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 09 14:08:40 crc kubenswrapper[4902]: I1009 14:08:40.044460 4902 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-ssjrs" Oct 09 14:08:40 crc kubenswrapper[4902]: I1009 14:08:40.194314 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bbcfn\" (UniqueName: \"kubernetes.io/projected/cc21b931-be3c-442f-b886-3f245f88e079-kube-api-access-bbcfn\") pod \"cc21b931-be3c-442f-b886-3f245f88e079\" (UID: \"cc21b931-be3c-442f-b886-3f245f88e079\") " Oct 09 14:08:40 crc kubenswrapper[4902]: I1009 14:08:40.194598 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc21b931-be3c-442f-b886-3f245f88e079-combined-ca-bundle\") pod \"cc21b931-be3c-442f-b886-3f245f88e079\" (UID: \"cc21b931-be3c-442f-b886-3f245f88e079\") " Oct 09 14:08:40 crc kubenswrapper[4902]: I1009 14:08:40.194627 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cc21b931-be3c-442f-b886-3f245f88e079-scripts\") pod \"cc21b931-be3c-442f-b886-3f245f88e079\" (UID: \"cc21b931-be3c-442f-b886-3f245f88e079\") " Oct 09 14:08:40 crc kubenswrapper[4902]: I1009 14:08:40.194799 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc21b931-be3c-442f-b886-3f245f88e079-config-data\") pod \"cc21b931-be3c-442f-b886-3f245f88e079\" (UID: \"cc21b931-be3c-442f-b886-3f245f88e079\") " Oct 09 14:08:40 crc kubenswrapper[4902]: I1009 14:08:40.198930 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc21b931-be3c-442f-b886-3f245f88e079-scripts" (OuterVolumeSpecName: "scripts") pod "cc21b931-be3c-442f-b886-3f245f88e079" (UID: "cc21b931-be3c-442f-b886-3f245f88e079"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:08:40 crc kubenswrapper[4902]: I1009 14:08:40.199770 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc21b931-be3c-442f-b886-3f245f88e079-kube-api-access-bbcfn" (OuterVolumeSpecName: "kube-api-access-bbcfn") pod "cc21b931-be3c-442f-b886-3f245f88e079" (UID: "cc21b931-be3c-442f-b886-3f245f88e079"). InnerVolumeSpecName "kube-api-access-bbcfn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 14:08:40 crc kubenswrapper[4902]: I1009 14:08:40.221656 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc21b931-be3c-442f-b886-3f245f88e079-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cc21b931-be3c-442f-b886-3f245f88e079" (UID: "cc21b931-be3c-442f-b886-3f245f88e079"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:08:40 crc kubenswrapper[4902]: I1009 14:08:40.227917 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc21b931-be3c-442f-b886-3f245f88e079-config-data" (OuterVolumeSpecName: "config-data") pod "cc21b931-be3c-442f-b886-3f245f88e079" (UID: "cc21b931-be3c-442f-b886-3f245f88e079"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:08:40 crc kubenswrapper[4902]: I1009 14:08:40.297613 4902 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc21b931-be3c-442f-b886-3f245f88e079-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 14:08:40 crc kubenswrapper[4902]: I1009 14:08:40.297650 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bbcfn\" (UniqueName: \"kubernetes.io/projected/cc21b931-be3c-442f-b886-3f245f88e079-kube-api-access-bbcfn\") on node \"crc\" DevicePath \"\"" Oct 09 14:08:40 crc kubenswrapper[4902]: I1009 14:08:40.297659 4902 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc21b931-be3c-442f-b886-3f245f88e079-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 14:08:40 crc kubenswrapper[4902]: I1009 14:08:40.297669 4902 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cc21b931-be3c-442f-b886-3f245f88e079-scripts\") on node \"crc\" DevicePath \"\"" Oct 09 14:08:40 crc kubenswrapper[4902]: I1009 14:08:40.435029 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 09 14:08:40 crc kubenswrapper[4902]: I1009 14:08:40.549129 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d52f0e92-a10a-4111-98fb-f83da483d1f1","Type":"ContainerStarted","Data":"d5e6934f25710c9526192e44e4d1293f9f3a281032e46db8d17a7818815e35b9"} Oct 09 14:08:40 crc kubenswrapper[4902]: I1009 14:08:40.556288 4902 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="8b42ec27-52a6-459c-a266-7a6dba2acfab" containerName="nova-scheduler-scheduler" containerID="cri-o://56dd803ec4adc0c384f83e0033d424c96bbaf99dfb40425f5af56e635b40bc62" gracePeriod=30 Oct 09 14:08:40 crc kubenswrapper[4902]: I1009 14:08:40.556591 4902 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-ssjrs" Oct 09 14:08:40 crc kubenswrapper[4902]: I1009 14:08:40.560531 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-ssjrs" event={"ID":"cc21b931-be3c-442f-b886-3f245f88e079","Type":"ContainerDied","Data":"93e2d9accdaf257fc7106dc11f01a138659844fcf518d17d7402487982879669"} Oct 09 14:08:40 crc kubenswrapper[4902]: I1009 14:08:40.560710 4902 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="93e2d9accdaf257fc7106dc11f01a138659844fcf518d17d7402487982879669" Oct 09 14:08:40 crc kubenswrapper[4902]: I1009 14:08:40.620373 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 09 14:08:40 crc kubenswrapper[4902]: E1009 14:08:40.620807 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc21b931-be3c-442f-b886-3f245f88e079" containerName="nova-cell1-conductor-db-sync" Oct 09 14:08:40 crc kubenswrapper[4902]: I1009 14:08:40.620827 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc21b931-be3c-442f-b886-3f245f88e079" containerName="nova-cell1-conductor-db-sync" Oct 09 14:08:40 crc kubenswrapper[4902]: I1009 14:08:40.620987 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc21b931-be3c-442f-b886-3f245f88e079" containerName="nova-cell1-conductor-db-sync" Oct 09 14:08:40 crc kubenswrapper[4902]: I1009 14:08:40.621782 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 09 14:08:40 crc kubenswrapper[4902]: I1009 14:08:40.624000 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Oct 09 14:08:40 crc kubenswrapper[4902]: I1009 14:08:40.637699 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 09 14:08:40 crc kubenswrapper[4902]: I1009 14:08:40.703053 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3fe652a7-7ee4-4a55-9f17-a359b82df106-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"3fe652a7-7ee4-4a55-9f17-a359b82df106\") " pod="openstack/nova-cell1-conductor-0" Oct 09 14:08:40 crc kubenswrapper[4902]: I1009 14:08:40.703187 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bn8z6\" (UniqueName: \"kubernetes.io/projected/3fe652a7-7ee4-4a55-9f17-a359b82df106-kube-api-access-bn8z6\") pod \"nova-cell1-conductor-0\" (UID: \"3fe652a7-7ee4-4a55-9f17-a359b82df106\") " pod="openstack/nova-cell1-conductor-0" Oct 09 14:08:40 crc kubenswrapper[4902]: I1009 14:08:40.703212 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3fe652a7-7ee4-4a55-9f17-a359b82df106-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"3fe652a7-7ee4-4a55-9f17-a359b82df106\") " pod="openstack/nova-cell1-conductor-0" Oct 09 14:08:40 crc kubenswrapper[4902]: I1009 14:08:40.804970 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bn8z6\" (UniqueName: \"kubernetes.io/projected/3fe652a7-7ee4-4a55-9f17-a359b82df106-kube-api-access-bn8z6\") pod \"nova-cell1-conductor-0\" (UID: \"3fe652a7-7ee4-4a55-9f17-a359b82df106\") " pod="openstack/nova-cell1-conductor-0" Oct 09 14:08:40 crc kubenswrapper[4902]: I1009 
14:08:40.805040 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3fe652a7-7ee4-4a55-9f17-a359b82df106-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"3fe652a7-7ee4-4a55-9f17-a359b82df106\") " pod="openstack/nova-cell1-conductor-0" Oct 09 14:08:40 crc kubenswrapper[4902]: I1009 14:08:40.805137 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3fe652a7-7ee4-4a55-9f17-a359b82df106-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"3fe652a7-7ee4-4a55-9f17-a359b82df106\") " pod="openstack/nova-cell1-conductor-0" Oct 09 14:08:40 crc kubenswrapper[4902]: I1009 14:08:40.809516 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3fe652a7-7ee4-4a55-9f17-a359b82df106-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"3fe652a7-7ee4-4a55-9f17-a359b82df106\") " pod="openstack/nova-cell1-conductor-0" Oct 09 14:08:40 crc kubenswrapper[4902]: I1009 14:08:40.809669 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3fe652a7-7ee4-4a55-9f17-a359b82df106-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"3fe652a7-7ee4-4a55-9f17-a359b82df106\") " pod="openstack/nova-cell1-conductor-0" Oct 09 14:08:40 crc kubenswrapper[4902]: I1009 14:08:40.823684 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bn8z6\" (UniqueName: \"kubernetes.io/projected/3fe652a7-7ee4-4a55-9f17-a359b82df106-kube-api-access-bn8z6\") pod \"nova-cell1-conductor-0\" (UID: \"3fe652a7-7ee4-4a55-9f17-a359b82df106\") " pod="openstack/nova-cell1-conductor-0" Oct 09 14:08:40 crc kubenswrapper[4902]: I1009 14:08:40.944915 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 09 14:08:41 crc kubenswrapper[4902]: I1009 14:08:41.402517 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 09 14:08:41 crc kubenswrapper[4902]: W1009 14:08:41.410201 4902 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3fe652a7_7ee4_4a55_9f17_a359b82df106.slice/crio-420ffa9e2f5a75e63784512c4fe3002e4bcc338d46f25a4ee1d7858608c5c667 WatchSource:0}: Error finding container 420ffa9e2f5a75e63784512c4fe3002e4bcc338d46f25a4ee1d7858608c5c667: Status 404 returned error can't find the container with id 420ffa9e2f5a75e63784512c4fe3002e4bcc338d46f25a4ee1d7858608c5c667 Oct 09 14:08:41 crc kubenswrapper[4902]: I1009 14:08:41.523434 4902 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b3459d33-1059-42ea-b9d2-fc268a58c7cf" path="/var/lib/kubelet/pods/b3459d33-1059-42ea-b9d2-fc268a58c7cf/volumes" Oct 09 14:08:41 crc kubenswrapper[4902]: I1009 14:08:41.568103 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"3fe652a7-7ee4-4a55-9f17-a359b82df106","Type":"ContainerStarted","Data":"420ffa9e2f5a75e63784512c4fe3002e4bcc338d46f25a4ee1d7858608c5c667"} Oct 09 14:08:41 crc kubenswrapper[4902]: I1009 14:08:41.570782 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d52f0e92-a10a-4111-98fb-f83da483d1f1","Type":"ContainerStarted","Data":"d92c243b1c15acea6f0efd8f257c08f1bf38b6ed3578786c36d3ae45fafe0b93"} Oct 09 14:08:41 crc kubenswrapper[4902]: I1009 14:08:41.570808 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d52f0e92-a10a-4111-98fb-f83da483d1f1","Type":"ContainerStarted","Data":"aff06d1879c9e009f041ef127836daa5bae2af584f89fe6d1600844eb9b77add"} Oct 09 14:08:41 crc kubenswrapper[4902]: I1009 14:08:41.602665 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.602638026 podStartE2EDuration="2.602638026s" podCreationTimestamp="2025-10-09 14:08:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 14:08:41.59903292 +0000 UTC m=+1068.796892014" watchObservedRunningTime="2025-10-09 14:08:41.602638026 +0000 UTC m=+1068.800497100" Oct 09 14:08:41 crc kubenswrapper[4902]: E1009 14:08:41.898253 4902 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="56dd803ec4adc0c384f83e0033d424c96bbaf99dfb40425f5af56e635b40bc62" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 09 14:08:41 crc kubenswrapper[4902]: E1009 14:08:41.900511 4902 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="56dd803ec4adc0c384f83e0033d424c96bbaf99dfb40425f5af56e635b40bc62" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 09 14:08:41 crc kubenswrapper[4902]: E1009 14:08:41.902146 4902 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" 
containerID="56dd803ec4adc0c384f83e0033d424c96bbaf99dfb40425f5af56e635b40bc62" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 09 14:08:41 crc kubenswrapper[4902]: E1009 14:08:41.902181 4902 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="8b42ec27-52a6-459c-a266-7a6dba2acfab" containerName="nova-scheduler-scheduler" Oct 09 14:08:42 crc kubenswrapper[4902]: I1009 14:08:42.586401 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"3fe652a7-7ee4-4a55-9f17-a359b82df106","Type":"ContainerStarted","Data":"b8645c4ec4737efa877ba783e8981bf8c7ee0c431834bb74dc93eb006279379a"} Oct 09 14:08:42 crc kubenswrapper[4902]: I1009 14:08:42.586967 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Oct 09 14:08:42 crc kubenswrapper[4902]: I1009 14:08:42.612804 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.612784393 podStartE2EDuration="2.612784393s" podCreationTimestamp="2025-10-09 14:08:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 14:08:42.606063726 +0000 UTC m=+1069.803922800" watchObservedRunningTime="2025-10-09 14:08:42.612784393 +0000 UTC m=+1069.810643467" Oct 09 14:08:43 crc kubenswrapper[4902]: I1009 14:08:43.593848 4902 generic.go:334] "Generic (PLEG): container finished" podID="8b42ec27-52a6-459c-a266-7a6dba2acfab" containerID="56dd803ec4adc0c384f83e0033d424c96bbaf99dfb40425f5af56e635b40bc62" exitCode=0 Oct 09 14:08:43 crc kubenswrapper[4902]: I1009 14:08:43.594587 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"8b42ec27-52a6-459c-a266-7a6dba2acfab","Type":"ContainerDied","Data":"56dd803ec4adc0c384f83e0033d424c96bbaf99dfb40425f5af56e635b40bc62"} Oct 09 14:08:44 crc kubenswrapper[4902]: I1009 14:08:44.364664 4902 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 09 14:08:44 crc kubenswrapper[4902]: I1009 14:08:44.491931 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b42ec27-52a6-459c-a266-7a6dba2acfab-config-data\") pod \"8b42ec27-52a6-459c-a266-7a6dba2acfab\" (UID: \"8b42ec27-52a6-459c-a266-7a6dba2acfab\") " Oct 09 14:08:44 crc kubenswrapper[4902]: I1009 14:08:44.491984 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b42ec27-52a6-459c-a266-7a6dba2acfab-combined-ca-bundle\") pod \"8b42ec27-52a6-459c-a266-7a6dba2acfab\" (UID: \"8b42ec27-52a6-459c-a266-7a6dba2acfab\") " Oct 09 14:08:44 crc kubenswrapper[4902]: I1009 14:08:44.492068 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b2d79\" (UniqueName: \"kubernetes.io/projected/8b42ec27-52a6-459c-a266-7a6dba2acfab-kube-api-access-b2d79\") pod \"8b42ec27-52a6-459c-a266-7a6dba2acfab\" (UID: \"8b42ec27-52a6-459c-a266-7a6dba2acfab\") " Oct 09 14:08:44 crc kubenswrapper[4902]: I1009 14:08:44.502487 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b42ec27-52a6-459c-a266-7a6dba2acfab-kube-api-access-b2d79" (OuterVolumeSpecName: "kube-api-access-b2d79") pod "8b42ec27-52a6-459c-a266-7a6dba2acfab" (UID: "8b42ec27-52a6-459c-a266-7a6dba2acfab"). InnerVolumeSpecName "kube-api-access-b2d79". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 14:08:44 crc kubenswrapper[4902]: I1009 14:08:44.530937 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b42ec27-52a6-459c-a266-7a6dba2acfab-config-data" (OuterVolumeSpecName: "config-data") pod "8b42ec27-52a6-459c-a266-7a6dba2acfab" (UID: "8b42ec27-52a6-459c-a266-7a6dba2acfab"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:08:44 crc kubenswrapper[4902]: I1009 14:08:44.532733 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b42ec27-52a6-459c-a266-7a6dba2acfab-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8b42ec27-52a6-459c-a266-7a6dba2acfab" (UID: "8b42ec27-52a6-459c-a266-7a6dba2acfab"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:08:44 crc kubenswrapper[4902]: I1009 14:08:44.594002 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b2d79\" (UniqueName: \"kubernetes.io/projected/8b42ec27-52a6-459c-a266-7a6dba2acfab-kube-api-access-b2d79\") on node \"crc\" DevicePath \"\"" Oct 09 14:08:44 crc kubenswrapper[4902]: I1009 14:08:44.595047 4902 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b42ec27-52a6-459c-a266-7a6dba2acfab-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 14:08:44 crc kubenswrapper[4902]: I1009 14:08:44.595146 4902 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b42ec27-52a6-459c-a266-7a6dba2acfab-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 14:08:44 crc kubenswrapper[4902]: I1009 14:08:44.605215 4902 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 09 14:08:44 crc kubenswrapper[4902]: I1009 14:08:44.616894 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"8b42ec27-52a6-459c-a266-7a6dba2acfab","Type":"ContainerDied","Data":"fef3c1693a25564277ec5da1f63d050c9b6b7d4265f389a6cc55821d21bf8901"} Oct 09 14:08:44 crc kubenswrapper[4902]: I1009 14:08:44.617363 4902 scope.go:117] "RemoveContainer" containerID="56dd803ec4adc0c384f83e0033d424c96bbaf99dfb40425f5af56e635b40bc62" Oct 09 14:08:44 crc kubenswrapper[4902]: I1009 14:08:44.617590 4902 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 09 14:08:44 crc kubenswrapper[4902]: I1009 14:08:44.626924 4902 generic.go:334] "Generic (PLEG): container finished" podID="c2cd73e9-ca3d-4916-b5f5-06e4599ebe27" containerID="870853645e5414b62c8f3f24ea5bff10fbbf90cdbf43eb25db723578c4bc4c2e" exitCode=0 Oct 09 14:08:44 crc kubenswrapper[4902]: I1009 14:08:44.627185 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c2cd73e9-ca3d-4916-b5f5-06e4599ebe27","Type":"ContainerDied","Data":"870853645e5414b62c8f3f24ea5bff10fbbf90cdbf43eb25db723578c4bc4c2e"} Oct 09 14:08:44 crc kubenswrapper[4902]: I1009 14:08:44.627304 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c2cd73e9-ca3d-4916-b5f5-06e4599ebe27","Type":"ContainerDied","Data":"dee094ebc32ad29b90aa09dfecbd343d40228ef897e93765112b07dde6439eef"} Oct 09 14:08:44 crc kubenswrapper[4902]: I1009 14:08:44.627440 4902 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 09 14:08:44 crc kubenswrapper[4902]: I1009 14:08:44.662284 4902 scope.go:117] "RemoveContainer" containerID="870853645e5414b62c8f3f24ea5bff10fbbf90cdbf43eb25db723578c4bc4c2e" Oct 09 14:08:44 crc kubenswrapper[4902]: I1009 14:08:44.682904 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 09 14:08:44 crc kubenswrapper[4902]: I1009 14:08:44.696009 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2cd73e9-ca3d-4916-b5f5-06e4599ebe27-combined-ca-bundle\") pod \"c2cd73e9-ca3d-4916-b5f5-06e4599ebe27\" (UID: \"c2cd73e9-ca3d-4916-b5f5-06e4599ebe27\") " Oct 09 14:08:44 crc kubenswrapper[4902]: I1009 14:08:44.696149 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jwh7j\" (UniqueName: \"kubernetes.io/projected/c2cd73e9-ca3d-4916-b5f5-06e4599ebe27-kube-api-access-jwh7j\") pod \"c2cd73e9-ca3d-4916-b5f5-06e4599ebe27\" (UID: \"c2cd73e9-ca3d-4916-b5f5-06e4599ebe27\") " Oct 09 14:08:44 crc kubenswrapper[4902]: I1009 14:08:44.696301 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2cd73e9-ca3d-4916-b5f5-06e4599ebe27-config-data\") pod \"c2cd73e9-ca3d-4916-b5f5-06e4599ebe27\" (UID: \"c2cd73e9-ca3d-4916-b5f5-06e4599ebe27\") " Oct 09 14:08:44 crc kubenswrapper[4902]: I1009 14:08:44.696348 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c2cd73e9-ca3d-4916-b5f5-06e4599ebe27-logs\") pod \"c2cd73e9-ca3d-4916-b5f5-06e4599ebe27\" (UID: \"c2cd73e9-ca3d-4916-b5f5-06e4599ebe27\") " Oct 09 14:08:44 crc kubenswrapper[4902]: I1009 14:08:44.697261 4902 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c2cd73e9-ca3d-4916-b5f5-06e4599ebe27-logs" (OuterVolumeSpecName: "logs") pod "c2cd73e9-ca3d-4916-b5f5-06e4599ebe27" (UID: "c2cd73e9-ca3d-4916-b5f5-06e4599ebe27"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 14:08:44 crc kubenswrapper[4902]: I1009 14:08:44.705314 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2cd73e9-ca3d-4916-b5f5-06e4599ebe27-kube-api-access-jwh7j" (OuterVolumeSpecName: "kube-api-access-jwh7j") pod "c2cd73e9-ca3d-4916-b5f5-06e4599ebe27" (UID: "c2cd73e9-ca3d-4916-b5f5-06e4599ebe27"). InnerVolumeSpecName "kube-api-access-jwh7j". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 14:08:44 crc kubenswrapper[4902]: I1009 14:08:44.707963 4902 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Oct 09 14:08:44 crc kubenswrapper[4902]: I1009 14:08:44.723937 4902 scope.go:117] "RemoveContainer" containerID="85f2795ee28fe87103b924cc9ae2b4f2f844306074987dc7fdd1bc7cfcd3e3cb" Oct 09 14:08:44 crc kubenswrapper[4902]: I1009 14:08:44.741572 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 09 14:08:44 crc kubenswrapper[4902]: E1009 14:08:44.742940 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2cd73e9-ca3d-4916-b5f5-06e4599ebe27" containerName="nova-api-log" Oct 09 14:08:44 crc kubenswrapper[4902]: I1009 14:08:44.742987 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2cd73e9-ca3d-4916-b5f5-06e4599ebe27" containerName="nova-api-log" Oct 09 14:08:44 crc kubenswrapper[4902]: E1009 14:08:44.743038 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b42ec27-52a6-459c-a266-7a6dba2acfab" containerName="nova-scheduler-scheduler" Oct 09 14:08:44 crc kubenswrapper[4902]: I1009 14:08:44.743044 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b42ec27-52a6-459c-a266-7a6dba2acfab" containerName="nova-scheduler-scheduler" Oct 09 14:08:44 crc kubenswrapper[4902]: E1009 14:08:44.743063 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2cd73e9-ca3d-4916-b5f5-06e4599ebe27" containerName="nova-api-api" Oct 09 14:08:44 crc kubenswrapper[4902]: I1009 14:08:44.743070 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2cd73e9-ca3d-4916-b5f5-06e4599ebe27" containerName="nova-api-api" Oct 09 14:08:44 crc kubenswrapper[4902]: I1009 14:08:44.743454 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2cd73e9-ca3d-4916-b5f5-06e4599ebe27" containerName="nova-api-api" Oct 09 14:08:44 crc kubenswrapper[4902]: I1009 14:08:44.743487 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b42ec27-52a6-459c-a266-7a6dba2acfab" containerName="nova-scheduler-scheduler" Oct 09 14:08:44 crc kubenswrapper[4902]: I1009 14:08:44.743501 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2cd73e9-ca3d-4916-b5f5-06e4599ebe27" containerName="nova-api-log" Oct 09 14:08:44 crc kubenswrapper[4902]: I1009 14:08:44.746380 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 09 14:08:44 crc kubenswrapper[4902]: I1009 14:08:44.751573 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 09 14:08:44 crc kubenswrapper[4902]: I1009 14:08:44.757931 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 09 14:08:44 crc kubenswrapper[4902]: I1009 14:08:44.760954 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2cd73e9-ca3d-4916-b5f5-06e4599ebe27-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c2cd73e9-ca3d-4916-b5f5-06e4599ebe27" (UID: "c2cd73e9-ca3d-4916-b5f5-06e4599ebe27"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:08:44 crc kubenswrapper[4902]: I1009 14:08:44.772619 4902 scope.go:117] "RemoveContainer" containerID="870853645e5414b62c8f3f24ea5bff10fbbf90cdbf43eb25db723578c4bc4c2e" Oct 09 14:08:44 crc kubenswrapper[4902]: E1009 14:08:44.774641 4902 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"870853645e5414b62c8f3f24ea5bff10fbbf90cdbf43eb25db723578c4bc4c2e\": container with ID starting with 870853645e5414b62c8f3f24ea5bff10fbbf90cdbf43eb25db723578c4bc4c2e not found: ID does not exist" containerID="870853645e5414b62c8f3f24ea5bff10fbbf90cdbf43eb25db723578c4bc4c2e" Oct 09 14:08:44 crc kubenswrapper[4902]: I1009 14:08:44.774695 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2cd73e9-ca3d-4916-b5f5-06e4599ebe27-config-data" (OuterVolumeSpecName: "config-data") pod "c2cd73e9-ca3d-4916-b5f5-06e4599ebe27" (UID: "c2cd73e9-ca3d-4916-b5f5-06e4599ebe27"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:08:44 crc kubenswrapper[4902]: I1009 14:08:44.774696 4902 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"870853645e5414b62c8f3f24ea5bff10fbbf90cdbf43eb25db723578c4bc4c2e"} err="failed to get container status \"870853645e5414b62c8f3f24ea5bff10fbbf90cdbf43eb25db723578c4bc4c2e\": rpc error: code = NotFound desc = could not find container \"870853645e5414b62c8f3f24ea5bff10fbbf90cdbf43eb25db723578c4bc4c2e\": container with ID starting with 870853645e5414b62c8f3f24ea5bff10fbbf90cdbf43eb25db723578c4bc4c2e not found: ID does not exist" Oct 09 14:08:44 crc kubenswrapper[4902]: I1009 14:08:44.774751 4902 scope.go:117] "RemoveContainer" containerID="85f2795ee28fe87103b924cc9ae2b4f2f844306074987dc7fdd1bc7cfcd3e3cb" Oct 09 14:08:44 crc kubenswrapper[4902]: E1009 14:08:44.777594 4902 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"85f2795ee28fe87103b924cc9ae2b4f2f844306074987dc7fdd1bc7cfcd3e3cb\": container with ID starting with 85f2795ee28fe87103b924cc9ae2b4f2f844306074987dc7fdd1bc7cfcd3e3cb not found: ID does not exist" containerID="85f2795ee28fe87103b924cc9ae2b4f2f844306074987dc7fdd1bc7cfcd3e3cb" Oct 09 14:08:44 crc kubenswrapper[4902]: I1009 14:08:44.777643 4902 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85f2795ee28fe87103b924cc9ae2b4f2f844306074987dc7fdd1bc7cfcd3e3cb"} err="failed to get container status \"85f2795ee28fe87103b924cc9ae2b4f2f844306074987dc7fdd1bc7cfcd3e3cb\": rpc error: code = NotFound desc = could not find container \"85f2795ee28fe87103b924cc9ae2b4f2f844306074987dc7fdd1bc7cfcd3e3cb\": container with ID starting with 85f2795ee28fe87103b924cc9ae2b4f2f844306074987dc7fdd1bc7cfcd3e3cb not found: ID does not exist" Oct 09 14:08:44 crc kubenswrapper[4902]: I1009 14:08:44.798599 4902 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c2cd73e9-ca3d-4916-b5f5-06e4599ebe27-logs\") on node \"crc\" DevicePath \"\"" Oct 09 14:08:44 crc kubenswrapper[4902]: I1009 14:08:44.798811 4902 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2cd73e9-ca3d-4916-b5f5-06e4599ebe27-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 14:08:44 crc kubenswrapper[4902]: I1009 14:08:44.798889 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jwh7j\" (UniqueName: \"kubernetes.io/projected/c2cd73e9-ca3d-4916-b5f5-06e4599ebe27-kube-api-access-jwh7j\") on node \"crc\" DevicePath \"\"" Oct 09 14:08:44 crc kubenswrapper[4902]: I1009 14:08:44.798998 4902 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2cd73e9-ca3d-4916-b5f5-06e4599ebe27-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 14:08:44 crc kubenswrapper[4902]: I1009 14:08:44.900789 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d8lss\" (UniqueName: \"kubernetes.io/projected/d9050747-0e34-4263-b886-00511d489a2f-kube-api-access-d8lss\") pod \"nova-scheduler-0\" (UID: \"d9050747-0e34-4263-b886-00511d489a2f\") " pod="openstack/nova-scheduler-0" Oct 09 14:08:44 crc kubenswrapper[4902]: I1009 14:08:44.900873 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/d9050747-0e34-4263-b886-00511d489a2f-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"d9050747-0e34-4263-b886-00511d489a2f\") " pod="openstack/nova-scheduler-0" Oct 09 14:08:44 crc kubenswrapper[4902]: I1009 14:08:44.900986 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9050747-0e34-4263-b886-00511d489a2f-config-data\") pod \"nova-scheduler-0\" (UID: \"d9050747-0e34-4263-b886-00511d489a2f\") " pod="openstack/nova-scheduler-0" Oct 09 14:08:44 crc kubenswrapper[4902]: I1009 14:08:44.965947 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 09 14:08:44 crc kubenswrapper[4902]: I1009 14:08:44.973913 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 09 14:08:44 crc kubenswrapper[4902]: I1009 14:08:44.974222 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 09 14:08:44 crc kubenswrapper[4902]: I1009 14:08:44.978451 4902 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 09 14:08:44 crc kubenswrapper[4902]: I1009 14:08:44.988591 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 09 14:08:44 crc kubenswrapper[4902]: I1009 14:08:44.990399 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 09 14:08:44 crc kubenswrapper[4902]: I1009 14:08:44.992654 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 09 14:08:44 crc kubenswrapper[4902]: I1009 14:08:44.998227 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 09 14:08:45 crc kubenswrapper[4902]: I1009 14:08:45.003034 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9050747-0e34-4263-b886-00511d489a2f-config-data\") pod \"nova-scheduler-0\" (UID: \"d9050747-0e34-4263-b886-00511d489a2f\") " pod="openstack/nova-scheduler-0" Oct 09 14:08:45 crc kubenswrapper[4902]: I1009 14:08:45.003127 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d8lss\" (UniqueName: \"kubernetes.io/projected/d9050747-0e34-4263-b886-00511d489a2f-kube-api-access-d8lss\") pod \"nova-scheduler-0\" (UID: \"d9050747-0e34-4263-b886-00511d489a2f\") " pod="openstack/nova-scheduler-0" Oct 09 14:08:45 crc kubenswrapper[4902]: I1009 14:08:45.003224 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9050747-0e34-4263-b886-00511d489a2f-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"d9050747-0e34-4263-b886-00511d489a2f\") " pod="openstack/nova-scheduler-0" Oct 09 14:08:45 crc kubenswrapper[4902]: I1009 14:08:45.008299 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9050747-0e34-4263-b886-00511d489a2f-config-data\") pod \"nova-scheduler-0\" (UID: \"d9050747-0e34-4263-b886-00511d489a2f\") " pod="openstack/nova-scheduler-0" Oct 09 14:08:45 crc kubenswrapper[4902]: I1009 14:08:45.008998 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9050747-0e34-4263-b886-00511d489a2f-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: 
\"d9050747-0e34-4263-b886-00511d489a2f\") " pod="openstack/nova-scheduler-0" Oct 09 14:08:45 crc kubenswrapper[4902]: I1009 14:08:45.030906 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d8lss\" (UniqueName: \"kubernetes.io/projected/d9050747-0e34-4263-b886-00511d489a2f-kube-api-access-d8lss\") pod \"nova-scheduler-0\" (UID: \"d9050747-0e34-4263-b886-00511d489a2f\") " pod="openstack/nova-scheduler-0" Oct 09 14:08:45 crc kubenswrapper[4902]: I1009 14:08:45.105307 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef02f023-650b-4ffb-8781-55f905a5490d-config-data\") pod \"nova-api-0\" (UID: \"ef02f023-650b-4ffb-8781-55f905a5490d\") " pod="openstack/nova-api-0" Oct 09 14:08:45 crc kubenswrapper[4902]: I1009 14:08:45.105363 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ef02f023-650b-4ffb-8781-55f905a5490d-logs\") pod \"nova-api-0\" (UID: \"ef02f023-650b-4ffb-8781-55f905a5490d\") " pod="openstack/nova-api-0" Oct 09 14:08:45 crc kubenswrapper[4902]: I1009 14:08:45.105718 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-phq47\" (UniqueName: \"kubernetes.io/projected/ef02f023-650b-4ffb-8781-55f905a5490d-kube-api-access-phq47\") pod \"nova-api-0\" (UID: \"ef02f023-650b-4ffb-8781-55f905a5490d\") " pod="openstack/nova-api-0" Oct 09 14:08:45 crc kubenswrapper[4902]: I1009 14:08:45.105788 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef02f023-650b-4ffb-8781-55f905a5490d-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ef02f023-650b-4ffb-8781-55f905a5490d\") " pod="openstack/nova-api-0" Oct 09 14:08:45 crc kubenswrapper[4902]: I1009 14:08:45.139080 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 09 14:08:45 crc kubenswrapper[4902]: I1009 14:08:45.207903 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef02f023-650b-4ffb-8781-55f905a5490d-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ef02f023-650b-4ffb-8781-55f905a5490d\") " pod="openstack/nova-api-0" Oct 09 14:08:45 crc kubenswrapper[4902]: I1009 14:08:45.208025 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef02f023-650b-4ffb-8781-55f905a5490d-config-data\") pod \"nova-api-0\" (UID: \"ef02f023-650b-4ffb-8781-55f905a5490d\") " pod="openstack/nova-api-0" Oct 09 14:08:45 crc kubenswrapper[4902]: I1009 14:08:45.208052 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ef02f023-650b-4ffb-8781-55f905a5490d-logs\") pod \"nova-api-0\" (UID: \"ef02f023-650b-4ffb-8781-55f905a5490d\") " pod="openstack/nova-api-0" Oct 09 14:08:45 crc kubenswrapper[4902]: I1009 14:08:45.208166 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-phq47\" (UniqueName: \"kubernetes.io/projected/ef02f023-650b-4ffb-8781-55f905a5490d-kube-api-access-phq47\") pod \"nova-api-0\" (UID: \"ef02f023-650b-4ffb-8781-55f905a5490d\") " pod="openstack/nova-api-0" Oct 09 14:08:45 crc kubenswrapper[4902]: I1009 14:08:45.209079 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ef02f023-650b-4ffb-8781-55f905a5490d-logs\") pod \"nova-api-0\" (UID: \"ef02f023-650b-4ffb-8781-55f905a5490d\") " pod="openstack/nova-api-0" Oct 09 14:08:45 crc kubenswrapper[4902]: I1009 14:08:45.213702 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef02f023-650b-4ffb-8781-55f905a5490d-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ef02f023-650b-4ffb-8781-55f905a5490d\") " pod="openstack/nova-api-0" Oct 09 14:08:45 crc kubenswrapper[4902]: I1009 14:08:45.216212 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef02f023-650b-4ffb-8781-55f905a5490d-config-data\") pod \"nova-api-0\" (UID: \"ef02f023-650b-4ffb-8781-55f905a5490d\") " pod="openstack/nova-api-0" Oct 09 14:08:45 crc kubenswrapper[4902]: I1009 14:08:45.229697 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-phq47\" (UniqueName: \"kubernetes.io/projected/ef02f023-650b-4ffb-8781-55f905a5490d-kube-api-access-phq47\") pod \"nova-api-0\" (UID: \"ef02f023-650b-4ffb-8781-55f905a5490d\") " pod="openstack/nova-api-0" Oct 09 14:08:45 crc kubenswrapper[4902]: I1009 14:08:45.312463 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 09 14:08:45 crc kubenswrapper[4902]: I1009 14:08:45.525841 4902 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8b42ec27-52a6-459c-a266-7a6dba2acfab" path="/var/lib/kubelet/pods/8b42ec27-52a6-459c-a266-7a6dba2acfab/volumes" Oct 09 14:08:45 crc kubenswrapper[4902]: I1009 14:08:45.526477 4902 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2cd73e9-ca3d-4916-b5f5-06e4599ebe27" path="/var/lib/kubelet/pods/c2cd73e9-ca3d-4916-b5f5-06e4599ebe27/volumes" Oct 09 14:08:45 crc kubenswrapper[4902]: I1009 14:08:45.619313 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 09 14:08:45 crc kubenswrapper[4902]: W1009 14:08:45.619355 4902 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd9050747_0e34_4263_b886_00511d489a2f.slice/crio-406b5695518556020343fa3a94e2b90eeaf54136fd2d4960dcd16ba18773a05e WatchSource:0}: Error finding container 406b5695518556020343fa3a94e2b90eeaf54136fd2d4960dcd16ba18773a05e: Status 404 returned error can't find the container with id 406b5695518556020343fa3a94e2b90eeaf54136fd2d4960dcd16ba18773a05e Oct 09 14:08:45 crc kubenswrapper[4902]: I1009 14:08:45.641593 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"d9050747-0e34-4263-b886-00511d489a2f","Type":"ContainerStarted","Data":"406b5695518556020343fa3a94e2b90eeaf54136fd2d4960dcd16ba18773a05e"} Oct 09 14:08:45 crc kubenswrapper[4902]: I1009 14:08:45.797457 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 09 14:08:45 crc kubenswrapper[4902]: W1009 14:08:45.805019 4902 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podef02f023_650b_4ffb_8781_55f905a5490d.slice/crio-d4f82d784ad8f3faa4e4c3a2129ac9821db69d3ffc3e7dcb91f2d217e7e48453 WatchSource:0}: Error finding container d4f82d784ad8f3faa4e4c3a2129ac9821db69d3ffc3e7dcb91f2d217e7e48453: Status 404 returned error can't find the container with id d4f82d784ad8f3faa4e4c3a2129ac9821db69d3ffc3e7dcb91f2d217e7e48453 Oct 09 14:08:46 crc kubenswrapper[4902]: I1009 14:08:46.657437 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ef02f023-650b-4ffb-8781-55f905a5490d","Type":"ContainerStarted","Data":"b1bb60ebf3a74e4af5bbe1697823b15d222fbf1323c59e1cd8d9e13de57b9fd9"} Oct 09 14:08:46 crc kubenswrapper[4902]: I1009 14:08:46.660568 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ef02f023-650b-4ffb-8781-55f905a5490d","Type":"ContainerStarted","Data":"4d6710ae2c1e39b844bf2514284a07463284b21209befea079a0a991de5bde07"} Oct 09 14:08:46 crc kubenswrapper[4902]: I1009 14:08:46.660611 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ef02f023-650b-4ffb-8781-55f905a5490d","Type":"ContainerStarted","Data":"d4f82d784ad8f3faa4e4c3a2129ac9821db69d3ffc3e7dcb91f2d217e7e48453"} Oct 09 14:08:46 crc kubenswrapper[4902]: I1009 14:08:46.664946 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"d9050747-0e34-4263-b886-00511d489a2f","Type":"ContainerStarted","Data":"412c4a22357a0ca07053ab5fc3389de18a622ee434f52be6354c10e0beb996fc"} Oct 09 14:08:46 crc kubenswrapper[4902]: I1009 14:08:46.680061 4902 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openstack/nova-api-0" podStartSLOduration=2.680018355 podStartE2EDuration="2.680018355s" podCreationTimestamp="2025-10-09 14:08:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 14:08:46.678570133 +0000 UTC m=+1073.876429237" watchObservedRunningTime="2025-10-09 14:08:46.680018355 +0000 UTC m=+1073.877877419" Oct 09 14:08:46 crc kubenswrapper[4902]: I1009 14:08:46.698673 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.698652933 podStartE2EDuration="2.698652933s" podCreationTimestamp="2025-10-09 14:08:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 14:08:46.698142858 +0000 UTC m=+1073.896001942" watchObservedRunningTime="2025-10-09 14:08:46.698652933 +0000 UTC m=+1073.896512017" Oct 09 14:08:49 crc kubenswrapper[4902]: I1009 14:08:49.974359 4902 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 09 14:08:49 crc kubenswrapper[4902]: I1009 14:08:49.976173 4902 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 09 14:08:50 crc kubenswrapper[4902]: I1009 14:08:50.140141 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 09 14:08:50 crc kubenswrapper[4902]: I1009 14:08:50.537039 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Oct 09 14:08:50 crc kubenswrapper[4902]: I1009 14:08:50.973667 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Oct 09 14:08:50 crc kubenswrapper[4902]: I1009 14:08:50.996152 4902 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="d52f0e92-a10a-4111-98fb-f83da483d1f1" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.196:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 09 14:08:50 crc kubenswrapper[4902]: I1009 14:08:50.996832 4902 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="d52f0e92-a10a-4111-98fb-f83da483d1f1" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.196:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 09 14:08:55 crc kubenswrapper[4902]: I1009 14:08:55.140444 4902 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Oct 09 14:08:55 crc kubenswrapper[4902]: I1009 14:08:55.173372 4902 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Oct 09 14:08:55 crc kubenswrapper[4902]: I1009 14:08:55.313287 4902 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 09 14:08:55 crc kubenswrapper[4902]: I1009 14:08:55.313343 4902 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 09 14:08:55 crc kubenswrapper[4902]: I1009 14:08:55.779762 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Oct 09 14:08:56 crc kubenswrapper[4902]: I1009 14:08:56.396638 4902 prober.go:107] "Probe failed" probeType="Startup" 
pod="openstack/nova-api-0" podUID="ef02f023-650b-4ffb-8781-55f905a5490d" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.199:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 09 14:08:56 crc kubenswrapper[4902]: I1009 14:08:56.396648 4902 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="ef02f023-650b-4ffb-8781-55f905a5490d" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.199:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 09 14:08:59 crc kubenswrapper[4902]: I1009 14:08:59.979047 4902 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 09 14:08:59 crc kubenswrapper[4902]: I1009 14:08:59.986396 4902 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 09 14:08:59 crc kubenswrapper[4902]: I1009 14:08:59.990097 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 09 14:09:00 crc kubenswrapper[4902]: I1009 14:09:00.801164 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 09 14:09:02 crc kubenswrapper[4902]: I1009 14:09:02.810820 4902 generic.go:334] "Generic (PLEG): container finished" podID="e5860efa-fa25-4ab5-92e0-c6bb5a9b1e75" containerID="73223dc891fcb549c3e774588c1807e08b0523ac825c56d90381543ba1bfcdbe" exitCode=137 Oct 09 14:09:02 crc kubenswrapper[4902]: I1009 14:09:02.810919 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"e5860efa-fa25-4ab5-92e0-c6bb5a9b1e75","Type":"ContainerDied","Data":"73223dc891fcb549c3e774588c1807e08b0523ac825c56d90381543ba1bfcdbe"} Oct 09 14:09:02 crc kubenswrapper[4902]: I1009 14:09:02.811366 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"e5860efa-fa25-4ab5-92e0-c6bb5a9b1e75","Type":"ContainerDied","Data":"d37eaeaaf8dfd0b02823e76e80c809208681edf731dbd9b5fdc106eaae22eb8a"} Oct 09 14:09:02 crc kubenswrapper[4902]: I1009 14:09:02.811397 4902 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d37eaeaaf8dfd0b02823e76e80c809208681edf731dbd9b5fdc106eaae22eb8a" Oct 09 14:09:02 crc kubenswrapper[4902]: I1009 14:09:02.842076 4902 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 09 14:09:02 crc kubenswrapper[4902]: I1009 14:09:02.970466 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5860efa-fa25-4ab5-92e0-c6bb5a9b1e75-config-data\") pod \"e5860efa-fa25-4ab5-92e0-c6bb5a9b1e75\" (UID: \"e5860efa-fa25-4ab5-92e0-c6bb5a9b1e75\") " Oct 09 14:09:02 crc kubenswrapper[4902]: I1009 14:09:02.971014 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nc9q7\" (UniqueName: \"kubernetes.io/projected/e5860efa-fa25-4ab5-92e0-c6bb5a9b1e75-kube-api-access-nc9q7\") pod \"e5860efa-fa25-4ab5-92e0-c6bb5a9b1e75\" (UID: \"e5860efa-fa25-4ab5-92e0-c6bb5a9b1e75\") " Oct 09 14:09:02 crc kubenswrapper[4902]: I1009 14:09:02.971189 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5860efa-fa25-4ab5-92e0-c6bb5a9b1e75-combined-ca-bundle\") pod \"e5860efa-fa25-4ab5-92e0-c6bb5a9b1e75\" (UID: \"e5860efa-fa25-4ab5-92e0-c6bb5a9b1e75\") " Oct 09 14:09:02 crc kubenswrapper[4902]: I1009 14:09:02.978061 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5860efa-fa25-4ab5-92e0-c6bb5a9b1e75-kube-api-access-nc9q7" (OuterVolumeSpecName: "kube-api-access-nc9q7") pod "e5860efa-fa25-4ab5-92e0-c6bb5a9b1e75" (UID: "e5860efa-fa25-4ab5-92e0-c6bb5a9b1e75"). InnerVolumeSpecName "kube-api-access-nc9q7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 14:09:03 crc kubenswrapper[4902]: I1009 14:09:03.000239 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5860efa-fa25-4ab5-92e0-c6bb5a9b1e75-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e5860efa-fa25-4ab5-92e0-c6bb5a9b1e75" (UID: "e5860efa-fa25-4ab5-92e0-c6bb5a9b1e75"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:09:03 crc kubenswrapper[4902]: I1009 14:09:03.000370 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5860efa-fa25-4ab5-92e0-c6bb5a9b1e75-config-data" (OuterVolumeSpecName: "config-data") pod "e5860efa-fa25-4ab5-92e0-c6bb5a9b1e75" (UID: "e5860efa-fa25-4ab5-92e0-c6bb5a9b1e75"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:09:03 crc kubenswrapper[4902]: I1009 14:09:03.074459 4902 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5860efa-fa25-4ab5-92e0-c6bb5a9b1e75-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 14:09:03 crc kubenswrapper[4902]: I1009 14:09:03.074521 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nc9q7\" (UniqueName: \"kubernetes.io/projected/e5860efa-fa25-4ab5-92e0-c6bb5a9b1e75-kube-api-access-nc9q7\") on node \"crc\" DevicePath \"\"" Oct 09 14:09:03 crc kubenswrapper[4902]: I1009 14:09:03.074545 4902 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5860efa-fa25-4ab5-92e0-c6bb5a9b1e75-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 14:09:03 crc kubenswrapper[4902]: I1009 14:09:03.819541 4902 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 09 14:09:03 crc kubenswrapper[4902]: I1009 14:09:03.844498 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 09 14:09:03 crc kubenswrapper[4902]: I1009 14:09:03.868868 4902 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 09 14:09:03 crc kubenswrapper[4902]: I1009 14:09:03.880098 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 09 14:09:03 crc kubenswrapper[4902]: E1009 14:09:03.880598 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5860efa-fa25-4ab5-92e0-c6bb5a9b1e75" containerName="nova-cell1-novncproxy-novncproxy" Oct 09 14:09:03 crc kubenswrapper[4902]: I1009 14:09:03.880620 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5860efa-fa25-4ab5-92e0-c6bb5a9b1e75" containerName="nova-cell1-novncproxy-novncproxy" Oct 09 14:09:03 crc kubenswrapper[4902]: I1009 14:09:03.880818 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5860efa-fa25-4ab5-92e0-c6bb5a9b1e75" containerName="nova-cell1-novncproxy-novncproxy" Oct 09 14:09:03 crc kubenswrapper[4902]: I1009 14:09:03.881525 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 09 14:09:03 crc kubenswrapper[4902]: I1009 14:09:03.884608 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Oct 09 14:09:03 crc kubenswrapper[4902]: I1009 14:09:03.884796 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Oct 09 14:09:03 crc kubenswrapper[4902]: I1009 14:09:03.892159 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 09 14:09:03 crc kubenswrapper[4902]: I1009 14:09:03.892426 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Oct 09 14:09:03 crc kubenswrapper[4902]: I1009 14:09:03.991044 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/2bc33878-b734-4520-b5e0-e066f53dbe31-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"2bc33878-b734-4520-b5e0-e066f53dbe31\") " pod="openstack/nova-cell1-novncproxy-0" Oct 09 14:09:03 crc kubenswrapper[4902]: I1009 14:09:03.991154 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2bc33878-b734-4520-b5e0-e066f53dbe31-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"2bc33878-b734-4520-b5e0-e066f53dbe31\") " pod="openstack/nova-cell1-novncproxy-0" Oct 09 14:09:03 crc kubenswrapper[4902]: I1009 14:09:03.991202 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2bc33878-b734-4520-b5e0-e066f53dbe31-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"2bc33878-b734-4520-b5e0-e066f53dbe31\") " pod="openstack/nova-cell1-novncproxy-0" Oct 09 14:09:03 crc kubenswrapper[4902]: I1009 14:09:03.991261 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/2bc33878-b734-4520-b5e0-e066f53dbe31-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"2bc33878-b734-4520-b5e0-e066f53dbe31\") " pod="openstack/nova-cell1-novncproxy-0" Oct 09 14:09:03 crc kubenswrapper[4902]: I1009 14:09:03.991301 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7mntv\" (UniqueName: \"kubernetes.io/projected/2bc33878-b734-4520-b5e0-e066f53dbe31-kube-api-access-7mntv\") pod \"nova-cell1-novncproxy-0\" (UID: \"2bc33878-b734-4520-b5e0-e066f53dbe31\") " pod="openstack/nova-cell1-novncproxy-0" Oct 09 14:09:04 crc kubenswrapper[4902]: I1009 14:09:04.092453 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/2bc33878-b734-4520-b5e0-e066f53dbe31-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"2bc33878-b734-4520-b5e0-e066f53dbe31\") " pod="openstack/nova-cell1-novncproxy-0" Oct 09 14:09:04 crc kubenswrapper[4902]: I1009 14:09:04.092548 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2bc33878-b734-4520-b5e0-e066f53dbe31-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"2bc33878-b734-4520-b5e0-e066f53dbe31\") " pod="openstack/nova-cell1-novncproxy-0" Oct 09 14:09:04 crc kubenswrapper[4902]: I1009 14:09:04.092589 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2bc33878-b734-4520-b5e0-e066f53dbe31-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"2bc33878-b734-4520-b5e0-e066f53dbe31\") " pod="openstack/nova-cell1-novncproxy-0" Oct 09 14:09:04 crc kubenswrapper[4902]: I1009 14:09:04.092647 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/2bc33878-b734-4520-b5e0-e066f53dbe31-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"2bc33878-b734-4520-b5e0-e066f53dbe31\") " pod="openstack/nova-cell1-novncproxy-0" Oct 09 14:09:04 crc kubenswrapper[4902]: I1009 14:09:04.092685 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7mntv\" (UniqueName: \"kubernetes.io/projected/2bc33878-b734-4520-b5e0-e066f53dbe31-kube-api-access-7mntv\") pod \"nova-cell1-novncproxy-0\" (UID: \"2bc33878-b734-4520-b5e0-e066f53dbe31\") " pod="openstack/nova-cell1-novncproxy-0" Oct 09 14:09:04 crc kubenswrapper[4902]: I1009 14:09:04.098568 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2bc33878-b734-4520-b5e0-e066f53dbe31-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"2bc33878-b734-4520-b5e0-e066f53dbe31\") " pod="openstack/nova-cell1-novncproxy-0" Oct 09 14:09:04 crc kubenswrapper[4902]: I1009 14:09:04.098829 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/2bc33878-b734-4520-b5e0-e066f53dbe31-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"2bc33878-b734-4520-b5e0-e066f53dbe31\") " pod="openstack/nova-cell1-novncproxy-0" Oct 09 14:09:04 crc kubenswrapper[4902]: I1009 14:09:04.099495 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2bc33878-b734-4520-b5e0-e066f53dbe31-config-data\") pod 
\"nova-cell1-novncproxy-0\" (UID: \"2bc33878-b734-4520-b5e0-e066f53dbe31\") " pod="openstack/nova-cell1-novncproxy-0" Oct 09 14:09:04 crc kubenswrapper[4902]: I1009 14:09:04.107334 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/2bc33878-b734-4520-b5e0-e066f53dbe31-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"2bc33878-b734-4520-b5e0-e066f53dbe31\") " pod="openstack/nova-cell1-novncproxy-0" Oct 09 14:09:04 crc kubenswrapper[4902]: I1009 14:09:04.110861 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7mntv\" (UniqueName: \"kubernetes.io/projected/2bc33878-b734-4520-b5e0-e066f53dbe31-kube-api-access-7mntv\") pod \"nova-cell1-novncproxy-0\" (UID: \"2bc33878-b734-4520-b5e0-e066f53dbe31\") " pod="openstack/nova-cell1-novncproxy-0" Oct 09 14:09:04 crc kubenswrapper[4902]: I1009 14:09:04.204649 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 09 14:09:04 crc kubenswrapper[4902]: I1009 14:09:04.614750 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 09 14:09:04 crc kubenswrapper[4902]: W1009 14:09:04.623306 4902 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2bc33878_b734_4520_b5e0_e066f53dbe31.slice/crio-1187fbd4d6eebb635f257e555c6c659863c3159cd04df6f71e46d32035a31c05 WatchSource:0}: Error finding container 1187fbd4d6eebb635f257e555c6c659863c3159cd04df6f71e46d32035a31c05: Status 404 returned error can't find the container with id 1187fbd4d6eebb635f257e555c6c659863c3159cd04df6f71e46d32035a31c05 Oct 09 14:09:04 crc kubenswrapper[4902]: I1009 14:09:04.828280 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"2bc33878-b734-4520-b5e0-e066f53dbe31","Type":"ContainerStarted","Data":"1d6020d208b8a8cbbe38fea89d565d64bf288f0f1398bbd67e96d82a2b60ef40"} Oct 09 14:09:04 crc kubenswrapper[4902]: I1009 14:09:04.828616 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"2bc33878-b734-4520-b5e0-e066f53dbe31","Type":"ContainerStarted","Data":"1187fbd4d6eebb635f257e555c6c659863c3159cd04df6f71e46d32035a31c05"} Oct 09 14:09:04 crc kubenswrapper[4902]: I1009 14:09:04.847592 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=1.84754978 podStartE2EDuration="1.84754978s" podCreationTimestamp="2025-10-09 14:09:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 14:09:04.84412141 +0000 UTC m=+1092.041980484" watchObservedRunningTime="2025-10-09 14:09:04.84754978 +0000 UTC m=+1092.045408884" Oct 09 14:09:05 crc kubenswrapper[4902]: I1009 14:09:05.317669 4902 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 09 14:09:05 crc kubenswrapper[4902]: I1009 14:09:05.319717 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 09 14:09:05 crc kubenswrapper[4902]: I1009 14:09:05.320156 4902 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 09 14:09:05 crc kubenswrapper[4902]: I1009 14:09:05.334682 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openstack/nova-api-0" Oct 09 14:09:05 crc kubenswrapper[4902]: I1009 14:09:05.526846 4902 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e5860efa-fa25-4ab5-92e0-c6bb5a9b1e75" path="/var/lib/kubelet/pods/e5860efa-fa25-4ab5-92e0-c6bb5a9b1e75/volumes" Oct 09 14:09:05 crc kubenswrapper[4902]: I1009 14:09:05.837902 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 09 14:09:05 crc kubenswrapper[4902]: I1009 14:09:05.841355 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 09 14:09:06 crc kubenswrapper[4902]: I1009 14:09:06.009018 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-59cf4bdb65-985vj"] Oct 09 14:09:06 crc kubenswrapper[4902]: I1009 14:09:06.018165 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-59cf4bdb65-985vj" Oct 09 14:09:06 crc kubenswrapper[4902]: I1009 14:09:06.029063 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-59cf4bdb65-985vj"] Oct 09 14:09:06 crc kubenswrapper[4902]: I1009 14:09:06.140648 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7e21b8a0-0cbd-4672-9aab-e15ed5b44309-dns-swift-storage-0\") pod \"dnsmasq-dns-59cf4bdb65-985vj\" (UID: \"7e21b8a0-0cbd-4672-9aab-e15ed5b44309\") " pod="openstack/dnsmasq-dns-59cf4bdb65-985vj" Oct 09 14:09:06 crc kubenswrapper[4902]: I1009 14:09:06.140705 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7e21b8a0-0cbd-4672-9aab-e15ed5b44309-ovsdbserver-sb\") pod \"dnsmasq-dns-59cf4bdb65-985vj\" (UID: \"7e21b8a0-0cbd-4672-9aab-e15ed5b44309\") " pod="openstack/dnsmasq-dns-59cf4bdb65-985vj" Oct 09 14:09:06 crc kubenswrapper[4902]: I1009 14:09:06.140739 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e21b8a0-0cbd-4672-9aab-e15ed5b44309-config\") pod \"dnsmasq-dns-59cf4bdb65-985vj\" (UID: \"7e21b8a0-0cbd-4672-9aab-e15ed5b44309\") " pod="openstack/dnsmasq-dns-59cf4bdb65-985vj" Oct 09 14:09:06 crc kubenswrapper[4902]: I1009 14:09:06.140884 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7e21b8a0-0cbd-4672-9aab-e15ed5b44309-ovsdbserver-nb\") pod \"dnsmasq-dns-59cf4bdb65-985vj\" (UID: \"7e21b8a0-0cbd-4672-9aab-e15ed5b44309\") " pod="openstack/dnsmasq-dns-59cf4bdb65-985vj" Oct 09 14:09:06 crc kubenswrapper[4902]: I1009 14:09:06.141085 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d92ls\" (UniqueName: \"kubernetes.io/projected/7e21b8a0-0cbd-4672-9aab-e15ed5b44309-kube-api-access-d92ls\") pod \"dnsmasq-dns-59cf4bdb65-985vj\" (UID: \"7e21b8a0-0cbd-4672-9aab-e15ed5b44309\") " pod="openstack/dnsmasq-dns-59cf4bdb65-985vj" Oct 09 14:09:06 crc kubenswrapper[4902]: I1009 14:09:06.141147 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7e21b8a0-0cbd-4672-9aab-e15ed5b44309-dns-svc\") pod \"dnsmasq-dns-59cf4bdb65-985vj\" (UID: \"7e21b8a0-0cbd-4672-9aab-e15ed5b44309\") " 
pod="openstack/dnsmasq-dns-59cf4bdb65-985vj" Oct 09 14:09:06 crc kubenswrapper[4902]: I1009 14:09:06.242920 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7e21b8a0-0cbd-4672-9aab-e15ed5b44309-dns-svc\") pod \"dnsmasq-dns-59cf4bdb65-985vj\" (UID: \"7e21b8a0-0cbd-4672-9aab-e15ed5b44309\") " pod="openstack/dnsmasq-dns-59cf4bdb65-985vj" Oct 09 14:09:06 crc kubenswrapper[4902]: I1009 14:09:06.243323 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7e21b8a0-0cbd-4672-9aab-e15ed5b44309-dns-swift-storage-0\") pod \"dnsmasq-dns-59cf4bdb65-985vj\" (UID: \"7e21b8a0-0cbd-4672-9aab-e15ed5b44309\") " pod="openstack/dnsmasq-dns-59cf4bdb65-985vj" Oct 09 14:09:06 crc kubenswrapper[4902]: I1009 14:09:06.243375 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7e21b8a0-0cbd-4672-9aab-e15ed5b44309-ovsdbserver-sb\") pod \"dnsmasq-dns-59cf4bdb65-985vj\" (UID: \"7e21b8a0-0cbd-4672-9aab-e15ed5b44309\") " pod="openstack/dnsmasq-dns-59cf4bdb65-985vj" Oct 09 14:09:06 crc kubenswrapper[4902]: I1009 14:09:06.243429 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e21b8a0-0cbd-4672-9aab-e15ed5b44309-config\") pod \"dnsmasq-dns-59cf4bdb65-985vj\" (UID: \"7e21b8a0-0cbd-4672-9aab-e15ed5b44309\") " pod="openstack/dnsmasq-dns-59cf4bdb65-985vj" Oct 09 14:09:06 crc kubenswrapper[4902]: I1009 14:09:06.243505 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7e21b8a0-0cbd-4672-9aab-e15ed5b44309-ovsdbserver-nb\") pod \"dnsmasq-dns-59cf4bdb65-985vj\" (UID: \"7e21b8a0-0cbd-4672-9aab-e15ed5b44309\") " pod="openstack/dnsmasq-dns-59cf4bdb65-985vj" Oct 09 14:09:06 crc kubenswrapper[4902]: I1009 14:09:06.243628 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d92ls\" (UniqueName: \"kubernetes.io/projected/7e21b8a0-0cbd-4672-9aab-e15ed5b44309-kube-api-access-d92ls\") pod \"dnsmasq-dns-59cf4bdb65-985vj\" (UID: \"7e21b8a0-0cbd-4672-9aab-e15ed5b44309\") " pod="openstack/dnsmasq-dns-59cf4bdb65-985vj" Oct 09 14:09:06 crc kubenswrapper[4902]: I1009 14:09:06.244029 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7e21b8a0-0cbd-4672-9aab-e15ed5b44309-dns-svc\") pod \"dnsmasq-dns-59cf4bdb65-985vj\" (UID: \"7e21b8a0-0cbd-4672-9aab-e15ed5b44309\") " pod="openstack/dnsmasq-dns-59cf4bdb65-985vj" Oct 09 14:09:06 crc kubenswrapper[4902]: I1009 14:09:06.244721 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7e21b8a0-0cbd-4672-9aab-e15ed5b44309-dns-swift-storage-0\") pod \"dnsmasq-dns-59cf4bdb65-985vj\" (UID: \"7e21b8a0-0cbd-4672-9aab-e15ed5b44309\") " pod="openstack/dnsmasq-dns-59cf4bdb65-985vj" Oct 09 14:09:06 crc kubenswrapper[4902]: I1009 14:09:06.244790 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e21b8a0-0cbd-4672-9aab-e15ed5b44309-config\") pod \"dnsmasq-dns-59cf4bdb65-985vj\" (UID: \"7e21b8a0-0cbd-4672-9aab-e15ed5b44309\") " pod="openstack/dnsmasq-dns-59cf4bdb65-985vj" Oct 09 14:09:06 crc kubenswrapper[4902]: I1009 
14:09:06.245389 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7e21b8a0-0cbd-4672-9aab-e15ed5b44309-ovsdbserver-nb\") pod \"dnsmasq-dns-59cf4bdb65-985vj\" (UID: \"7e21b8a0-0cbd-4672-9aab-e15ed5b44309\") " pod="openstack/dnsmasq-dns-59cf4bdb65-985vj" Oct 09 14:09:06 crc kubenswrapper[4902]: I1009 14:09:06.245499 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7e21b8a0-0cbd-4672-9aab-e15ed5b44309-ovsdbserver-sb\") pod \"dnsmasq-dns-59cf4bdb65-985vj\" (UID: \"7e21b8a0-0cbd-4672-9aab-e15ed5b44309\") " pod="openstack/dnsmasq-dns-59cf4bdb65-985vj" Oct 09 14:09:06 crc kubenswrapper[4902]: I1009 14:09:06.263016 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d92ls\" (UniqueName: \"kubernetes.io/projected/7e21b8a0-0cbd-4672-9aab-e15ed5b44309-kube-api-access-d92ls\") pod \"dnsmasq-dns-59cf4bdb65-985vj\" (UID: \"7e21b8a0-0cbd-4672-9aab-e15ed5b44309\") " pod="openstack/dnsmasq-dns-59cf4bdb65-985vj" Oct 09 14:09:06 crc kubenswrapper[4902]: I1009 14:09:06.366463 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-59cf4bdb65-985vj" Oct 09 14:09:06 crc kubenswrapper[4902]: I1009 14:09:06.840099 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-59cf4bdb65-985vj"] Oct 09 14:09:07 crc kubenswrapper[4902]: I1009 14:09:07.855071 4902 generic.go:334] "Generic (PLEG): container finished" podID="7e21b8a0-0cbd-4672-9aab-e15ed5b44309" containerID="e7a51b489f1b340ee23bcbea1c11898692d213374d0bae5bd4e8a2f41172ef76" exitCode=0 Oct 09 14:09:07 crc kubenswrapper[4902]: I1009 14:09:07.855167 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59cf4bdb65-985vj" event={"ID":"7e21b8a0-0cbd-4672-9aab-e15ed5b44309","Type":"ContainerDied","Data":"e7a51b489f1b340ee23bcbea1c11898692d213374d0bae5bd4e8a2f41172ef76"} Oct 09 14:09:07 crc kubenswrapper[4902]: I1009 14:09:07.855589 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59cf4bdb65-985vj" event={"ID":"7e21b8a0-0cbd-4672-9aab-e15ed5b44309","Type":"ContainerStarted","Data":"2d931e4e9e2a3cd7d154857e10ab610015dc3ca7065c43b559d6526cc0118af8"} Oct 09 14:09:07 crc kubenswrapper[4902]: I1009 14:09:07.956856 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 09 14:09:07 crc kubenswrapper[4902]: I1009 14:09:07.957346 4902 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="01e06434-8159-49be-bef4-3b0646427c20" containerName="ceilometer-central-agent" containerID="cri-o://c146e4602fb8421300703a64d218dbcbfdfbb8f0e23e6af90d555570933911b4" gracePeriod=30 Oct 09 14:09:07 crc kubenswrapper[4902]: I1009 14:09:07.957769 4902 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="01e06434-8159-49be-bef4-3b0646427c20" containerName="proxy-httpd" containerID="cri-o://f70d91d8df2d3dc2c301a9d70852011dde73d796af26c2a9605b2344192cfe83" gracePeriod=30 Oct 09 14:09:07 crc kubenswrapper[4902]: I1009 14:09:07.957827 4902 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="01e06434-8159-49be-bef4-3b0646427c20" containerName="sg-core" containerID="cri-o://b055b9f8d94f743e063474f0284c434bdbd97b2f18260dab590a849198b2f5a0" gracePeriod=30 Oct 09 14:09:07 crc 
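The "Killing container with a grace period" entries above (gracePeriod=30) and the exitCode values recorded through this section fit the usual runtime convention: 0 is a clean exit (the dnsmasq init container just above), a small positive value is the process's own exit status (sg-core exits 2 below), and killed containers are reported as 128 plus the signal number, which is where the 137 (SIGKILL) seen earlier for the novncproxy container and the 143 (SIGTERM) seen below for nova-api-log come from. A minimal stand-alone sketch of that arithmetic, not CRI-O or kubelet code:

```go
// exitcode.go - minimal sketch, not kubelet or CRI-O code. Shows why the
// journal reports exitCode=137 and exitCode=143 for killed containers:
// runtimes commonly follow the shell convention of 128 + signal number.
package main

import (
	"fmt"
	"syscall"
)

// killedExitCode returns the exit code a runtime typically reports for a
// container that terminated because of the given signal.
func killedExitCode(sig syscall.Signal) int {
	return 128 + int(sig)
}

func main() {
	fmt.Println("SIGKILL:", killedExitCode(syscall.SIGKILL)) // 137, grace period expired
	fmt.Println("SIGTERM:", killedExitCode(syscall.SIGTERM)) // 143, stopped on the polite signal
	fmt.Println("clean exit:", 0)                            // e.g. the dnsmasq init container above
}
```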
kubenswrapper[4902]: I1009 14:09:07.957873 4902 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="01e06434-8159-49be-bef4-3b0646427c20" containerName="ceilometer-notification-agent" containerID="cri-o://1a347593ce5266cd36f313cdf1caf92dd776ef3cced53130eac9e662504e9855" gracePeriod=30 Oct 09 14:09:08 crc kubenswrapper[4902]: I1009 14:09:08.400357 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 09 14:09:08 crc kubenswrapper[4902]: I1009 14:09:08.866392 4902 generic.go:334] "Generic (PLEG): container finished" podID="01e06434-8159-49be-bef4-3b0646427c20" containerID="f70d91d8df2d3dc2c301a9d70852011dde73d796af26c2a9605b2344192cfe83" exitCode=0 Oct 09 14:09:08 crc kubenswrapper[4902]: I1009 14:09:08.866440 4902 generic.go:334] "Generic (PLEG): container finished" podID="01e06434-8159-49be-bef4-3b0646427c20" containerID="b055b9f8d94f743e063474f0284c434bdbd97b2f18260dab590a849198b2f5a0" exitCode=2 Oct 09 14:09:08 crc kubenswrapper[4902]: I1009 14:09:08.866448 4902 generic.go:334] "Generic (PLEG): container finished" podID="01e06434-8159-49be-bef4-3b0646427c20" containerID="c146e4602fb8421300703a64d218dbcbfdfbb8f0e23e6af90d555570933911b4" exitCode=0 Oct 09 14:09:08 crc kubenswrapper[4902]: I1009 14:09:08.866487 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"01e06434-8159-49be-bef4-3b0646427c20","Type":"ContainerDied","Data":"f70d91d8df2d3dc2c301a9d70852011dde73d796af26c2a9605b2344192cfe83"} Oct 09 14:09:08 crc kubenswrapper[4902]: I1009 14:09:08.866514 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"01e06434-8159-49be-bef4-3b0646427c20","Type":"ContainerDied","Data":"b055b9f8d94f743e063474f0284c434bdbd97b2f18260dab590a849198b2f5a0"} Oct 09 14:09:08 crc kubenswrapper[4902]: I1009 14:09:08.866523 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"01e06434-8159-49be-bef4-3b0646427c20","Type":"ContainerDied","Data":"c146e4602fb8421300703a64d218dbcbfdfbb8f0e23e6af90d555570933911b4"} Oct 09 14:09:08 crc kubenswrapper[4902]: I1009 14:09:08.868621 4902 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="ef02f023-650b-4ffb-8781-55f905a5490d" containerName="nova-api-log" containerID="cri-o://4d6710ae2c1e39b844bf2514284a07463284b21209befea079a0a991de5bde07" gracePeriod=30 Oct 09 14:09:08 crc kubenswrapper[4902]: I1009 14:09:08.868830 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59cf4bdb65-985vj" event={"ID":"7e21b8a0-0cbd-4672-9aab-e15ed5b44309","Type":"ContainerStarted","Data":"3881f757369f3cc3e6b3099a1a7ca2d9455656b0bd5c249e88a2980eed4e7f46"} Oct 09 14:09:08 crc kubenswrapper[4902]: I1009 14:09:08.868921 4902 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="ef02f023-650b-4ffb-8781-55f905a5490d" containerName="nova-api-api" containerID="cri-o://b1bb60ebf3a74e4af5bbe1697823b15d222fbf1323c59e1cd8d9e13de57b9fd9" gracePeriod=30 Oct 09 14:09:08 crc kubenswrapper[4902]: I1009 14:09:08.869000 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-59cf4bdb65-985vj" Oct 09 14:09:08 crc kubenswrapper[4902]: I1009 14:09:08.902757 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-59cf4bdb65-985vj" podStartSLOduration=3.902737728 
podStartE2EDuration="3.902737728s" podCreationTimestamp="2025-10-09 14:09:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 14:09:08.901147782 +0000 UTC m=+1096.099006856" watchObservedRunningTime="2025-10-09 14:09:08.902737728 +0000 UTC m=+1096.100596792" Oct 09 14:09:09 crc kubenswrapper[4902]: I1009 14:09:09.204951 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Oct 09 14:09:09 crc kubenswrapper[4902]: I1009 14:09:09.878269 4902 generic.go:334] "Generic (PLEG): container finished" podID="ef02f023-650b-4ffb-8781-55f905a5490d" containerID="4d6710ae2c1e39b844bf2514284a07463284b21209befea079a0a991de5bde07" exitCode=143 Oct 09 14:09:09 crc kubenswrapper[4902]: I1009 14:09:09.878350 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ef02f023-650b-4ffb-8781-55f905a5490d","Type":"ContainerDied","Data":"4d6710ae2c1e39b844bf2514284a07463284b21209befea079a0a991de5bde07"} Oct 09 14:09:12 crc kubenswrapper[4902]: I1009 14:09:12.485696 4902 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 09 14:09:12 crc kubenswrapper[4902]: I1009 14:09:12.568873 4902 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 09 14:09:12 crc kubenswrapper[4902]: I1009 14:09:12.570270 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef02f023-650b-4ffb-8781-55f905a5490d-combined-ca-bundle\") pod \"ef02f023-650b-4ffb-8781-55f905a5490d\" (UID: \"ef02f023-650b-4ffb-8781-55f905a5490d\") " Oct 09 14:09:12 crc kubenswrapper[4902]: I1009 14:09:12.570401 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ef02f023-650b-4ffb-8781-55f905a5490d-logs\") pod \"ef02f023-650b-4ffb-8781-55f905a5490d\" (UID: \"ef02f023-650b-4ffb-8781-55f905a5490d\") " Oct 09 14:09:12 crc kubenswrapper[4902]: I1009 14:09:12.570585 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef02f023-650b-4ffb-8781-55f905a5490d-config-data\") pod \"ef02f023-650b-4ffb-8781-55f905a5490d\" (UID: \"ef02f023-650b-4ffb-8781-55f905a5490d\") " Oct 09 14:09:12 crc kubenswrapper[4902]: I1009 14:09:12.570675 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-phq47\" (UniqueName: \"kubernetes.io/projected/ef02f023-650b-4ffb-8781-55f905a5490d-kube-api-access-phq47\") pod \"ef02f023-650b-4ffb-8781-55f905a5490d\" (UID: \"ef02f023-650b-4ffb-8781-55f905a5490d\") " Oct 09 14:09:12 crc kubenswrapper[4902]: I1009 14:09:12.575661 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ef02f023-650b-4ffb-8781-55f905a5490d-logs" (OuterVolumeSpecName: "logs") pod "ef02f023-650b-4ffb-8781-55f905a5490d" (UID: "ef02f023-650b-4ffb-8781-55f905a5490d"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 14:09:12 crc kubenswrapper[4902]: I1009 14:09:12.580804 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef02f023-650b-4ffb-8781-55f905a5490d-kube-api-access-phq47" (OuterVolumeSpecName: "kube-api-access-phq47") pod "ef02f023-650b-4ffb-8781-55f905a5490d" (UID: "ef02f023-650b-4ffb-8781-55f905a5490d"). InnerVolumeSpecName "kube-api-access-phq47". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 14:09:12 crc kubenswrapper[4902]: I1009 14:09:12.604652 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef02f023-650b-4ffb-8781-55f905a5490d-config-data" (OuterVolumeSpecName: "config-data") pod "ef02f023-650b-4ffb-8781-55f905a5490d" (UID: "ef02f023-650b-4ffb-8781-55f905a5490d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:09:12 crc kubenswrapper[4902]: I1009 14:09:12.615534 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef02f023-650b-4ffb-8781-55f905a5490d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ef02f023-650b-4ffb-8781-55f905a5490d" (UID: "ef02f023-650b-4ffb-8781-55f905a5490d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:09:12 crc kubenswrapper[4902]: I1009 14:09:12.672133 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01e06434-8159-49be-bef4-3b0646427c20-config-data\") pod \"01e06434-8159-49be-bef4-3b0646427c20\" (UID: \"01e06434-8159-49be-bef4-3b0646427c20\") " Oct 09 14:09:12 crc kubenswrapper[4902]: I1009 14:09:12.672236 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d94kl\" (UniqueName: \"kubernetes.io/projected/01e06434-8159-49be-bef4-3b0646427c20-kube-api-access-d94kl\") pod \"01e06434-8159-49be-bef4-3b0646427c20\" (UID: \"01e06434-8159-49be-bef4-3b0646427c20\") " Oct 09 14:09:12 crc kubenswrapper[4902]: I1009 14:09:12.672262 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/01e06434-8159-49be-bef4-3b0646427c20-log-httpd\") pod \"01e06434-8159-49be-bef4-3b0646427c20\" (UID: \"01e06434-8159-49be-bef4-3b0646427c20\") " Oct 09 14:09:12 crc kubenswrapper[4902]: I1009 14:09:12.672280 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01e06434-8159-49be-bef4-3b0646427c20-combined-ca-bundle\") pod \"01e06434-8159-49be-bef4-3b0646427c20\" (UID: \"01e06434-8159-49be-bef4-3b0646427c20\") " Oct 09 14:09:12 crc kubenswrapper[4902]: I1009 14:09:12.672301 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/01e06434-8159-49be-bef4-3b0646427c20-run-httpd\") pod \"01e06434-8159-49be-bef4-3b0646427c20\" (UID: \"01e06434-8159-49be-bef4-3b0646427c20\") " Oct 09 14:09:12 crc kubenswrapper[4902]: I1009 14:09:12.672367 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/01e06434-8159-49be-bef4-3b0646427c20-sg-core-conf-yaml\") pod \"01e06434-8159-49be-bef4-3b0646427c20\" (UID: \"01e06434-8159-49be-bef4-3b0646427c20\") " Oct 09 14:09:12 crc kubenswrapper[4902]: 
I1009 14:09:12.672458 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/01e06434-8159-49be-bef4-3b0646427c20-scripts\") pod \"01e06434-8159-49be-bef4-3b0646427c20\" (UID: \"01e06434-8159-49be-bef4-3b0646427c20\") " Oct 09 14:09:12 crc kubenswrapper[4902]: I1009 14:09:12.672500 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/01e06434-8159-49be-bef4-3b0646427c20-ceilometer-tls-certs\") pod \"01e06434-8159-49be-bef4-3b0646427c20\" (UID: \"01e06434-8159-49be-bef4-3b0646427c20\") " Oct 09 14:09:12 crc kubenswrapper[4902]: I1009 14:09:12.672884 4902 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef02f023-650b-4ffb-8781-55f905a5490d-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 14:09:12 crc kubenswrapper[4902]: I1009 14:09:12.672897 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-phq47\" (UniqueName: \"kubernetes.io/projected/ef02f023-650b-4ffb-8781-55f905a5490d-kube-api-access-phq47\") on node \"crc\" DevicePath \"\"" Oct 09 14:09:12 crc kubenswrapper[4902]: I1009 14:09:12.672911 4902 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef02f023-650b-4ffb-8781-55f905a5490d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 14:09:12 crc kubenswrapper[4902]: I1009 14:09:12.672922 4902 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ef02f023-650b-4ffb-8781-55f905a5490d-logs\") on node \"crc\" DevicePath \"\"" Oct 09 14:09:12 crc kubenswrapper[4902]: I1009 14:09:12.673362 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/01e06434-8159-49be-bef4-3b0646427c20-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "01e06434-8159-49be-bef4-3b0646427c20" (UID: "01e06434-8159-49be-bef4-3b0646427c20"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 14:09:12 crc kubenswrapper[4902]: I1009 14:09:12.673861 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/01e06434-8159-49be-bef4-3b0646427c20-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "01e06434-8159-49be-bef4-3b0646427c20" (UID: "01e06434-8159-49be-bef4-3b0646427c20"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 14:09:12 crc kubenswrapper[4902]: I1009 14:09:12.677826 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01e06434-8159-49be-bef4-3b0646427c20-kube-api-access-d94kl" (OuterVolumeSpecName: "kube-api-access-d94kl") pod "01e06434-8159-49be-bef4-3b0646427c20" (UID: "01e06434-8159-49be-bef4-3b0646427c20"). InnerVolumeSpecName "kube-api-access-d94kl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 14:09:12 crc kubenswrapper[4902]: I1009 14:09:12.679422 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01e06434-8159-49be-bef4-3b0646427c20-scripts" (OuterVolumeSpecName: "scripts") pod "01e06434-8159-49be-bef4-3b0646427c20" (UID: "01e06434-8159-49be-bef4-3b0646427c20"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:09:12 crc kubenswrapper[4902]: I1009 14:09:12.714775 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01e06434-8159-49be-bef4-3b0646427c20-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "01e06434-8159-49be-bef4-3b0646427c20" (UID: "01e06434-8159-49be-bef4-3b0646427c20"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:09:12 crc kubenswrapper[4902]: I1009 14:09:12.738771 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01e06434-8159-49be-bef4-3b0646427c20-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "01e06434-8159-49be-bef4-3b0646427c20" (UID: "01e06434-8159-49be-bef4-3b0646427c20"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:09:12 crc kubenswrapper[4902]: I1009 14:09:12.751631 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01e06434-8159-49be-bef4-3b0646427c20-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "01e06434-8159-49be-bef4-3b0646427c20" (UID: "01e06434-8159-49be-bef4-3b0646427c20"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:09:12 crc kubenswrapper[4902]: I1009 14:09:12.774319 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d94kl\" (UniqueName: \"kubernetes.io/projected/01e06434-8159-49be-bef4-3b0646427c20-kube-api-access-d94kl\") on node \"crc\" DevicePath \"\"" Oct 09 14:09:12 crc kubenswrapper[4902]: I1009 14:09:12.774354 4902 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/01e06434-8159-49be-bef4-3b0646427c20-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 09 14:09:12 crc kubenswrapper[4902]: I1009 14:09:12.774365 4902 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01e06434-8159-49be-bef4-3b0646427c20-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 14:09:12 crc kubenswrapper[4902]: I1009 14:09:12.774373 4902 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/01e06434-8159-49be-bef4-3b0646427c20-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 09 14:09:12 crc kubenswrapper[4902]: I1009 14:09:12.774384 4902 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/01e06434-8159-49be-bef4-3b0646427c20-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 09 14:09:12 crc kubenswrapper[4902]: I1009 14:09:12.774418 4902 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/01e06434-8159-49be-bef4-3b0646427c20-scripts\") on node \"crc\" DevicePath \"\"" Oct 09 14:09:12 crc kubenswrapper[4902]: I1009 14:09:12.774426 4902 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/01e06434-8159-49be-bef4-3b0646427c20-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 09 14:09:12 crc kubenswrapper[4902]: I1009 14:09:12.775873 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01e06434-8159-49be-bef4-3b0646427c20-config-data" (OuterVolumeSpecName: "config-data") pod 
"01e06434-8159-49be-bef4-3b0646427c20" (UID: "01e06434-8159-49be-bef4-3b0646427c20"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:09:12 crc kubenswrapper[4902]: I1009 14:09:12.876398 4902 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01e06434-8159-49be-bef4-3b0646427c20-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 14:09:12 crc kubenswrapper[4902]: I1009 14:09:12.907327 4902 generic.go:334] "Generic (PLEG): container finished" podID="ef02f023-650b-4ffb-8781-55f905a5490d" containerID="b1bb60ebf3a74e4af5bbe1697823b15d222fbf1323c59e1cd8d9e13de57b9fd9" exitCode=0 Oct 09 14:09:12 crc kubenswrapper[4902]: I1009 14:09:12.907432 4902 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 09 14:09:12 crc kubenswrapper[4902]: I1009 14:09:12.907454 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ef02f023-650b-4ffb-8781-55f905a5490d","Type":"ContainerDied","Data":"b1bb60ebf3a74e4af5bbe1697823b15d222fbf1323c59e1cd8d9e13de57b9fd9"} Oct 09 14:09:12 crc kubenswrapper[4902]: I1009 14:09:12.907487 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ef02f023-650b-4ffb-8781-55f905a5490d","Type":"ContainerDied","Data":"d4f82d784ad8f3faa4e4c3a2129ac9821db69d3ffc3e7dcb91f2d217e7e48453"} Oct 09 14:09:12 crc kubenswrapper[4902]: I1009 14:09:12.907510 4902 scope.go:117] "RemoveContainer" containerID="b1bb60ebf3a74e4af5bbe1697823b15d222fbf1323c59e1cd8d9e13de57b9fd9" Oct 09 14:09:12 crc kubenswrapper[4902]: I1009 14:09:12.913621 4902 generic.go:334] "Generic (PLEG): container finished" podID="01e06434-8159-49be-bef4-3b0646427c20" containerID="1a347593ce5266cd36f313cdf1caf92dd776ef3cced53130eac9e662504e9855" exitCode=0 Oct 09 14:09:12 crc kubenswrapper[4902]: I1009 14:09:12.913699 4902 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 09 14:09:12 crc kubenswrapper[4902]: I1009 14:09:12.913704 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"01e06434-8159-49be-bef4-3b0646427c20","Type":"ContainerDied","Data":"1a347593ce5266cd36f313cdf1caf92dd776ef3cced53130eac9e662504e9855"} Oct 09 14:09:12 crc kubenswrapper[4902]: I1009 14:09:12.913759 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"01e06434-8159-49be-bef4-3b0646427c20","Type":"ContainerDied","Data":"3f27778818b73ebfd7cc3d19b81a9098120a035308bb1bcdfef4b9fbb41bfce7"} Oct 09 14:09:12 crc kubenswrapper[4902]: I1009 14:09:12.939339 4902 scope.go:117] "RemoveContainer" containerID="4d6710ae2c1e39b844bf2514284a07463284b21209befea079a0a991de5bde07" Oct 09 14:09:12 crc kubenswrapper[4902]: I1009 14:09:12.942072 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 09 14:09:12 crc kubenswrapper[4902]: I1009 14:09:12.953536 4902 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 09 14:09:12 crc kubenswrapper[4902]: I1009 14:09:12.961611 4902 scope.go:117] "RemoveContainer" containerID="b1bb60ebf3a74e4af5bbe1697823b15d222fbf1323c59e1cd8d9e13de57b9fd9" Oct 09 14:09:12 crc kubenswrapper[4902]: E1009 14:09:12.962860 4902 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b1bb60ebf3a74e4af5bbe1697823b15d222fbf1323c59e1cd8d9e13de57b9fd9\": container with ID starting with b1bb60ebf3a74e4af5bbe1697823b15d222fbf1323c59e1cd8d9e13de57b9fd9 not found: ID does not exist" containerID="b1bb60ebf3a74e4af5bbe1697823b15d222fbf1323c59e1cd8d9e13de57b9fd9" Oct 09 14:09:12 crc kubenswrapper[4902]: I1009 14:09:12.962912 4902 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b1bb60ebf3a74e4af5bbe1697823b15d222fbf1323c59e1cd8d9e13de57b9fd9"} err="failed to get container status \"b1bb60ebf3a74e4af5bbe1697823b15d222fbf1323c59e1cd8d9e13de57b9fd9\": rpc error: code = NotFound desc = could not find container \"b1bb60ebf3a74e4af5bbe1697823b15d222fbf1323c59e1cd8d9e13de57b9fd9\": container with ID starting with b1bb60ebf3a74e4af5bbe1697823b15d222fbf1323c59e1cd8d9e13de57b9fd9 not found: ID does not exist" Oct 09 14:09:12 crc kubenswrapper[4902]: I1009 14:09:12.962948 4902 scope.go:117] "RemoveContainer" containerID="4d6710ae2c1e39b844bf2514284a07463284b21209befea079a0a991de5bde07" Oct 09 14:09:12 crc kubenswrapper[4902]: E1009 14:09:12.963251 4902 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4d6710ae2c1e39b844bf2514284a07463284b21209befea079a0a991de5bde07\": container with ID starting with 4d6710ae2c1e39b844bf2514284a07463284b21209befea079a0a991de5bde07 not found: ID does not exist" containerID="4d6710ae2c1e39b844bf2514284a07463284b21209befea079a0a991de5bde07" Oct 09 14:09:12 crc kubenswrapper[4902]: I1009 14:09:12.963281 4902 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d6710ae2c1e39b844bf2514284a07463284b21209befea079a0a991de5bde07"} err="failed to get container status \"4d6710ae2c1e39b844bf2514284a07463284b21209befea079a0a991de5bde07\": rpc error: code = NotFound desc = could not find container \"4d6710ae2c1e39b844bf2514284a07463284b21209befea079a0a991de5bde07\": container with ID starting with 
4d6710ae2c1e39b844bf2514284a07463284b21209befea079a0a991de5bde07 not found: ID does not exist" Oct 09 14:09:12 crc kubenswrapper[4902]: I1009 14:09:12.963300 4902 scope.go:117] "RemoveContainer" containerID="f70d91d8df2d3dc2c301a9d70852011dde73d796af26c2a9605b2344192cfe83" Oct 09 14:09:12 crc kubenswrapper[4902]: I1009 14:09:12.966487 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 09 14:09:12 crc kubenswrapper[4902]: I1009 14:09:12.982947 4902 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 09 14:09:13 crc kubenswrapper[4902]: I1009 14:09:13.006004 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 09 14:09:13 crc kubenswrapper[4902]: E1009 14:09:13.006475 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef02f023-650b-4ffb-8781-55f905a5490d" containerName="nova-api-log" Oct 09 14:09:13 crc kubenswrapper[4902]: I1009 14:09:13.006496 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef02f023-650b-4ffb-8781-55f905a5490d" containerName="nova-api-log" Oct 09 14:09:13 crc kubenswrapper[4902]: E1009 14:09:13.006515 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01e06434-8159-49be-bef4-3b0646427c20" containerName="sg-core" Oct 09 14:09:13 crc kubenswrapper[4902]: I1009 14:09:13.006522 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="01e06434-8159-49be-bef4-3b0646427c20" containerName="sg-core" Oct 09 14:09:13 crc kubenswrapper[4902]: E1009 14:09:13.006530 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef02f023-650b-4ffb-8781-55f905a5490d" containerName="nova-api-api" Oct 09 14:09:13 crc kubenswrapper[4902]: I1009 14:09:13.006536 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef02f023-650b-4ffb-8781-55f905a5490d" containerName="nova-api-api" Oct 09 14:09:13 crc kubenswrapper[4902]: E1009 14:09:13.006546 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01e06434-8159-49be-bef4-3b0646427c20" containerName="ceilometer-notification-agent" Oct 09 14:09:13 crc kubenswrapper[4902]: I1009 14:09:13.006552 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="01e06434-8159-49be-bef4-3b0646427c20" containerName="ceilometer-notification-agent" Oct 09 14:09:13 crc kubenswrapper[4902]: E1009 14:09:13.006558 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01e06434-8159-49be-bef4-3b0646427c20" containerName="proxy-httpd" Oct 09 14:09:13 crc kubenswrapper[4902]: I1009 14:09:13.006564 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="01e06434-8159-49be-bef4-3b0646427c20" containerName="proxy-httpd" Oct 09 14:09:13 crc kubenswrapper[4902]: E1009 14:09:13.006577 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01e06434-8159-49be-bef4-3b0646427c20" containerName="ceilometer-central-agent" Oct 09 14:09:13 crc kubenswrapper[4902]: I1009 14:09:13.006582 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="01e06434-8159-49be-bef4-3b0646427c20" containerName="ceilometer-central-agent" Oct 09 14:09:13 crc kubenswrapper[4902]: I1009 14:09:13.006745 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef02f023-650b-4ffb-8781-55f905a5490d" containerName="nova-api-log" Oct 09 14:09:13 crc kubenswrapper[4902]: I1009 14:09:13.006758 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="01e06434-8159-49be-bef4-3b0646427c20" containerName="ceilometer-notification-agent" Oct 09 14:09:13 crc 
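The E-level "ContainerStatus from runtime service failed ... code = NotFound" entries above, and the matching ones for the ceilometer containers further down, are the kubelet querying CRI-O about containers it has just removed; the runtime answers with a gRPC NotFound status, which callers can treat as "already gone" rather than as a failure, so cleanup continues. A minimal sketch of that distinction using the standard gRPC status package; the error text is copied from the log, the handling shown is illustrative and not kubelet source:

```go
// notfound_sketch.go - illustrative only, not kubelet source. Shows how a
// caller can tell a benign gRPC NotFound ("the container is already gone")
// from a real runtime failure, which is why the errors in the journal are
// logged but pod cleanup still proceeds.
package main

import (
	"errors"
	"fmt"

	"google.golang.org/grpc/codes"
	"google.golang.org/grpc/status"
)

// alreadyGone reports whether err is a gRPC NotFound status.
func alreadyGone(err error) bool {
	return status.Code(err) == codes.NotFound
}

func main() {
	// Error shape copied from the journal above.
	err := status.Error(codes.NotFound,
		`could not find container "b1bb60ebf3a74e4af5bbe1697823b15d222fbf1323c59e1cd8d9e13de57b9fd9": ID does not exist`)

	if alreadyGone(err) {
		fmt.Println("container already removed, nothing to do")
	} else if err != nil {
		fmt.Println("real runtime failure:", err)
	}

	// status.Code returns codes.Unknown for plain errors, so unrelated
	// failures are not mistaken for "already gone".
	fmt.Println(alreadyGone(errors.New("connection refused"))) // false
}
```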
kubenswrapper[4902]: I1009 14:09:13.006772 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="01e06434-8159-49be-bef4-3b0646427c20" containerName="proxy-httpd" Oct 09 14:09:13 crc kubenswrapper[4902]: I1009 14:09:13.006781 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="01e06434-8159-49be-bef4-3b0646427c20" containerName="sg-core" Oct 09 14:09:13 crc kubenswrapper[4902]: I1009 14:09:13.006788 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="01e06434-8159-49be-bef4-3b0646427c20" containerName="ceilometer-central-agent" Oct 09 14:09:13 crc kubenswrapper[4902]: I1009 14:09:13.006801 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef02f023-650b-4ffb-8781-55f905a5490d" containerName="nova-api-api" Oct 09 14:09:13 crc kubenswrapper[4902]: I1009 14:09:13.007793 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 09 14:09:13 crc kubenswrapper[4902]: I1009 14:09:13.007813 4902 scope.go:117] "RemoveContainer" containerID="b055b9f8d94f743e063474f0284c434bdbd97b2f18260dab590a849198b2f5a0" Oct 09 14:09:13 crc kubenswrapper[4902]: I1009 14:09:13.011244 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Oct 09 14:09:13 crc kubenswrapper[4902]: I1009 14:09:13.011351 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 09 14:09:13 crc kubenswrapper[4902]: I1009 14:09:13.011351 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Oct 09 14:09:13 crc kubenswrapper[4902]: I1009 14:09:13.016091 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 09 14:09:13 crc kubenswrapper[4902]: I1009 14:09:13.042805 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 09 14:09:13 crc kubenswrapper[4902]: I1009 14:09:13.051735 4902 scope.go:117] "RemoveContainer" containerID="1a347593ce5266cd36f313cdf1caf92dd776ef3cced53130eac9e662504e9855" Oct 09 14:09:13 crc kubenswrapper[4902]: I1009 14:09:13.057150 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 09 14:09:13 crc kubenswrapper[4902]: I1009 14:09:13.059908 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Oct 09 14:09:13 crc kubenswrapper[4902]: I1009 14:09:13.059931 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 09 14:09:13 crc kubenswrapper[4902]: I1009 14:09:13.060864 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 09 14:09:13 crc kubenswrapper[4902]: I1009 14:09:13.074156 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 09 14:09:13 crc kubenswrapper[4902]: I1009 14:09:13.076756 4902 scope.go:117] "RemoveContainer" containerID="c146e4602fb8421300703a64d218dbcbfdfbb8f0e23e6af90d555570933911b4" Oct 09 14:09:13 crc kubenswrapper[4902]: I1009 14:09:13.079180 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a4dab6bb-60d0-4984-91e7-2013f341a39d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a4dab6bb-60d0-4984-91e7-2013f341a39d\") " pod="openstack/ceilometer-0" Oct 09 14:09:13 crc kubenswrapper[4902]: I1009 14:09:13.079225 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a4dab6bb-60d0-4984-91e7-2013f341a39d-scripts\") pod \"ceilometer-0\" (UID: \"a4dab6bb-60d0-4984-91e7-2013f341a39d\") " pod="openstack/ceilometer-0" Oct 09 14:09:13 crc kubenswrapper[4902]: I1009 14:09:13.079260 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4dab6bb-60d0-4984-91e7-2013f341a39d-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"a4dab6bb-60d0-4984-91e7-2013f341a39d\") " pod="openstack/ceilometer-0" Oct 09 14:09:13 crc kubenswrapper[4902]: I1009 14:09:13.079390 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a4dab6bb-60d0-4984-91e7-2013f341a39d-log-httpd\") pod \"ceilometer-0\" (UID: \"a4dab6bb-60d0-4984-91e7-2013f341a39d\") " pod="openstack/ceilometer-0" Oct 09 14:09:13 crc kubenswrapper[4902]: I1009 14:09:13.079908 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4dab6bb-60d0-4984-91e7-2013f341a39d-config-data\") pod \"ceilometer-0\" (UID: \"a4dab6bb-60d0-4984-91e7-2013f341a39d\") " pod="openstack/ceilometer-0" Oct 09 14:09:13 crc kubenswrapper[4902]: I1009 14:09:13.080219 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a4dab6bb-60d0-4984-91e7-2013f341a39d-run-httpd\") pod \"ceilometer-0\" (UID: \"a4dab6bb-60d0-4984-91e7-2013f341a39d\") " pod="openstack/ceilometer-0" Oct 09 14:09:13 crc kubenswrapper[4902]: I1009 14:09:13.080269 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xdjpq\" (UniqueName: \"kubernetes.io/projected/b67b5cec-9175-478d-b323-039843e49260-kube-api-access-xdjpq\") pod \"nova-api-0\" (UID: \"b67b5cec-9175-478d-b323-039843e49260\") " pod="openstack/nova-api-0" Oct 09 14:09:13 crc kubenswrapper[4902]: I1009 14:09:13.080298 
4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jn6lf\" (UniqueName: \"kubernetes.io/projected/a4dab6bb-60d0-4984-91e7-2013f341a39d-kube-api-access-jn6lf\") pod \"ceilometer-0\" (UID: \"a4dab6bb-60d0-4984-91e7-2013f341a39d\") " pod="openstack/ceilometer-0" Oct 09 14:09:13 crc kubenswrapper[4902]: I1009 14:09:13.080340 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b67b5cec-9175-478d-b323-039843e49260-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b67b5cec-9175-478d-b323-039843e49260\") " pod="openstack/nova-api-0" Oct 09 14:09:13 crc kubenswrapper[4902]: I1009 14:09:13.080362 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b67b5cec-9175-478d-b323-039843e49260-logs\") pod \"nova-api-0\" (UID: \"b67b5cec-9175-478d-b323-039843e49260\") " pod="openstack/nova-api-0" Oct 09 14:09:13 crc kubenswrapper[4902]: I1009 14:09:13.080383 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b67b5cec-9175-478d-b323-039843e49260-internal-tls-certs\") pod \"nova-api-0\" (UID: \"b67b5cec-9175-478d-b323-039843e49260\") " pod="openstack/nova-api-0" Oct 09 14:09:13 crc kubenswrapper[4902]: I1009 14:09:13.080529 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4dab6bb-60d0-4984-91e7-2013f341a39d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a4dab6bb-60d0-4984-91e7-2013f341a39d\") " pod="openstack/ceilometer-0" Oct 09 14:09:13 crc kubenswrapper[4902]: I1009 14:09:13.080582 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b67b5cec-9175-478d-b323-039843e49260-config-data\") pod \"nova-api-0\" (UID: \"b67b5cec-9175-478d-b323-039843e49260\") " pod="openstack/nova-api-0" Oct 09 14:09:13 crc kubenswrapper[4902]: I1009 14:09:13.080628 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b67b5cec-9175-478d-b323-039843e49260-public-tls-certs\") pod \"nova-api-0\" (UID: \"b67b5cec-9175-478d-b323-039843e49260\") " pod="openstack/nova-api-0" Oct 09 14:09:13 crc kubenswrapper[4902]: I1009 14:09:13.116987 4902 scope.go:117] "RemoveContainer" containerID="f70d91d8df2d3dc2c301a9d70852011dde73d796af26c2a9605b2344192cfe83" Oct 09 14:09:13 crc kubenswrapper[4902]: E1009 14:09:13.118195 4902 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f70d91d8df2d3dc2c301a9d70852011dde73d796af26c2a9605b2344192cfe83\": container with ID starting with f70d91d8df2d3dc2c301a9d70852011dde73d796af26c2a9605b2344192cfe83 not found: ID does not exist" containerID="f70d91d8df2d3dc2c301a9d70852011dde73d796af26c2a9605b2344192cfe83" Oct 09 14:09:13 crc kubenswrapper[4902]: I1009 14:09:13.118230 4902 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f70d91d8df2d3dc2c301a9d70852011dde73d796af26c2a9605b2344192cfe83"} err="failed to get container status \"f70d91d8df2d3dc2c301a9d70852011dde73d796af26c2a9605b2344192cfe83\": rpc error: code = 
NotFound desc = could not find container \"f70d91d8df2d3dc2c301a9d70852011dde73d796af26c2a9605b2344192cfe83\": container with ID starting with f70d91d8df2d3dc2c301a9d70852011dde73d796af26c2a9605b2344192cfe83 not found: ID does not exist" Oct 09 14:09:13 crc kubenswrapper[4902]: I1009 14:09:13.118253 4902 scope.go:117] "RemoveContainer" containerID="b055b9f8d94f743e063474f0284c434bdbd97b2f18260dab590a849198b2f5a0" Oct 09 14:09:13 crc kubenswrapper[4902]: E1009 14:09:13.118655 4902 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b055b9f8d94f743e063474f0284c434bdbd97b2f18260dab590a849198b2f5a0\": container with ID starting with b055b9f8d94f743e063474f0284c434bdbd97b2f18260dab590a849198b2f5a0 not found: ID does not exist" containerID="b055b9f8d94f743e063474f0284c434bdbd97b2f18260dab590a849198b2f5a0" Oct 09 14:09:13 crc kubenswrapper[4902]: I1009 14:09:13.118697 4902 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b055b9f8d94f743e063474f0284c434bdbd97b2f18260dab590a849198b2f5a0"} err="failed to get container status \"b055b9f8d94f743e063474f0284c434bdbd97b2f18260dab590a849198b2f5a0\": rpc error: code = NotFound desc = could not find container \"b055b9f8d94f743e063474f0284c434bdbd97b2f18260dab590a849198b2f5a0\": container with ID starting with b055b9f8d94f743e063474f0284c434bdbd97b2f18260dab590a849198b2f5a0 not found: ID does not exist" Oct 09 14:09:13 crc kubenswrapper[4902]: I1009 14:09:13.118725 4902 scope.go:117] "RemoveContainer" containerID="1a347593ce5266cd36f313cdf1caf92dd776ef3cced53130eac9e662504e9855" Oct 09 14:09:13 crc kubenswrapper[4902]: E1009 14:09:13.119050 4902 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a347593ce5266cd36f313cdf1caf92dd776ef3cced53130eac9e662504e9855\": container with ID starting with 1a347593ce5266cd36f313cdf1caf92dd776ef3cced53130eac9e662504e9855 not found: ID does not exist" containerID="1a347593ce5266cd36f313cdf1caf92dd776ef3cced53130eac9e662504e9855" Oct 09 14:09:13 crc kubenswrapper[4902]: I1009 14:09:13.119079 4902 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a347593ce5266cd36f313cdf1caf92dd776ef3cced53130eac9e662504e9855"} err="failed to get container status \"1a347593ce5266cd36f313cdf1caf92dd776ef3cced53130eac9e662504e9855\": rpc error: code = NotFound desc = could not find container \"1a347593ce5266cd36f313cdf1caf92dd776ef3cced53130eac9e662504e9855\": container with ID starting with 1a347593ce5266cd36f313cdf1caf92dd776ef3cced53130eac9e662504e9855 not found: ID does not exist" Oct 09 14:09:13 crc kubenswrapper[4902]: I1009 14:09:13.119093 4902 scope.go:117] "RemoveContainer" containerID="c146e4602fb8421300703a64d218dbcbfdfbb8f0e23e6af90d555570933911b4" Oct 09 14:09:13 crc kubenswrapper[4902]: E1009 14:09:13.119279 4902 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c146e4602fb8421300703a64d218dbcbfdfbb8f0e23e6af90d555570933911b4\": container with ID starting with c146e4602fb8421300703a64d218dbcbfdfbb8f0e23e6af90d555570933911b4 not found: ID does not exist" containerID="c146e4602fb8421300703a64d218dbcbfdfbb8f0e23e6af90d555570933911b4" Oct 09 14:09:13 crc kubenswrapper[4902]: I1009 14:09:13.119300 4902 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"c146e4602fb8421300703a64d218dbcbfdfbb8f0e23e6af90d555570933911b4"} err="failed to get container status \"c146e4602fb8421300703a64d218dbcbfdfbb8f0e23e6af90d555570933911b4\": rpc error: code = NotFound desc = could not find container \"c146e4602fb8421300703a64d218dbcbfdfbb8f0e23e6af90d555570933911b4\": container with ID starting with c146e4602fb8421300703a64d218dbcbfdfbb8f0e23e6af90d555570933911b4 not found: ID does not exist" Oct 09 14:09:13 crc kubenswrapper[4902]: I1009 14:09:13.182511 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a4dab6bb-60d0-4984-91e7-2013f341a39d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a4dab6bb-60d0-4984-91e7-2013f341a39d\") " pod="openstack/ceilometer-0" Oct 09 14:09:13 crc kubenswrapper[4902]: I1009 14:09:13.182572 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a4dab6bb-60d0-4984-91e7-2013f341a39d-scripts\") pod \"ceilometer-0\" (UID: \"a4dab6bb-60d0-4984-91e7-2013f341a39d\") " pod="openstack/ceilometer-0" Oct 09 14:09:13 crc kubenswrapper[4902]: I1009 14:09:13.182603 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4dab6bb-60d0-4984-91e7-2013f341a39d-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"a4dab6bb-60d0-4984-91e7-2013f341a39d\") " pod="openstack/ceilometer-0" Oct 09 14:09:13 crc kubenswrapper[4902]: I1009 14:09:13.182623 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a4dab6bb-60d0-4984-91e7-2013f341a39d-log-httpd\") pod \"ceilometer-0\" (UID: \"a4dab6bb-60d0-4984-91e7-2013f341a39d\") " pod="openstack/ceilometer-0" Oct 09 14:09:13 crc kubenswrapper[4902]: I1009 14:09:13.182639 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4dab6bb-60d0-4984-91e7-2013f341a39d-config-data\") pod \"ceilometer-0\" (UID: \"a4dab6bb-60d0-4984-91e7-2013f341a39d\") " pod="openstack/ceilometer-0" Oct 09 14:09:13 crc kubenswrapper[4902]: I1009 14:09:13.182667 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a4dab6bb-60d0-4984-91e7-2013f341a39d-run-httpd\") pod \"ceilometer-0\" (UID: \"a4dab6bb-60d0-4984-91e7-2013f341a39d\") " pod="openstack/ceilometer-0" Oct 09 14:09:13 crc kubenswrapper[4902]: I1009 14:09:13.182691 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xdjpq\" (UniqueName: \"kubernetes.io/projected/b67b5cec-9175-478d-b323-039843e49260-kube-api-access-xdjpq\") pod \"nova-api-0\" (UID: \"b67b5cec-9175-478d-b323-039843e49260\") " pod="openstack/nova-api-0" Oct 09 14:09:13 crc kubenswrapper[4902]: I1009 14:09:13.182713 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jn6lf\" (UniqueName: \"kubernetes.io/projected/a4dab6bb-60d0-4984-91e7-2013f341a39d-kube-api-access-jn6lf\") pod \"ceilometer-0\" (UID: \"a4dab6bb-60d0-4984-91e7-2013f341a39d\") " pod="openstack/ceilometer-0" Oct 09 14:09:13 crc kubenswrapper[4902]: I1009 14:09:13.182756 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/b67b5cec-9175-478d-b323-039843e49260-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b67b5cec-9175-478d-b323-039843e49260\") " pod="openstack/nova-api-0" Oct 09 14:09:13 crc kubenswrapper[4902]: I1009 14:09:13.182779 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b67b5cec-9175-478d-b323-039843e49260-logs\") pod \"nova-api-0\" (UID: \"b67b5cec-9175-478d-b323-039843e49260\") " pod="openstack/nova-api-0" Oct 09 14:09:13 crc kubenswrapper[4902]: I1009 14:09:13.182793 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b67b5cec-9175-478d-b323-039843e49260-internal-tls-certs\") pod \"nova-api-0\" (UID: \"b67b5cec-9175-478d-b323-039843e49260\") " pod="openstack/nova-api-0" Oct 09 14:09:13 crc kubenswrapper[4902]: I1009 14:09:13.182824 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4dab6bb-60d0-4984-91e7-2013f341a39d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a4dab6bb-60d0-4984-91e7-2013f341a39d\") " pod="openstack/ceilometer-0" Oct 09 14:09:13 crc kubenswrapper[4902]: I1009 14:09:13.182841 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b67b5cec-9175-478d-b323-039843e49260-config-data\") pod \"nova-api-0\" (UID: \"b67b5cec-9175-478d-b323-039843e49260\") " pod="openstack/nova-api-0" Oct 09 14:09:13 crc kubenswrapper[4902]: I1009 14:09:13.182854 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b67b5cec-9175-478d-b323-039843e49260-public-tls-certs\") pod \"nova-api-0\" (UID: \"b67b5cec-9175-478d-b323-039843e49260\") " pod="openstack/nova-api-0" Oct 09 14:09:13 crc kubenswrapper[4902]: I1009 14:09:13.184006 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a4dab6bb-60d0-4984-91e7-2013f341a39d-log-httpd\") pod \"ceilometer-0\" (UID: \"a4dab6bb-60d0-4984-91e7-2013f341a39d\") " pod="openstack/ceilometer-0" Oct 09 14:09:13 crc kubenswrapper[4902]: I1009 14:09:13.184431 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a4dab6bb-60d0-4984-91e7-2013f341a39d-run-httpd\") pod \"ceilometer-0\" (UID: \"a4dab6bb-60d0-4984-91e7-2013f341a39d\") " pod="openstack/ceilometer-0" Oct 09 14:09:13 crc kubenswrapper[4902]: I1009 14:09:13.185391 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b67b5cec-9175-478d-b323-039843e49260-logs\") pod \"nova-api-0\" (UID: \"b67b5cec-9175-478d-b323-039843e49260\") " pod="openstack/nova-api-0" Oct 09 14:09:13 crc kubenswrapper[4902]: I1009 14:09:13.187356 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4dab6bb-60d0-4984-91e7-2013f341a39d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a4dab6bb-60d0-4984-91e7-2013f341a39d\") " pod="openstack/ceilometer-0" Oct 09 14:09:13 crc kubenswrapper[4902]: I1009 14:09:13.187713 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a4dab6bb-60d0-4984-91e7-2013f341a39d-scripts\") pod \"ceilometer-0\" (UID: 
\"a4dab6bb-60d0-4984-91e7-2013f341a39d\") " pod="openstack/ceilometer-0" Oct 09 14:09:13 crc kubenswrapper[4902]: I1009 14:09:13.188481 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4dab6bb-60d0-4984-91e7-2013f341a39d-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"a4dab6bb-60d0-4984-91e7-2013f341a39d\") " pod="openstack/ceilometer-0" Oct 09 14:09:13 crc kubenswrapper[4902]: I1009 14:09:13.188783 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b67b5cec-9175-478d-b323-039843e49260-config-data\") pod \"nova-api-0\" (UID: \"b67b5cec-9175-478d-b323-039843e49260\") " pod="openstack/nova-api-0" Oct 09 14:09:13 crc kubenswrapper[4902]: I1009 14:09:13.189108 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a4dab6bb-60d0-4984-91e7-2013f341a39d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a4dab6bb-60d0-4984-91e7-2013f341a39d\") " pod="openstack/ceilometer-0" Oct 09 14:09:13 crc kubenswrapper[4902]: I1009 14:09:13.192004 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b67b5cec-9175-478d-b323-039843e49260-public-tls-certs\") pod \"nova-api-0\" (UID: \"b67b5cec-9175-478d-b323-039843e49260\") " pod="openstack/nova-api-0" Oct 09 14:09:13 crc kubenswrapper[4902]: I1009 14:09:13.194607 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4dab6bb-60d0-4984-91e7-2013f341a39d-config-data\") pod \"ceilometer-0\" (UID: \"a4dab6bb-60d0-4984-91e7-2013f341a39d\") " pod="openstack/ceilometer-0" Oct 09 14:09:13 crc kubenswrapper[4902]: I1009 14:09:13.196952 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b67b5cec-9175-478d-b323-039843e49260-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b67b5cec-9175-478d-b323-039843e49260\") " pod="openstack/nova-api-0" Oct 09 14:09:13 crc kubenswrapper[4902]: I1009 14:09:13.202740 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xdjpq\" (UniqueName: \"kubernetes.io/projected/b67b5cec-9175-478d-b323-039843e49260-kube-api-access-xdjpq\") pod \"nova-api-0\" (UID: \"b67b5cec-9175-478d-b323-039843e49260\") " pod="openstack/nova-api-0" Oct 09 14:09:13 crc kubenswrapper[4902]: I1009 14:09:13.203237 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b67b5cec-9175-478d-b323-039843e49260-internal-tls-certs\") pod \"nova-api-0\" (UID: \"b67b5cec-9175-478d-b323-039843e49260\") " pod="openstack/nova-api-0" Oct 09 14:09:13 crc kubenswrapper[4902]: I1009 14:09:13.208149 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jn6lf\" (UniqueName: \"kubernetes.io/projected/a4dab6bb-60d0-4984-91e7-2013f341a39d-kube-api-access-jn6lf\") pod \"ceilometer-0\" (UID: \"a4dab6bb-60d0-4984-91e7-2013f341a39d\") " pod="openstack/ceilometer-0" Oct 09 14:09:13 crc kubenswrapper[4902]: I1009 14:09:13.339520 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 09 14:09:13 crc kubenswrapper[4902]: I1009 14:09:13.388633 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 09 14:09:13 crc kubenswrapper[4902]: I1009 14:09:13.540345 4902 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01e06434-8159-49be-bef4-3b0646427c20" path="/var/lib/kubelet/pods/01e06434-8159-49be-bef4-3b0646427c20/volumes" Oct 09 14:09:13 crc kubenswrapper[4902]: I1009 14:09:13.541594 4902 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef02f023-650b-4ffb-8781-55f905a5490d" path="/var/lib/kubelet/pods/ef02f023-650b-4ffb-8781-55f905a5490d/volumes" Oct 09 14:09:13 crc kubenswrapper[4902]: I1009 14:09:13.819040 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 09 14:09:13 crc kubenswrapper[4902]: W1009 14:09:13.904759 4902 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda4dab6bb_60d0_4984_91e7_2013f341a39d.slice/crio-f2c3235013badb8985a918d2ff77fbebf13cafdecff15ea3106eeccce8400425 WatchSource:0}: Error finding container f2c3235013badb8985a918d2ff77fbebf13cafdecff15ea3106eeccce8400425: Status 404 returned error can't find the container with id f2c3235013badb8985a918d2ff77fbebf13cafdecff15ea3106eeccce8400425 Oct 09 14:09:13 crc kubenswrapper[4902]: I1009 14:09:13.905792 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 09 14:09:13 crc kubenswrapper[4902]: I1009 14:09:13.922009 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a4dab6bb-60d0-4984-91e7-2013f341a39d","Type":"ContainerStarted","Data":"f2c3235013badb8985a918d2ff77fbebf13cafdecff15ea3106eeccce8400425"} Oct 09 14:09:13 crc kubenswrapper[4902]: I1009 14:09:13.923251 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b67b5cec-9175-478d-b323-039843e49260","Type":"ContainerStarted","Data":"5ba92f00769b781d56e26eb59f50d1081a0d4077ad6fc862437348bcb904de05"} Oct 09 14:09:14 crc kubenswrapper[4902]: I1009 14:09:14.205287 4902 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Oct 09 14:09:14 crc kubenswrapper[4902]: I1009 14:09:14.228546 4902 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Oct 09 14:09:14 crc kubenswrapper[4902]: I1009 14:09:14.934169 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a4dab6bb-60d0-4984-91e7-2013f341a39d","Type":"ContainerStarted","Data":"bfebacc08645ee9e576a9fb1e425779fc5bdd86c0126847c800975cfbdf5752e"} Oct 09 14:09:14 crc kubenswrapper[4902]: I1009 14:09:14.936065 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b67b5cec-9175-478d-b323-039843e49260","Type":"ContainerStarted","Data":"11f68f07a72016fbc81e6f612b4138ad3a652776ace8319f4d99e729322dcb28"} Oct 09 14:09:14 crc kubenswrapper[4902]: I1009 14:09:14.936095 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b67b5cec-9175-478d-b323-039843e49260","Type":"ContainerStarted","Data":"080341e98639b734ffa24cdc2b4fcd18922e546994ad4b63f9d29eb0cc037423"} Oct 09 14:09:14 crc kubenswrapper[4902]: I1009 14:09:14.952981 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.952965728 podStartE2EDuration="2.952965728s" podCreationTimestamp="2025-10-09 14:09:12 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 14:09:14.951919057 +0000 UTC m=+1102.149778121" watchObservedRunningTime="2025-10-09 14:09:14.952965728 +0000 UTC m=+1102.150824792" Oct 09 14:09:14 crc kubenswrapper[4902]: I1009 14:09:14.955565 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Oct 09 14:09:15 crc kubenswrapper[4902]: I1009 14:09:15.122543 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-gn99q"] Oct 09 14:09:15 crc kubenswrapper[4902]: I1009 14:09:15.125176 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-gn99q" Oct 09 14:09:15 crc kubenswrapper[4902]: I1009 14:09:15.127385 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Oct 09 14:09:15 crc kubenswrapper[4902]: I1009 14:09:15.127583 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Oct 09 14:09:15 crc kubenswrapper[4902]: I1009 14:09:15.134223 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-gn99q"] Oct 09 14:09:15 crc kubenswrapper[4902]: I1009 14:09:15.227843 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ec866e85-92a2-4934-89a1-c44f1dcf2f52-scripts\") pod \"nova-cell1-cell-mapping-gn99q\" (UID: \"ec866e85-92a2-4934-89a1-c44f1dcf2f52\") " pod="openstack/nova-cell1-cell-mapping-gn99q" Oct 09 14:09:15 crc kubenswrapper[4902]: I1009 14:09:15.227942 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec866e85-92a2-4934-89a1-c44f1dcf2f52-config-data\") pod \"nova-cell1-cell-mapping-gn99q\" (UID: \"ec866e85-92a2-4934-89a1-c44f1dcf2f52\") " pod="openstack/nova-cell1-cell-mapping-gn99q" Oct 09 14:09:15 crc kubenswrapper[4902]: I1009 14:09:15.228178 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec866e85-92a2-4934-89a1-c44f1dcf2f52-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-gn99q\" (UID: \"ec866e85-92a2-4934-89a1-c44f1dcf2f52\") " pod="openstack/nova-cell1-cell-mapping-gn99q" Oct 09 14:09:15 crc kubenswrapper[4902]: I1009 14:09:15.228252 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cnp9d\" (UniqueName: \"kubernetes.io/projected/ec866e85-92a2-4934-89a1-c44f1dcf2f52-kube-api-access-cnp9d\") pod \"nova-cell1-cell-mapping-gn99q\" (UID: \"ec866e85-92a2-4934-89a1-c44f1dcf2f52\") " pod="openstack/nova-cell1-cell-mapping-gn99q" Oct 09 14:09:15 crc kubenswrapper[4902]: I1009 14:09:15.330064 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ec866e85-92a2-4934-89a1-c44f1dcf2f52-scripts\") pod \"nova-cell1-cell-mapping-gn99q\" (UID: \"ec866e85-92a2-4934-89a1-c44f1dcf2f52\") " pod="openstack/nova-cell1-cell-mapping-gn99q" Oct 09 14:09:15 crc kubenswrapper[4902]: I1009 14:09:15.330136 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec866e85-92a2-4934-89a1-c44f1dcf2f52-config-data\") pod 
\"nova-cell1-cell-mapping-gn99q\" (UID: \"ec866e85-92a2-4934-89a1-c44f1dcf2f52\") " pod="openstack/nova-cell1-cell-mapping-gn99q" Oct 09 14:09:15 crc kubenswrapper[4902]: I1009 14:09:15.330224 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec866e85-92a2-4934-89a1-c44f1dcf2f52-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-gn99q\" (UID: \"ec866e85-92a2-4934-89a1-c44f1dcf2f52\") " pod="openstack/nova-cell1-cell-mapping-gn99q" Oct 09 14:09:15 crc kubenswrapper[4902]: I1009 14:09:15.330249 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cnp9d\" (UniqueName: \"kubernetes.io/projected/ec866e85-92a2-4934-89a1-c44f1dcf2f52-kube-api-access-cnp9d\") pod \"nova-cell1-cell-mapping-gn99q\" (UID: \"ec866e85-92a2-4934-89a1-c44f1dcf2f52\") " pod="openstack/nova-cell1-cell-mapping-gn99q" Oct 09 14:09:15 crc kubenswrapper[4902]: I1009 14:09:15.337909 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec866e85-92a2-4934-89a1-c44f1dcf2f52-config-data\") pod \"nova-cell1-cell-mapping-gn99q\" (UID: \"ec866e85-92a2-4934-89a1-c44f1dcf2f52\") " pod="openstack/nova-cell1-cell-mapping-gn99q" Oct 09 14:09:15 crc kubenswrapper[4902]: I1009 14:09:15.345053 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec866e85-92a2-4934-89a1-c44f1dcf2f52-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-gn99q\" (UID: \"ec866e85-92a2-4934-89a1-c44f1dcf2f52\") " pod="openstack/nova-cell1-cell-mapping-gn99q" Oct 09 14:09:15 crc kubenswrapper[4902]: I1009 14:09:15.356798 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cnp9d\" (UniqueName: \"kubernetes.io/projected/ec866e85-92a2-4934-89a1-c44f1dcf2f52-kube-api-access-cnp9d\") pod \"nova-cell1-cell-mapping-gn99q\" (UID: \"ec866e85-92a2-4934-89a1-c44f1dcf2f52\") " pod="openstack/nova-cell1-cell-mapping-gn99q" Oct 09 14:09:15 crc kubenswrapper[4902]: I1009 14:09:15.357231 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ec866e85-92a2-4934-89a1-c44f1dcf2f52-scripts\") pod \"nova-cell1-cell-mapping-gn99q\" (UID: \"ec866e85-92a2-4934-89a1-c44f1dcf2f52\") " pod="openstack/nova-cell1-cell-mapping-gn99q" Oct 09 14:09:15 crc kubenswrapper[4902]: I1009 14:09:15.447984 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-gn99q" Oct 09 14:09:15 crc kubenswrapper[4902]: I1009 14:09:15.883104 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-gn99q"] Oct 09 14:09:15 crc kubenswrapper[4902]: I1009 14:09:15.957740 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a4dab6bb-60d0-4984-91e7-2013f341a39d","Type":"ContainerStarted","Data":"d6c178a0cf9679dd38e5d9018420ff24b811680b233740c312c3e122023337c9"} Oct 09 14:09:15 crc kubenswrapper[4902]: I1009 14:09:15.961394 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-gn99q" event={"ID":"ec866e85-92a2-4934-89a1-c44f1dcf2f52","Type":"ContainerStarted","Data":"94e1baaacc133ac0a60015532f08f664eae054a21d8a5dc7e57a1380063d93cc"} Oct 09 14:09:16 crc kubenswrapper[4902]: I1009 14:09:16.367596 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-59cf4bdb65-985vj" Oct 09 14:09:16 crc kubenswrapper[4902]: I1009 14:09:16.443487 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-845d6d6f59-425tz"] Oct 09 14:09:16 crc kubenswrapper[4902]: I1009 14:09:16.443724 4902 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-845d6d6f59-425tz" podUID="8b7bf8d6-8d86-4b2d-8a31-268090d3e11a" containerName="dnsmasq-dns" containerID="cri-o://25be80322b10ec09e7df7ccfe8b41efe33db5305aea9af40873cfc9725d576a0" gracePeriod=10 Oct 09 14:09:16 crc kubenswrapper[4902]: I1009 14:09:16.971234 4902 generic.go:334] "Generic (PLEG): container finished" podID="8b7bf8d6-8d86-4b2d-8a31-268090d3e11a" containerID="25be80322b10ec09e7df7ccfe8b41efe33db5305aea9af40873cfc9725d576a0" exitCode=0 Oct 09 14:09:16 crc kubenswrapper[4902]: I1009 14:09:16.971260 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-845d6d6f59-425tz" event={"ID":"8b7bf8d6-8d86-4b2d-8a31-268090d3e11a","Type":"ContainerDied","Data":"25be80322b10ec09e7df7ccfe8b41efe33db5305aea9af40873cfc9725d576a0"} Oct 09 14:09:16 crc kubenswrapper[4902]: I1009 14:09:16.971601 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-845d6d6f59-425tz" event={"ID":"8b7bf8d6-8d86-4b2d-8a31-268090d3e11a","Type":"ContainerDied","Data":"5f28514b0980442e544342767198e25677af92740d4a02b45d0f03a1a228fc2f"} Oct 09 14:09:16 crc kubenswrapper[4902]: I1009 14:09:16.971617 4902 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5f28514b0980442e544342767198e25677af92740d4a02b45d0f03a1a228fc2f" Oct 09 14:09:16 crc kubenswrapper[4902]: I1009 14:09:16.973807 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a4dab6bb-60d0-4984-91e7-2013f341a39d","Type":"ContainerStarted","Data":"8762562f933a211fd9d756966a00299806c32d4a6455b38bd93ab50b4dffdb2f"} Oct 09 14:09:16 crc kubenswrapper[4902]: I1009 14:09:16.975535 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-gn99q" event={"ID":"ec866e85-92a2-4934-89a1-c44f1dcf2f52","Type":"ContainerStarted","Data":"c282e06076c15d994c5781dcec81200243f7a4c9430b5342b27458e69a167e60"} Oct 09 14:09:16 crc kubenswrapper[4902]: I1009 14:09:16.976543 4902 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-845d6d6f59-425tz" Oct 09 14:09:17 crc kubenswrapper[4902]: I1009 14:09:17.014823 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-gn99q" podStartSLOduration=2.014804938 podStartE2EDuration="2.014804938s" podCreationTimestamp="2025-10-09 14:09:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 14:09:16.995608619 +0000 UTC m=+1104.193467683" watchObservedRunningTime="2025-10-09 14:09:17.014804938 +0000 UTC m=+1104.212663992" Oct 09 14:09:17 crc kubenswrapper[4902]: I1009 14:09:17.063258 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8b7bf8d6-8d86-4b2d-8a31-268090d3e11a-dns-svc\") pod \"8b7bf8d6-8d86-4b2d-8a31-268090d3e11a\" (UID: \"8b7bf8d6-8d86-4b2d-8a31-268090d3e11a\") " Oct 09 14:09:17 crc kubenswrapper[4902]: I1009 14:09:17.063362 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8gd2x\" (UniqueName: \"kubernetes.io/projected/8b7bf8d6-8d86-4b2d-8a31-268090d3e11a-kube-api-access-8gd2x\") pod \"8b7bf8d6-8d86-4b2d-8a31-268090d3e11a\" (UID: \"8b7bf8d6-8d86-4b2d-8a31-268090d3e11a\") " Oct 09 14:09:17 crc kubenswrapper[4902]: I1009 14:09:17.063384 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8b7bf8d6-8d86-4b2d-8a31-268090d3e11a-ovsdbserver-sb\") pod \"8b7bf8d6-8d86-4b2d-8a31-268090d3e11a\" (UID: \"8b7bf8d6-8d86-4b2d-8a31-268090d3e11a\") " Oct 09 14:09:17 crc kubenswrapper[4902]: I1009 14:09:17.063484 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8b7bf8d6-8d86-4b2d-8a31-268090d3e11a-ovsdbserver-nb\") pod \"8b7bf8d6-8d86-4b2d-8a31-268090d3e11a\" (UID: \"8b7bf8d6-8d86-4b2d-8a31-268090d3e11a\") " Oct 09 14:09:17 crc kubenswrapper[4902]: I1009 14:09:17.063504 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8b7bf8d6-8d86-4b2d-8a31-268090d3e11a-dns-swift-storage-0\") pod \"8b7bf8d6-8d86-4b2d-8a31-268090d3e11a\" (UID: \"8b7bf8d6-8d86-4b2d-8a31-268090d3e11a\") " Oct 09 14:09:17 crc kubenswrapper[4902]: I1009 14:09:17.063563 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b7bf8d6-8d86-4b2d-8a31-268090d3e11a-config\") pod \"8b7bf8d6-8d86-4b2d-8a31-268090d3e11a\" (UID: \"8b7bf8d6-8d86-4b2d-8a31-268090d3e11a\") " Oct 09 14:09:17 crc kubenswrapper[4902]: I1009 14:09:17.081791 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b7bf8d6-8d86-4b2d-8a31-268090d3e11a-kube-api-access-8gd2x" (OuterVolumeSpecName: "kube-api-access-8gd2x") pod "8b7bf8d6-8d86-4b2d-8a31-268090d3e11a" (UID: "8b7bf8d6-8d86-4b2d-8a31-268090d3e11a"). InnerVolumeSpecName "kube-api-access-8gd2x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 14:09:17 crc kubenswrapper[4902]: I1009 14:09:17.112060 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8b7bf8d6-8d86-4b2d-8a31-268090d3e11a-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "8b7bf8d6-8d86-4b2d-8a31-268090d3e11a" (UID: "8b7bf8d6-8d86-4b2d-8a31-268090d3e11a"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 14:09:17 crc kubenswrapper[4902]: I1009 14:09:17.124168 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8b7bf8d6-8d86-4b2d-8a31-268090d3e11a-config" (OuterVolumeSpecName: "config") pod "8b7bf8d6-8d86-4b2d-8a31-268090d3e11a" (UID: "8b7bf8d6-8d86-4b2d-8a31-268090d3e11a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 14:09:17 crc kubenswrapper[4902]: I1009 14:09:17.129200 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8b7bf8d6-8d86-4b2d-8a31-268090d3e11a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "8b7bf8d6-8d86-4b2d-8a31-268090d3e11a" (UID: "8b7bf8d6-8d86-4b2d-8a31-268090d3e11a"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 14:09:17 crc kubenswrapper[4902]: I1009 14:09:17.130421 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8b7bf8d6-8d86-4b2d-8a31-268090d3e11a-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "8b7bf8d6-8d86-4b2d-8a31-268090d3e11a" (UID: "8b7bf8d6-8d86-4b2d-8a31-268090d3e11a"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 14:09:17 crc kubenswrapper[4902]: I1009 14:09:17.135832 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8b7bf8d6-8d86-4b2d-8a31-268090d3e11a-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "8b7bf8d6-8d86-4b2d-8a31-268090d3e11a" (UID: "8b7bf8d6-8d86-4b2d-8a31-268090d3e11a"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 14:09:17 crc kubenswrapper[4902]: I1009 14:09:17.165392 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8gd2x\" (UniqueName: \"kubernetes.io/projected/8b7bf8d6-8d86-4b2d-8a31-268090d3e11a-kube-api-access-8gd2x\") on node \"crc\" DevicePath \"\"" Oct 09 14:09:17 crc kubenswrapper[4902]: I1009 14:09:17.165452 4902 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8b7bf8d6-8d86-4b2d-8a31-268090d3e11a-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 09 14:09:17 crc kubenswrapper[4902]: I1009 14:09:17.165464 4902 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8b7bf8d6-8d86-4b2d-8a31-268090d3e11a-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 09 14:09:17 crc kubenswrapper[4902]: I1009 14:09:17.165472 4902 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8b7bf8d6-8d86-4b2d-8a31-268090d3e11a-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 09 14:09:17 crc kubenswrapper[4902]: I1009 14:09:17.165484 4902 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b7bf8d6-8d86-4b2d-8a31-268090d3e11a-config\") on node \"crc\" DevicePath \"\"" Oct 09 14:09:17 crc kubenswrapper[4902]: I1009 14:09:17.165494 4902 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8b7bf8d6-8d86-4b2d-8a31-268090d3e11a-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 09 14:09:17 crc kubenswrapper[4902]: E1009 14:09:17.685517 4902 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8b7bf8d6_8d86_4b2d_8a31_268090d3e11a.slice/crio-5f28514b0980442e544342767198e25677af92740d4a02b45d0f03a1a228fc2f\": RecentStats: unable to find data in memory cache]" Oct 09 14:09:17 crc kubenswrapper[4902]: I1009 14:09:17.987121 4902 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-845d6d6f59-425tz" Oct 09 14:09:17 crc kubenswrapper[4902]: I1009 14:09:17.988699 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a4dab6bb-60d0-4984-91e7-2013f341a39d","Type":"ContainerStarted","Data":"a00c8fe002d7644c7711b005b674b79b7dafd49dd932a9efaa3690c49cf5f3d8"} Oct 09 14:09:17 crc kubenswrapper[4902]: I1009 14:09:17.988754 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 09 14:09:18 crc kubenswrapper[4902]: I1009 14:09:18.019134 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.215579926 podStartE2EDuration="6.01911394s" podCreationTimestamp="2025-10-09 14:09:12 +0000 UTC" firstStartedPulling="2025-10-09 14:09:13.907509356 +0000 UTC m=+1101.105368420" lastFinishedPulling="2025-10-09 14:09:17.71104337 +0000 UTC m=+1104.908902434" observedRunningTime="2025-10-09 14:09:18.009629969 +0000 UTC m=+1105.207489053" watchObservedRunningTime="2025-10-09 14:09:18.01911394 +0000 UTC m=+1105.216973024" Oct 09 14:09:18 crc kubenswrapper[4902]: I1009 14:09:18.033922 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-845d6d6f59-425tz"] Oct 09 14:09:18 crc kubenswrapper[4902]: I1009 14:09:18.042189 4902 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-845d6d6f59-425tz"] Oct 09 14:09:19 crc kubenswrapper[4902]: I1009 14:09:19.525173 4902 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8b7bf8d6-8d86-4b2d-8a31-268090d3e11a" path="/var/lib/kubelet/pods/8b7bf8d6-8d86-4b2d-8a31-268090d3e11a/volumes" Oct 09 14:09:20 crc kubenswrapper[4902]: I1009 14:09:20.078836 4902 patch_prober.go:28] interesting pod/machine-config-daemon-gbt7s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 14:09:20 crc kubenswrapper[4902]: I1009 14:09:20.078908 4902 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" podUID="6cfbac91-e798-4e5e-9f3c-f454ea6f457e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 14:09:21 crc kubenswrapper[4902]: I1009 14:09:21.020039 4902 generic.go:334] "Generic (PLEG): container finished" podID="ec866e85-92a2-4934-89a1-c44f1dcf2f52" containerID="c282e06076c15d994c5781dcec81200243f7a4c9430b5342b27458e69a167e60" exitCode=0 Oct 09 14:09:21 crc kubenswrapper[4902]: I1009 14:09:21.020110 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-gn99q" event={"ID":"ec866e85-92a2-4934-89a1-c44f1dcf2f52","Type":"ContainerDied","Data":"c282e06076c15d994c5781dcec81200243f7a4c9430b5342b27458e69a167e60"} Oct 09 14:09:22 crc kubenswrapper[4902]: I1009 14:09:22.367959 4902 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-gn99q" Oct 09 14:09:22 crc kubenswrapper[4902]: I1009 14:09:22.471081 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec866e85-92a2-4934-89a1-c44f1dcf2f52-combined-ca-bundle\") pod \"ec866e85-92a2-4934-89a1-c44f1dcf2f52\" (UID: \"ec866e85-92a2-4934-89a1-c44f1dcf2f52\") " Oct 09 14:09:22 crc kubenswrapper[4902]: I1009 14:09:22.471367 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec866e85-92a2-4934-89a1-c44f1dcf2f52-config-data\") pod \"ec866e85-92a2-4934-89a1-c44f1dcf2f52\" (UID: \"ec866e85-92a2-4934-89a1-c44f1dcf2f52\") " Oct 09 14:09:22 crc kubenswrapper[4902]: I1009 14:09:22.471540 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ec866e85-92a2-4934-89a1-c44f1dcf2f52-scripts\") pod \"ec866e85-92a2-4934-89a1-c44f1dcf2f52\" (UID: \"ec866e85-92a2-4934-89a1-c44f1dcf2f52\") " Oct 09 14:09:22 crc kubenswrapper[4902]: I1009 14:09:22.471581 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cnp9d\" (UniqueName: \"kubernetes.io/projected/ec866e85-92a2-4934-89a1-c44f1dcf2f52-kube-api-access-cnp9d\") pod \"ec866e85-92a2-4934-89a1-c44f1dcf2f52\" (UID: \"ec866e85-92a2-4934-89a1-c44f1dcf2f52\") " Oct 09 14:09:22 crc kubenswrapper[4902]: I1009 14:09:22.476557 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec866e85-92a2-4934-89a1-c44f1dcf2f52-scripts" (OuterVolumeSpecName: "scripts") pod "ec866e85-92a2-4934-89a1-c44f1dcf2f52" (UID: "ec866e85-92a2-4934-89a1-c44f1dcf2f52"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:09:22 crc kubenswrapper[4902]: I1009 14:09:22.477786 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec866e85-92a2-4934-89a1-c44f1dcf2f52-kube-api-access-cnp9d" (OuterVolumeSpecName: "kube-api-access-cnp9d") pod "ec866e85-92a2-4934-89a1-c44f1dcf2f52" (UID: "ec866e85-92a2-4934-89a1-c44f1dcf2f52"). InnerVolumeSpecName "kube-api-access-cnp9d". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 14:09:22 crc kubenswrapper[4902]: I1009 14:09:22.496579 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec866e85-92a2-4934-89a1-c44f1dcf2f52-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ec866e85-92a2-4934-89a1-c44f1dcf2f52" (UID: "ec866e85-92a2-4934-89a1-c44f1dcf2f52"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:09:22 crc kubenswrapper[4902]: I1009 14:09:22.506620 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec866e85-92a2-4934-89a1-c44f1dcf2f52-config-data" (OuterVolumeSpecName: "config-data") pod "ec866e85-92a2-4934-89a1-c44f1dcf2f52" (UID: "ec866e85-92a2-4934-89a1-c44f1dcf2f52"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:09:22 crc kubenswrapper[4902]: I1009 14:09:22.572979 4902 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec866e85-92a2-4934-89a1-c44f1dcf2f52-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 14:09:22 crc kubenswrapper[4902]: I1009 14:09:22.573174 4902 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ec866e85-92a2-4934-89a1-c44f1dcf2f52-scripts\") on node \"crc\" DevicePath \"\"" Oct 09 14:09:22 crc kubenswrapper[4902]: I1009 14:09:22.573247 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cnp9d\" (UniqueName: \"kubernetes.io/projected/ec866e85-92a2-4934-89a1-c44f1dcf2f52-kube-api-access-cnp9d\") on node \"crc\" DevicePath \"\"" Oct 09 14:09:22 crc kubenswrapper[4902]: I1009 14:09:22.573312 4902 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec866e85-92a2-4934-89a1-c44f1dcf2f52-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 14:09:23 crc kubenswrapper[4902]: I1009 14:09:23.047179 4902 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-gn99q" Oct 09 14:09:23 crc kubenswrapper[4902]: I1009 14:09:23.048955 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-gn99q" event={"ID":"ec866e85-92a2-4934-89a1-c44f1dcf2f52","Type":"ContainerDied","Data":"94e1baaacc133ac0a60015532f08f664eae054a21d8a5dc7e57a1380063d93cc"} Oct 09 14:09:23 crc kubenswrapper[4902]: I1009 14:09:23.049359 4902 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="94e1baaacc133ac0a60015532f08f664eae054a21d8a5dc7e57a1380063d93cc" Oct 09 14:09:23 crc kubenswrapper[4902]: I1009 14:09:23.250082 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 09 14:09:23 crc kubenswrapper[4902]: I1009 14:09:23.250450 4902 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="b67b5cec-9175-478d-b323-039843e49260" containerName="nova-api-api" containerID="cri-o://11f68f07a72016fbc81e6f612b4138ad3a652776ace8319f4d99e729322dcb28" gracePeriod=30 Oct 09 14:09:23 crc kubenswrapper[4902]: I1009 14:09:23.250908 4902 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="b67b5cec-9175-478d-b323-039843e49260" containerName="nova-api-log" containerID="cri-o://080341e98639b734ffa24cdc2b4fcd18922e546994ad4b63f9d29eb0cc037423" gracePeriod=30 Oct 09 14:09:23 crc kubenswrapper[4902]: I1009 14:09:23.264906 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 09 14:09:23 crc kubenswrapper[4902]: I1009 14:09:23.265166 4902 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="d9050747-0e34-4263-b886-00511d489a2f" containerName="nova-scheduler-scheduler" containerID="cri-o://412c4a22357a0ca07053ab5fc3389de18a622ee434f52be6354c10e0beb996fc" gracePeriod=30 Oct 09 14:09:23 crc kubenswrapper[4902]: I1009 14:09:23.278587 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 09 14:09:23 crc kubenswrapper[4902]: I1009 14:09:23.279053 4902 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="d52f0e92-a10a-4111-98fb-f83da483d1f1" 
containerName="nova-metadata-metadata" containerID="cri-o://d92c243b1c15acea6f0efd8f257c08f1bf38b6ed3578786c36d3ae45fafe0b93" gracePeriod=30 Oct 09 14:09:23 crc kubenswrapper[4902]: I1009 14:09:23.279091 4902 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="d52f0e92-a10a-4111-98fb-f83da483d1f1" containerName="nova-metadata-log" containerID="cri-o://aff06d1879c9e009f041ef127836daa5bae2af584f89fe6d1600844eb9b77add" gracePeriod=30 Oct 09 14:09:23 crc kubenswrapper[4902]: I1009 14:09:23.843180 4902 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 09 14:09:23 crc kubenswrapper[4902]: I1009 14:09:23.902006 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b67b5cec-9175-478d-b323-039843e49260-logs\") pod \"b67b5cec-9175-478d-b323-039843e49260\" (UID: \"b67b5cec-9175-478d-b323-039843e49260\") " Oct 09 14:09:23 crc kubenswrapper[4902]: I1009 14:09:23.902057 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b67b5cec-9175-478d-b323-039843e49260-public-tls-certs\") pod \"b67b5cec-9175-478d-b323-039843e49260\" (UID: \"b67b5cec-9175-478d-b323-039843e49260\") " Oct 09 14:09:23 crc kubenswrapper[4902]: I1009 14:09:23.902181 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b67b5cec-9175-478d-b323-039843e49260-internal-tls-certs\") pod \"b67b5cec-9175-478d-b323-039843e49260\" (UID: \"b67b5cec-9175-478d-b323-039843e49260\") " Oct 09 14:09:23 crc kubenswrapper[4902]: I1009 14:09:23.902210 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b67b5cec-9175-478d-b323-039843e49260-config-data\") pod \"b67b5cec-9175-478d-b323-039843e49260\" (UID: \"b67b5cec-9175-478d-b323-039843e49260\") " Oct 09 14:09:23 crc kubenswrapper[4902]: I1009 14:09:23.902368 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b67b5cec-9175-478d-b323-039843e49260-combined-ca-bundle\") pod \"b67b5cec-9175-478d-b323-039843e49260\" (UID: \"b67b5cec-9175-478d-b323-039843e49260\") " Oct 09 14:09:23 crc kubenswrapper[4902]: I1009 14:09:23.902390 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b67b5cec-9175-478d-b323-039843e49260-logs" (OuterVolumeSpecName: "logs") pod "b67b5cec-9175-478d-b323-039843e49260" (UID: "b67b5cec-9175-478d-b323-039843e49260"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 14:09:23 crc kubenswrapper[4902]: I1009 14:09:23.902450 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xdjpq\" (UniqueName: \"kubernetes.io/projected/b67b5cec-9175-478d-b323-039843e49260-kube-api-access-xdjpq\") pod \"b67b5cec-9175-478d-b323-039843e49260\" (UID: \"b67b5cec-9175-478d-b323-039843e49260\") " Oct 09 14:09:23 crc kubenswrapper[4902]: I1009 14:09:23.902968 4902 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b67b5cec-9175-478d-b323-039843e49260-logs\") on node \"crc\" DevicePath \"\"" Oct 09 14:09:23 crc kubenswrapper[4902]: I1009 14:09:23.905902 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b67b5cec-9175-478d-b323-039843e49260-kube-api-access-xdjpq" (OuterVolumeSpecName: "kube-api-access-xdjpq") pod "b67b5cec-9175-478d-b323-039843e49260" (UID: "b67b5cec-9175-478d-b323-039843e49260"). InnerVolumeSpecName "kube-api-access-xdjpq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 14:09:23 crc kubenswrapper[4902]: I1009 14:09:23.940068 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b67b5cec-9175-478d-b323-039843e49260-config-data" (OuterVolumeSpecName: "config-data") pod "b67b5cec-9175-478d-b323-039843e49260" (UID: "b67b5cec-9175-478d-b323-039843e49260"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:09:23 crc kubenswrapper[4902]: I1009 14:09:23.944108 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b67b5cec-9175-478d-b323-039843e49260-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b67b5cec-9175-478d-b323-039843e49260" (UID: "b67b5cec-9175-478d-b323-039843e49260"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:09:23 crc kubenswrapper[4902]: I1009 14:09:23.958159 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b67b5cec-9175-478d-b323-039843e49260-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "b67b5cec-9175-478d-b323-039843e49260" (UID: "b67b5cec-9175-478d-b323-039843e49260"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:09:23 crc kubenswrapper[4902]: I1009 14:09:23.962153 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b67b5cec-9175-478d-b323-039843e49260-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "b67b5cec-9175-478d-b323-039843e49260" (UID: "b67b5cec-9175-478d-b323-039843e49260"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:09:24 crc kubenswrapper[4902]: I1009 14:09:24.004617 4902 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b67b5cec-9175-478d-b323-039843e49260-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 14:09:24 crc kubenswrapper[4902]: I1009 14:09:24.004660 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xdjpq\" (UniqueName: \"kubernetes.io/projected/b67b5cec-9175-478d-b323-039843e49260-kube-api-access-xdjpq\") on node \"crc\" DevicePath \"\"" Oct 09 14:09:24 crc kubenswrapper[4902]: I1009 14:09:24.004675 4902 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b67b5cec-9175-478d-b323-039843e49260-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 09 14:09:24 crc kubenswrapper[4902]: I1009 14:09:24.004686 4902 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b67b5cec-9175-478d-b323-039843e49260-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 09 14:09:24 crc kubenswrapper[4902]: I1009 14:09:24.004698 4902 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b67b5cec-9175-478d-b323-039843e49260-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 14:09:24 crc kubenswrapper[4902]: I1009 14:09:24.058818 4902 generic.go:334] "Generic (PLEG): container finished" podID="b67b5cec-9175-478d-b323-039843e49260" containerID="11f68f07a72016fbc81e6f612b4138ad3a652776ace8319f4d99e729322dcb28" exitCode=0 Oct 09 14:09:24 crc kubenswrapper[4902]: I1009 14:09:24.058855 4902 generic.go:334] "Generic (PLEG): container finished" podID="b67b5cec-9175-478d-b323-039843e49260" containerID="080341e98639b734ffa24cdc2b4fcd18922e546994ad4b63f9d29eb0cc037423" exitCode=143 Oct 09 14:09:24 crc kubenswrapper[4902]: I1009 14:09:24.058889 4902 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 09 14:09:24 crc kubenswrapper[4902]: I1009 14:09:24.058898 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b67b5cec-9175-478d-b323-039843e49260","Type":"ContainerDied","Data":"11f68f07a72016fbc81e6f612b4138ad3a652776ace8319f4d99e729322dcb28"} Oct 09 14:09:24 crc kubenswrapper[4902]: I1009 14:09:24.058944 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b67b5cec-9175-478d-b323-039843e49260","Type":"ContainerDied","Data":"080341e98639b734ffa24cdc2b4fcd18922e546994ad4b63f9d29eb0cc037423"} Oct 09 14:09:24 crc kubenswrapper[4902]: I1009 14:09:24.058954 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b67b5cec-9175-478d-b323-039843e49260","Type":"ContainerDied","Data":"5ba92f00769b781d56e26eb59f50d1081a0d4077ad6fc862437348bcb904de05"} Oct 09 14:09:24 crc kubenswrapper[4902]: I1009 14:09:24.058970 4902 scope.go:117] "RemoveContainer" containerID="11f68f07a72016fbc81e6f612b4138ad3a652776ace8319f4d99e729322dcb28" Oct 09 14:09:24 crc kubenswrapper[4902]: I1009 14:09:24.060355 4902 generic.go:334] "Generic (PLEG): container finished" podID="d9050747-0e34-4263-b886-00511d489a2f" containerID="412c4a22357a0ca07053ab5fc3389de18a622ee434f52be6354c10e0beb996fc" exitCode=0 Oct 09 14:09:24 crc kubenswrapper[4902]: I1009 14:09:24.060452 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"d9050747-0e34-4263-b886-00511d489a2f","Type":"ContainerDied","Data":"412c4a22357a0ca07053ab5fc3389de18a622ee434f52be6354c10e0beb996fc"} Oct 09 14:09:24 crc kubenswrapper[4902]: I1009 14:09:24.070008 4902 generic.go:334] "Generic (PLEG): container finished" podID="d52f0e92-a10a-4111-98fb-f83da483d1f1" containerID="aff06d1879c9e009f041ef127836daa5bae2af584f89fe6d1600844eb9b77add" exitCode=143 Oct 09 14:09:24 crc kubenswrapper[4902]: I1009 14:09:24.070197 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d52f0e92-a10a-4111-98fb-f83da483d1f1","Type":"ContainerDied","Data":"aff06d1879c9e009f041ef127836daa5bae2af584f89fe6d1600844eb9b77add"} Oct 09 14:09:24 crc kubenswrapper[4902]: I1009 14:09:24.115503 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 09 14:09:24 crc kubenswrapper[4902]: I1009 14:09:24.126821 4902 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 09 14:09:24 crc kubenswrapper[4902]: I1009 14:09:24.132240 4902 scope.go:117] "RemoveContainer" containerID="080341e98639b734ffa24cdc2b4fcd18922e546994ad4b63f9d29eb0cc037423" Oct 09 14:09:24 crc kubenswrapper[4902]: I1009 14:09:24.135221 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 09 14:09:24 crc kubenswrapper[4902]: E1009 14:09:24.136222 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b7bf8d6-8d86-4b2d-8a31-268090d3e11a" containerName="dnsmasq-dns" Oct 09 14:09:24 crc kubenswrapper[4902]: I1009 14:09:24.136246 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b7bf8d6-8d86-4b2d-8a31-268090d3e11a" containerName="dnsmasq-dns" Oct 09 14:09:24 crc kubenswrapper[4902]: E1009 14:09:24.136271 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b67b5cec-9175-478d-b323-039843e49260" containerName="nova-api-api" Oct 09 14:09:24 crc kubenswrapper[4902]: I1009 14:09:24.136278 4902 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="b67b5cec-9175-478d-b323-039843e49260" containerName="nova-api-api" Oct 09 14:09:24 crc kubenswrapper[4902]: E1009 14:09:24.136293 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec866e85-92a2-4934-89a1-c44f1dcf2f52" containerName="nova-manage" Oct 09 14:09:24 crc kubenswrapper[4902]: I1009 14:09:24.136300 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec866e85-92a2-4934-89a1-c44f1dcf2f52" containerName="nova-manage" Oct 09 14:09:24 crc kubenswrapper[4902]: E1009 14:09:24.136313 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b67b5cec-9175-478d-b323-039843e49260" containerName="nova-api-log" Oct 09 14:09:24 crc kubenswrapper[4902]: I1009 14:09:24.136319 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="b67b5cec-9175-478d-b323-039843e49260" containerName="nova-api-log" Oct 09 14:09:24 crc kubenswrapper[4902]: E1009 14:09:24.136339 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b7bf8d6-8d86-4b2d-8a31-268090d3e11a" containerName="init" Oct 09 14:09:24 crc kubenswrapper[4902]: I1009 14:09:24.136345 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b7bf8d6-8d86-4b2d-8a31-268090d3e11a" containerName="init" Oct 09 14:09:24 crc kubenswrapper[4902]: I1009 14:09:24.136571 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec866e85-92a2-4934-89a1-c44f1dcf2f52" containerName="nova-manage" Oct 09 14:09:24 crc kubenswrapper[4902]: I1009 14:09:24.136586 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b7bf8d6-8d86-4b2d-8a31-268090d3e11a" containerName="dnsmasq-dns" Oct 09 14:09:24 crc kubenswrapper[4902]: I1009 14:09:24.136601 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="b67b5cec-9175-478d-b323-039843e49260" containerName="nova-api-api" Oct 09 14:09:24 crc kubenswrapper[4902]: I1009 14:09:24.136616 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="b67b5cec-9175-478d-b323-039843e49260" containerName="nova-api-log" Oct 09 14:09:24 crc kubenswrapper[4902]: I1009 14:09:24.137708 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 09 14:09:24 crc kubenswrapper[4902]: I1009 14:09:24.139953 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 09 14:09:24 crc kubenswrapper[4902]: I1009 14:09:24.140368 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Oct 09 14:09:24 crc kubenswrapper[4902]: I1009 14:09:24.140706 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Oct 09 14:09:24 crc kubenswrapper[4902]: I1009 14:09:24.145097 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 09 14:09:24 crc kubenswrapper[4902]: I1009 14:09:24.162707 4902 scope.go:117] "RemoveContainer" containerID="11f68f07a72016fbc81e6f612b4138ad3a652776ace8319f4d99e729322dcb28" Oct 09 14:09:24 crc kubenswrapper[4902]: E1009 14:09:24.163361 4902 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"11f68f07a72016fbc81e6f612b4138ad3a652776ace8319f4d99e729322dcb28\": container with ID starting with 11f68f07a72016fbc81e6f612b4138ad3a652776ace8319f4d99e729322dcb28 not found: ID does not exist" containerID="11f68f07a72016fbc81e6f612b4138ad3a652776ace8319f4d99e729322dcb28" Oct 09 14:09:24 crc kubenswrapper[4902]: I1009 14:09:24.163398 4902 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11f68f07a72016fbc81e6f612b4138ad3a652776ace8319f4d99e729322dcb28"} err="failed to get container status \"11f68f07a72016fbc81e6f612b4138ad3a652776ace8319f4d99e729322dcb28\": rpc error: code = NotFound desc = could not find container \"11f68f07a72016fbc81e6f612b4138ad3a652776ace8319f4d99e729322dcb28\": container with ID starting with 11f68f07a72016fbc81e6f612b4138ad3a652776ace8319f4d99e729322dcb28 not found: ID does not exist" Oct 09 14:09:24 crc kubenswrapper[4902]: I1009 14:09:24.163439 4902 scope.go:117] "RemoveContainer" containerID="080341e98639b734ffa24cdc2b4fcd18922e546994ad4b63f9d29eb0cc037423" Oct 09 14:09:24 crc kubenswrapper[4902]: E1009 14:09:24.163659 4902 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"080341e98639b734ffa24cdc2b4fcd18922e546994ad4b63f9d29eb0cc037423\": container with ID starting with 080341e98639b734ffa24cdc2b4fcd18922e546994ad4b63f9d29eb0cc037423 not found: ID does not exist" containerID="080341e98639b734ffa24cdc2b4fcd18922e546994ad4b63f9d29eb0cc037423" Oct 09 14:09:24 crc kubenswrapper[4902]: I1009 14:09:24.163704 4902 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"080341e98639b734ffa24cdc2b4fcd18922e546994ad4b63f9d29eb0cc037423"} err="failed to get container status \"080341e98639b734ffa24cdc2b4fcd18922e546994ad4b63f9d29eb0cc037423\": rpc error: code = NotFound desc = could not find container \"080341e98639b734ffa24cdc2b4fcd18922e546994ad4b63f9d29eb0cc037423\": container with ID starting with 080341e98639b734ffa24cdc2b4fcd18922e546994ad4b63f9d29eb0cc037423 not found: ID does not exist" Oct 09 14:09:24 crc kubenswrapper[4902]: I1009 14:09:24.163728 4902 scope.go:117] "RemoveContainer" containerID="11f68f07a72016fbc81e6f612b4138ad3a652776ace8319f4d99e729322dcb28" Oct 09 14:09:24 crc kubenswrapper[4902]: I1009 14:09:24.164128 4902 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"11f68f07a72016fbc81e6f612b4138ad3a652776ace8319f4d99e729322dcb28"} err="failed to get container status \"11f68f07a72016fbc81e6f612b4138ad3a652776ace8319f4d99e729322dcb28\": rpc error: code = NotFound desc = could not find container \"11f68f07a72016fbc81e6f612b4138ad3a652776ace8319f4d99e729322dcb28\": container with ID starting with 11f68f07a72016fbc81e6f612b4138ad3a652776ace8319f4d99e729322dcb28 not found: ID does not exist" Oct 09 14:09:24 crc kubenswrapper[4902]: I1009 14:09:24.164152 4902 scope.go:117] "RemoveContainer" containerID="080341e98639b734ffa24cdc2b4fcd18922e546994ad4b63f9d29eb0cc037423" Oct 09 14:09:24 crc kubenswrapper[4902]: I1009 14:09:24.164467 4902 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"080341e98639b734ffa24cdc2b4fcd18922e546994ad4b63f9d29eb0cc037423"} err="failed to get container status \"080341e98639b734ffa24cdc2b4fcd18922e546994ad4b63f9d29eb0cc037423\": rpc error: code = NotFound desc = could not find container \"080341e98639b734ffa24cdc2b4fcd18922e546994ad4b63f9d29eb0cc037423\": container with ID starting with 080341e98639b734ffa24cdc2b4fcd18922e546994ad4b63f9d29eb0cc037423 not found: ID does not exist" Oct 09 14:09:24 crc kubenswrapper[4902]: I1009 14:09:24.215671 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7a597676-c413-4919-a79b-ac49dd2671c2-internal-tls-certs\") pod \"nova-api-0\" (UID: \"7a597676-c413-4919-a79b-ac49dd2671c2\") " pod="openstack/nova-api-0" Oct 09 14:09:24 crc kubenswrapper[4902]: I1009 14:09:24.215740 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7a597676-c413-4919-a79b-ac49dd2671c2-public-tls-certs\") pod \"nova-api-0\" (UID: \"7a597676-c413-4919-a79b-ac49dd2671c2\") " pod="openstack/nova-api-0" Oct 09 14:09:24 crc kubenswrapper[4902]: I1009 14:09:24.215785 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7a597676-c413-4919-a79b-ac49dd2671c2-logs\") pod \"nova-api-0\" (UID: \"7a597676-c413-4919-a79b-ac49dd2671c2\") " pod="openstack/nova-api-0" Oct 09 14:09:24 crc kubenswrapper[4902]: I1009 14:09:24.215942 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hsp4k\" (UniqueName: \"kubernetes.io/projected/7a597676-c413-4919-a79b-ac49dd2671c2-kube-api-access-hsp4k\") pod \"nova-api-0\" (UID: \"7a597676-c413-4919-a79b-ac49dd2671c2\") " pod="openstack/nova-api-0" Oct 09 14:09:24 crc kubenswrapper[4902]: I1009 14:09:24.215975 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a597676-c413-4919-a79b-ac49dd2671c2-config-data\") pod \"nova-api-0\" (UID: \"7a597676-c413-4919-a79b-ac49dd2671c2\") " pod="openstack/nova-api-0" Oct 09 14:09:24 crc kubenswrapper[4902]: I1009 14:09:24.216018 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a597676-c413-4919-a79b-ac49dd2671c2-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"7a597676-c413-4919-a79b-ac49dd2671c2\") " pod="openstack/nova-api-0" Oct 09 14:09:24 crc kubenswrapper[4902]: I1009 14:09:24.306967 4902 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 09 14:09:24 crc kubenswrapper[4902]: I1009 14:09:24.318380 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hsp4k\" (UniqueName: \"kubernetes.io/projected/7a597676-c413-4919-a79b-ac49dd2671c2-kube-api-access-hsp4k\") pod \"nova-api-0\" (UID: \"7a597676-c413-4919-a79b-ac49dd2671c2\") " pod="openstack/nova-api-0" Oct 09 14:09:24 crc kubenswrapper[4902]: I1009 14:09:24.318462 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a597676-c413-4919-a79b-ac49dd2671c2-config-data\") pod \"nova-api-0\" (UID: \"7a597676-c413-4919-a79b-ac49dd2671c2\") " pod="openstack/nova-api-0" Oct 09 14:09:24 crc kubenswrapper[4902]: I1009 14:09:24.318515 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a597676-c413-4919-a79b-ac49dd2671c2-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"7a597676-c413-4919-a79b-ac49dd2671c2\") " pod="openstack/nova-api-0" Oct 09 14:09:24 crc kubenswrapper[4902]: I1009 14:09:24.318599 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7a597676-c413-4919-a79b-ac49dd2671c2-internal-tls-certs\") pod \"nova-api-0\" (UID: \"7a597676-c413-4919-a79b-ac49dd2671c2\") " pod="openstack/nova-api-0" Oct 09 14:09:24 crc kubenswrapper[4902]: I1009 14:09:24.318614 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7a597676-c413-4919-a79b-ac49dd2671c2-public-tls-certs\") pod \"nova-api-0\" (UID: \"7a597676-c413-4919-a79b-ac49dd2671c2\") " pod="openstack/nova-api-0" Oct 09 14:09:24 crc kubenswrapper[4902]: I1009 14:09:24.318678 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7a597676-c413-4919-a79b-ac49dd2671c2-logs\") pod \"nova-api-0\" (UID: \"7a597676-c413-4919-a79b-ac49dd2671c2\") " pod="openstack/nova-api-0" Oct 09 14:09:24 crc kubenswrapper[4902]: I1009 14:09:24.319595 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7a597676-c413-4919-a79b-ac49dd2671c2-logs\") pod \"nova-api-0\" (UID: \"7a597676-c413-4919-a79b-ac49dd2671c2\") " pod="openstack/nova-api-0" Oct 09 14:09:24 crc kubenswrapper[4902]: I1009 14:09:24.326669 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a597676-c413-4919-a79b-ac49dd2671c2-config-data\") pod \"nova-api-0\" (UID: \"7a597676-c413-4919-a79b-ac49dd2671c2\") " pod="openstack/nova-api-0" Oct 09 14:09:24 crc kubenswrapper[4902]: I1009 14:09:24.327163 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7a597676-c413-4919-a79b-ac49dd2671c2-public-tls-certs\") pod \"nova-api-0\" (UID: \"7a597676-c413-4919-a79b-ac49dd2671c2\") " pod="openstack/nova-api-0" Oct 09 14:09:24 crc kubenswrapper[4902]: I1009 14:09:24.327388 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7a597676-c413-4919-a79b-ac49dd2671c2-internal-tls-certs\") pod \"nova-api-0\" (UID: \"7a597676-c413-4919-a79b-ac49dd2671c2\") " 
pod="openstack/nova-api-0" Oct 09 14:09:24 crc kubenswrapper[4902]: I1009 14:09:24.333188 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a597676-c413-4919-a79b-ac49dd2671c2-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"7a597676-c413-4919-a79b-ac49dd2671c2\") " pod="openstack/nova-api-0" Oct 09 14:09:24 crc kubenswrapper[4902]: I1009 14:09:24.344348 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hsp4k\" (UniqueName: \"kubernetes.io/projected/7a597676-c413-4919-a79b-ac49dd2671c2-kube-api-access-hsp4k\") pod \"nova-api-0\" (UID: \"7a597676-c413-4919-a79b-ac49dd2671c2\") " pod="openstack/nova-api-0" Oct 09 14:09:24 crc kubenswrapper[4902]: I1009 14:09:24.419905 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d8lss\" (UniqueName: \"kubernetes.io/projected/d9050747-0e34-4263-b886-00511d489a2f-kube-api-access-d8lss\") pod \"d9050747-0e34-4263-b886-00511d489a2f\" (UID: \"d9050747-0e34-4263-b886-00511d489a2f\") " Oct 09 14:09:24 crc kubenswrapper[4902]: I1009 14:09:24.420120 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9050747-0e34-4263-b886-00511d489a2f-combined-ca-bundle\") pod \"d9050747-0e34-4263-b886-00511d489a2f\" (UID: \"d9050747-0e34-4263-b886-00511d489a2f\") " Oct 09 14:09:24 crc kubenswrapper[4902]: I1009 14:09:24.420215 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9050747-0e34-4263-b886-00511d489a2f-config-data\") pod \"d9050747-0e34-4263-b886-00511d489a2f\" (UID: \"d9050747-0e34-4263-b886-00511d489a2f\") " Oct 09 14:09:24 crc kubenswrapper[4902]: I1009 14:09:24.423536 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d9050747-0e34-4263-b886-00511d489a2f-kube-api-access-d8lss" (OuterVolumeSpecName: "kube-api-access-d8lss") pod "d9050747-0e34-4263-b886-00511d489a2f" (UID: "d9050747-0e34-4263-b886-00511d489a2f"). InnerVolumeSpecName "kube-api-access-d8lss". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 14:09:24 crc kubenswrapper[4902]: I1009 14:09:24.445922 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9050747-0e34-4263-b886-00511d489a2f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d9050747-0e34-4263-b886-00511d489a2f" (UID: "d9050747-0e34-4263-b886-00511d489a2f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:09:24 crc kubenswrapper[4902]: I1009 14:09:24.449388 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9050747-0e34-4263-b886-00511d489a2f-config-data" (OuterVolumeSpecName: "config-data") pod "d9050747-0e34-4263-b886-00511d489a2f" (UID: "d9050747-0e34-4263-b886-00511d489a2f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:09:24 crc kubenswrapper[4902]: I1009 14:09:24.464012 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 09 14:09:24 crc kubenswrapper[4902]: I1009 14:09:24.522733 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d8lss\" (UniqueName: \"kubernetes.io/projected/d9050747-0e34-4263-b886-00511d489a2f-kube-api-access-d8lss\") on node \"crc\" DevicePath \"\"" Oct 09 14:09:24 crc kubenswrapper[4902]: I1009 14:09:24.522765 4902 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9050747-0e34-4263-b886-00511d489a2f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 14:09:24 crc kubenswrapper[4902]: I1009 14:09:24.522775 4902 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9050747-0e34-4263-b886-00511d489a2f-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 14:09:24 crc kubenswrapper[4902]: I1009 14:09:24.910541 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 09 14:09:25 crc kubenswrapper[4902]: I1009 14:09:25.084741 4902 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 09 14:09:25 crc kubenswrapper[4902]: I1009 14:09:25.084740 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"d9050747-0e34-4263-b886-00511d489a2f","Type":"ContainerDied","Data":"406b5695518556020343fa3a94e2b90eeaf54136fd2d4960dcd16ba18773a05e"} Oct 09 14:09:25 crc kubenswrapper[4902]: I1009 14:09:25.085169 4902 scope.go:117] "RemoveContainer" containerID="412c4a22357a0ca07053ab5fc3389de18a622ee434f52be6354c10e0beb996fc" Oct 09 14:09:25 crc kubenswrapper[4902]: I1009 14:09:25.087246 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7a597676-c413-4919-a79b-ac49dd2671c2","Type":"ContainerStarted","Data":"08af43073278bfd21e13aee387df545d5b51238c3a0448d7e2d64e211bb72f97"} Oct 09 14:09:25 crc kubenswrapper[4902]: I1009 14:09:25.135558 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 09 14:09:25 crc kubenswrapper[4902]: I1009 14:09:25.155399 4902 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Oct 09 14:09:25 crc kubenswrapper[4902]: I1009 14:09:25.164827 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 09 14:09:25 crc kubenswrapper[4902]: E1009 14:09:25.165489 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9050747-0e34-4263-b886-00511d489a2f" containerName="nova-scheduler-scheduler" Oct 09 14:09:25 crc kubenswrapper[4902]: I1009 14:09:25.165508 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9050747-0e34-4263-b886-00511d489a2f" containerName="nova-scheduler-scheduler" Oct 09 14:09:25 crc kubenswrapper[4902]: I1009 14:09:25.165728 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9050747-0e34-4263-b886-00511d489a2f" containerName="nova-scheduler-scheduler" Oct 09 14:09:25 crc kubenswrapper[4902]: I1009 14:09:25.166596 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 09 14:09:25 crc kubenswrapper[4902]: I1009 14:09:25.169631 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 09 14:09:25 crc kubenswrapper[4902]: I1009 14:09:25.172551 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 09 14:09:25 crc kubenswrapper[4902]: I1009 14:09:25.237814 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1206eaa-959a-4a3d-8c35-a60fc09bb3d5-config-data\") pod \"nova-scheduler-0\" (UID: \"b1206eaa-959a-4a3d-8c35-a60fc09bb3d5\") " pod="openstack/nova-scheduler-0" Oct 09 14:09:25 crc kubenswrapper[4902]: I1009 14:09:25.238721 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1206eaa-959a-4a3d-8c35-a60fc09bb3d5-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"b1206eaa-959a-4a3d-8c35-a60fc09bb3d5\") " pod="openstack/nova-scheduler-0" Oct 09 14:09:25 crc kubenswrapper[4902]: I1009 14:09:25.238778 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7bzqw\" (UniqueName: \"kubernetes.io/projected/b1206eaa-959a-4a3d-8c35-a60fc09bb3d5-kube-api-access-7bzqw\") pod \"nova-scheduler-0\" (UID: \"b1206eaa-959a-4a3d-8c35-a60fc09bb3d5\") " pod="openstack/nova-scheduler-0" Oct 09 14:09:25 crc kubenswrapper[4902]: I1009 14:09:25.341369 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1206eaa-959a-4a3d-8c35-a60fc09bb3d5-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"b1206eaa-959a-4a3d-8c35-a60fc09bb3d5\") " pod="openstack/nova-scheduler-0" Oct 09 14:09:25 crc kubenswrapper[4902]: I1009 14:09:25.341472 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7bzqw\" (UniqueName: \"kubernetes.io/projected/b1206eaa-959a-4a3d-8c35-a60fc09bb3d5-kube-api-access-7bzqw\") pod \"nova-scheduler-0\" (UID: \"b1206eaa-959a-4a3d-8c35-a60fc09bb3d5\") " pod="openstack/nova-scheduler-0" Oct 09 14:09:25 crc kubenswrapper[4902]: I1009 14:09:25.341517 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1206eaa-959a-4a3d-8c35-a60fc09bb3d5-config-data\") pod \"nova-scheduler-0\" (UID: \"b1206eaa-959a-4a3d-8c35-a60fc09bb3d5\") " pod="openstack/nova-scheduler-0" Oct 09 14:09:25 crc kubenswrapper[4902]: I1009 14:09:25.345245 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1206eaa-959a-4a3d-8c35-a60fc09bb3d5-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"b1206eaa-959a-4a3d-8c35-a60fc09bb3d5\") " pod="openstack/nova-scheduler-0" Oct 09 14:09:25 crc kubenswrapper[4902]: I1009 14:09:25.346081 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1206eaa-959a-4a3d-8c35-a60fc09bb3d5-config-data\") pod \"nova-scheduler-0\" (UID: \"b1206eaa-959a-4a3d-8c35-a60fc09bb3d5\") " pod="openstack/nova-scheduler-0" Oct 09 14:09:25 crc kubenswrapper[4902]: I1009 14:09:25.360800 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7bzqw\" (UniqueName: 
\"kubernetes.io/projected/b1206eaa-959a-4a3d-8c35-a60fc09bb3d5-kube-api-access-7bzqw\") pod \"nova-scheduler-0\" (UID: \"b1206eaa-959a-4a3d-8c35-a60fc09bb3d5\") " pod="openstack/nova-scheduler-0" Oct 09 14:09:25 crc kubenswrapper[4902]: I1009 14:09:25.492300 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 09 14:09:25 crc kubenswrapper[4902]: I1009 14:09:25.529962 4902 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b67b5cec-9175-478d-b323-039843e49260" path="/var/lib/kubelet/pods/b67b5cec-9175-478d-b323-039843e49260/volumes" Oct 09 14:09:25 crc kubenswrapper[4902]: I1009 14:09:25.530959 4902 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d9050747-0e34-4263-b886-00511d489a2f" path="/var/lib/kubelet/pods/d9050747-0e34-4263-b886-00511d489a2f/volumes" Oct 09 14:09:25 crc kubenswrapper[4902]: I1009 14:09:25.950916 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 09 14:09:25 crc kubenswrapper[4902]: W1009 14:09:25.957481 4902 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb1206eaa_959a_4a3d_8c35_a60fc09bb3d5.slice/crio-80adf242aa18aff38e7c2b224234e8366fb16065150cd017c277174285f8eecf WatchSource:0}: Error finding container 80adf242aa18aff38e7c2b224234e8366fb16065150cd017c277174285f8eecf: Status 404 returned error can't find the container with id 80adf242aa18aff38e7c2b224234e8366fb16065150cd017c277174285f8eecf Oct 09 14:09:26 crc kubenswrapper[4902]: I1009 14:09:26.098740 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"b1206eaa-959a-4a3d-8c35-a60fc09bb3d5","Type":"ContainerStarted","Data":"80adf242aa18aff38e7c2b224234e8366fb16065150cd017c277174285f8eecf"} Oct 09 14:09:26 crc kubenswrapper[4902]: I1009 14:09:26.102208 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7a597676-c413-4919-a79b-ac49dd2671c2","Type":"ContainerStarted","Data":"027c59fcded35ae84243ba926d3293416d40f69c1beb1e61df69d7a6fcda1da4"} Oct 09 14:09:26 crc kubenswrapper[4902]: I1009 14:09:26.102248 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7a597676-c413-4919-a79b-ac49dd2671c2","Type":"ContainerStarted","Data":"60a71ac0afb8d7a63a907e84a3bc8583e673b44be2415c913942c9236c01c53b"} Oct 09 14:09:26 crc kubenswrapper[4902]: I1009 14:09:26.130801 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.130776871 podStartE2EDuration="2.130776871s" podCreationTimestamp="2025-10-09 14:09:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 14:09:26.120869047 +0000 UTC m=+1113.318728121" watchObservedRunningTime="2025-10-09 14:09:26.130776871 +0000 UTC m=+1113.328635935" Oct 09 14:09:26 crc kubenswrapper[4902]: I1009 14:09:26.424841 4902 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="d52f0e92-a10a-4111-98fb-f83da483d1f1" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.196:8775/\": read tcp 10.217.0.2:58742->10.217.0.196:8775: read: connection reset by peer" Oct 09 14:09:26 crc kubenswrapper[4902]: I1009 14:09:26.424938 4902 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" 
podUID="d52f0e92-a10a-4111-98fb-f83da483d1f1" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.196:8775/\": read tcp 10.217.0.2:58740->10.217.0.196:8775: read: connection reset by peer" Oct 09 14:09:26 crc kubenswrapper[4902]: I1009 14:09:26.887670 4902 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 09 14:09:26 crc kubenswrapper[4902]: I1009 14:09:26.976962 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/d52f0e92-a10a-4111-98fb-f83da483d1f1-nova-metadata-tls-certs\") pod \"d52f0e92-a10a-4111-98fb-f83da483d1f1\" (UID: \"d52f0e92-a10a-4111-98fb-f83da483d1f1\") " Oct 09 14:09:26 crc kubenswrapper[4902]: I1009 14:09:26.977093 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d52f0e92-a10a-4111-98fb-f83da483d1f1-combined-ca-bundle\") pod \"d52f0e92-a10a-4111-98fb-f83da483d1f1\" (UID: \"d52f0e92-a10a-4111-98fb-f83da483d1f1\") " Oct 09 14:09:26 crc kubenswrapper[4902]: I1009 14:09:26.977209 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d52f0e92-a10a-4111-98fb-f83da483d1f1-config-data\") pod \"d52f0e92-a10a-4111-98fb-f83da483d1f1\" (UID: \"d52f0e92-a10a-4111-98fb-f83da483d1f1\") " Oct 09 14:09:26 crc kubenswrapper[4902]: I1009 14:09:26.977870 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d52f0e92-a10a-4111-98fb-f83da483d1f1-logs\") pod \"d52f0e92-a10a-4111-98fb-f83da483d1f1\" (UID: \"d52f0e92-a10a-4111-98fb-f83da483d1f1\") " Oct 09 14:09:26 crc kubenswrapper[4902]: I1009 14:09:26.977930 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2vxxz\" (UniqueName: \"kubernetes.io/projected/d52f0e92-a10a-4111-98fb-f83da483d1f1-kube-api-access-2vxxz\") pod \"d52f0e92-a10a-4111-98fb-f83da483d1f1\" (UID: \"d52f0e92-a10a-4111-98fb-f83da483d1f1\") " Oct 09 14:09:26 crc kubenswrapper[4902]: I1009 14:09:26.979297 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d52f0e92-a10a-4111-98fb-f83da483d1f1-logs" (OuterVolumeSpecName: "logs") pod "d52f0e92-a10a-4111-98fb-f83da483d1f1" (UID: "d52f0e92-a10a-4111-98fb-f83da483d1f1"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 14:09:26 crc kubenswrapper[4902]: I1009 14:09:26.989581 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d52f0e92-a10a-4111-98fb-f83da483d1f1-kube-api-access-2vxxz" (OuterVolumeSpecName: "kube-api-access-2vxxz") pod "d52f0e92-a10a-4111-98fb-f83da483d1f1" (UID: "d52f0e92-a10a-4111-98fb-f83da483d1f1"). InnerVolumeSpecName "kube-api-access-2vxxz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 14:09:27 crc kubenswrapper[4902]: I1009 14:09:27.010205 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d52f0e92-a10a-4111-98fb-f83da483d1f1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d52f0e92-a10a-4111-98fb-f83da483d1f1" (UID: "d52f0e92-a10a-4111-98fb-f83da483d1f1"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:09:27 crc kubenswrapper[4902]: I1009 14:09:27.011808 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d52f0e92-a10a-4111-98fb-f83da483d1f1-config-data" (OuterVolumeSpecName: "config-data") pod "d52f0e92-a10a-4111-98fb-f83da483d1f1" (UID: "d52f0e92-a10a-4111-98fb-f83da483d1f1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:09:27 crc kubenswrapper[4902]: I1009 14:09:27.038251 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d52f0e92-a10a-4111-98fb-f83da483d1f1-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "d52f0e92-a10a-4111-98fb-f83da483d1f1" (UID: "d52f0e92-a10a-4111-98fb-f83da483d1f1"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:09:27 crc kubenswrapper[4902]: I1009 14:09:27.080014 4902 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/d52f0e92-a10a-4111-98fb-f83da483d1f1-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 09 14:09:27 crc kubenswrapper[4902]: I1009 14:09:27.080065 4902 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d52f0e92-a10a-4111-98fb-f83da483d1f1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 14:09:27 crc kubenswrapper[4902]: I1009 14:09:27.080075 4902 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d52f0e92-a10a-4111-98fb-f83da483d1f1-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 14:09:27 crc kubenswrapper[4902]: I1009 14:09:27.080086 4902 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d52f0e92-a10a-4111-98fb-f83da483d1f1-logs\") on node \"crc\" DevicePath \"\"" Oct 09 14:09:27 crc kubenswrapper[4902]: I1009 14:09:27.080095 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2vxxz\" (UniqueName: \"kubernetes.io/projected/d52f0e92-a10a-4111-98fb-f83da483d1f1-kube-api-access-2vxxz\") on node \"crc\" DevicePath \"\"" Oct 09 14:09:27 crc kubenswrapper[4902]: I1009 14:09:27.114089 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"b1206eaa-959a-4a3d-8c35-a60fc09bb3d5","Type":"ContainerStarted","Data":"eedd696ab9e39f5ed365879c96df16fb3d826bf30bedf9f67bed41cfdd7049e8"} Oct 09 14:09:27 crc kubenswrapper[4902]: I1009 14:09:27.117321 4902 generic.go:334] "Generic (PLEG): container finished" podID="d52f0e92-a10a-4111-98fb-f83da483d1f1" containerID="d92c243b1c15acea6f0efd8f257c08f1bf38b6ed3578786c36d3ae45fafe0b93" exitCode=0 Oct 09 14:09:27 crc kubenswrapper[4902]: I1009 14:09:27.117443 4902 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 09 14:09:27 crc kubenswrapper[4902]: I1009 14:09:27.117455 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d52f0e92-a10a-4111-98fb-f83da483d1f1","Type":"ContainerDied","Data":"d92c243b1c15acea6f0efd8f257c08f1bf38b6ed3578786c36d3ae45fafe0b93"} Oct 09 14:09:27 crc kubenswrapper[4902]: I1009 14:09:27.117803 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d52f0e92-a10a-4111-98fb-f83da483d1f1","Type":"ContainerDied","Data":"d5e6934f25710c9526192e44e4d1293f9f3a281032e46db8d17a7818815e35b9"} Oct 09 14:09:27 crc kubenswrapper[4902]: I1009 14:09:27.117839 4902 scope.go:117] "RemoveContainer" containerID="d92c243b1c15acea6f0efd8f257c08f1bf38b6ed3578786c36d3ae45fafe0b93" Oct 09 14:09:27 crc kubenswrapper[4902]: I1009 14:09:27.131561 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.131544008 podStartE2EDuration="2.131544008s" podCreationTimestamp="2025-10-09 14:09:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 14:09:27.130918659 +0000 UTC m=+1114.328777733" watchObservedRunningTime="2025-10-09 14:09:27.131544008 +0000 UTC m=+1114.329403062" Oct 09 14:09:27 crc kubenswrapper[4902]: I1009 14:09:27.143581 4902 scope.go:117] "RemoveContainer" containerID="aff06d1879c9e009f041ef127836daa5bae2af584f89fe6d1600844eb9b77add" Oct 09 14:09:27 crc kubenswrapper[4902]: I1009 14:09:27.191532 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 09 14:09:27 crc kubenswrapper[4902]: I1009 14:09:27.195749 4902 scope.go:117] "RemoveContainer" containerID="d92c243b1c15acea6f0efd8f257c08f1bf38b6ed3578786c36d3ae45fafe0b93" Oct 09 14:09:27 crc kubenswrapper[4902]: E1009 14:09:27.196101 4902 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d92c243b1c15acea6f0efd8f257c08f1bf38b6ed3578786c36d3ae45fafe0b93\": container with ID starting with d92c243b1c15acea6f0efd8f257c08f1bf38b6ed3578786c36d3ae45fafe0b93 not found: ID does not exist" containerID="d92c243b1c15acea6f0efd8f257c08f1bf38b6ed3578786c36d3ae45fafe0b93" Oct 09 14:09:27 crc kubenswrapper[4902]: I1009 14:09:27.196145 4902 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d92c243b1c15acea6f0efd8f257c08f1bf38b6ed3578786c36d3ae45fafe0b93"} err="failed to get container status \"d92c243b1c15acea6f0efd8f257c08f1bf38b6ed3578786c36d3ae45fafe0b93\": rpc error: code = NotFound desc = could not find container \"d92c243b1c15acea6f0efd8f257c08f1bf38b6ed3578786c36d3ae45fafe0b93\": container with ID starting with d92c243b1c15acea6f0efd8f257c08f1bf38b6ed3578786c36d3ae45fafe0b93 not found: ID does not exist" Oct 09 14:09:27 crc kubenswrapper[4902]: I1009 14:09:27.196175 4902 scope.go:117] "RemoveContainer" containerID="aff06d1879c9e009f041ef127836daa5bae2af584f89fe6d1600844eb9b77add" Oct 09 14:09:27 crc kubenswrapper[4902]: E1009 14:09:27.196375 4902 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aff06d1879c9e009f041ef127836daa5bae2af584f89fe6d1600844eb9b77add\": container with ID starting with aff06d1879c9e009f041ef127836daa5bae2af584f89fe6d1600844eb9b77add not found: ID does not exist" 
containerID="aff06d1879c9e009f041ef127836daa5bae2af584f89fe6d1600844eb9b77add" Oct 09 14:09:27 crc kubenswrapper[4902]: I1009 14:09:27.196394 4902 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aff06d1879c9e009f041ef127836daa5bae2af584f89fe6d1600844eb9b77add"} err="failed to get container status \"aff06d1879c9e009f041ef127836daa5bae2af584f89fe6d1600844eb9b77add\": rpc error: code = NotFound desc = could not find container \"aff06d1879c9e009f041ef127836daa5bae2af584f89fe6d1600844eb9b77add\": container with ID starting with aff06d1879c9e009f041ef127836daa5bae2af584f89fe6d1600844eb9b77add not found: ID does not exist" Oct 09 14:09:27 crc kubenswrapper[4902]: I1009 14:09:27.204708 4902 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 09 14:09:27 crc kubenswrapper[4902]: I1009 14:09:27.217678 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 09 14:09:27 crc kubenswrapper[4902]: E1009 14:09:27.218240 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d52f0e92-a10a-4111-98fb-f83da483d1f1" containerName="nova-metadata-metadata" Oct 09 14:09:27 crc kubenswrapper[4902]: I1009 14:09:27.218265 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="d52f0e92-a10a-4111-98fb-f83da483d1f1" containerName="nova-metadata-metadata" Oct 09 14:09:27 crc kubenswrapper[4902]: E1009 14:09:27.218282 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d52f0e92-a10a-4111-98fb-f83da483d1f1" containerName="nova-metadata-log" Oct 09 14:09:27 crc kubenswrapper[4902]: I1009 14:09:27.218288 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="d52f0e92-a10a-4111-98fb-f83da483d1f1" containerName="nova-metadata-log" Oct 09 14:09:27 crc kubenswrapper[4902]: I1009 14:09:27.218478 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="d52f0e92-a10a-4111-98fb-f83da483d1f1" containerName="nova-metadata-metadata" Oct 09 14:09:27 crc kubenswrapper[4902]: I1009 14:09:27.218499 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="d52f0e92-a10a-4111-98fb-f83da483d1f1" containerName="nova-metadata-log" Oct 09 14:09:27 crc kubenswrapper[4902]: I1009 14:09:27.219592 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 09 14:09:27 crc kubenswrapper[4902]: I1009 14:09:27.221770 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 09 14:09:27 crc kubenswrapper[4902]: I1009 14:09:27.221934 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Oct 09 14:09:27 crc kubenswrapper[4902]: I1009 14:09:27.231608 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 09 14:09:27 crc kubenswrapper[4902]: I1009 14:09:27.284606 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a272cc22-f1dc-48b7-89ef-e4578877aa78-config-data\") pod \"nova-metadata-0\" (UID: \"a272cc22-f1dc-48b7-89ef-e4578877aa78\") " pod="openstack/nova-metadata-0" Oct 09 14:09:27 crc kubenswrapper[4902]: I1009 14:09:27.284663 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/a272cc22-f1dc-48b7-89ef-e4578877aa78-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"a272cc22-f1dc-48b7-89ef-e4578877aa78\") " pod="openstack/nova-metadata-0" Oct 09 14:09:27 crc kubenswrapper[4902]: I1009 14:09:27.284713 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nlf7p\" (UniqueName: \"kubernetes.io/projected/a272cc22-f1dc-48b7-89ef-e4578877aa78-kube-api-access-nlf7p\") pod \"nova-metadata-0\" (UID: \"a272cc22-f1dc-48b7-89ef-e4578877aa78\") " pod="openstack/nova-metadata-0" Oct 09 14:09:27 crc kubenswrapper[4902]: I1009 14:09:27.284768 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a272cc22-f1dc-48b7-89ef-e4578877aa78-logs\") pod \"nova-metadata-0\" (UID: \"a272cc22-f1dc-48b7-89ef-e4578877aa78\") " pod="openstack/nova-metadata-0" Oct 09 14:09:27 crc kubenswrapper[4902]: I1009 14:09:27.284794 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a272cc22-f1dc-48b7-89ef-e4578877aa78-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"a272cc22-f1dc-48b7-89ef-e4578877aa78\") " pod="openstack/nova-metadata-0" Oct 09 14:09:27 crc kubenswrapper[4902]: I1009 14:09:27.386799 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a272cc22-f1dc-48b7-89ef-e4578877aa78-config-data\") pod \"nova-metadata-0\" (UID: \"a272cc22-f1dc-48b7-89ef-e4578877aa78\") " pod="openstack/nova-metadata-0" Oct 09 14:09:27 crc kubenswrapper[4902]: I1009 14:09:27.386845 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/a272cc22-f1dc-48b7-89ef-e4578877aa78-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"a272cc22-f1dc-48b7-89ef-e4578877aa78\") " pod="openstack/nova-metadata-0" Oct 09 14:09:27 crc kubenswrapper[4902]: I1009 14:09:27.386883 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nlf7p\" (UniqueName: \"kubernetes.io/projected/a272cc22-f1dc-48b7-89ef-e4578877aa78-kube-api-access-nlf7p\") pod \"nova-metadata-0\" (UID: \"a272cc22-f1dc-48b7-89ef-e4578877aa78\") " 
pod="openstack/nova-metadata-0" Oct 09 14:09:27 crc kubenswrapper[4902]: I1009 14:09:27.386907 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a272cc22-f1dc-48b7-89ef-e4578877aa78-logs\") pod \"nova-metadata-0\" (UID: \"a272cc22-f1dc-48b7-89ef-e4578877aa78\") " pod="openstack/nova-metadata-0" Oct 09 14:09:27 crc kubenswrapper[4902]: I1009 14:09:27.386927 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a272cc22-f1dc-48b7-89ef-e4578877aa78-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"a272cc22-f1dc-48b7-89ef-e4578877aa78\") " pod="openstack/nova-metadata-0" Oct 09 14:09:27 crc kubenswrapper[4902]: I1009 14:09:27.387400 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a272cc22-f1dc-48b7-89ef-e4578877aa78-logs\") pod \"nova-metadata-0\" (UID: \"a272cc22-f1dc-48b7-89ef-e4578877aa78\") " pod="openstack/nova-metadata-0" Oct 09 14:09:27 crc kubenswrapper[4902]: I1009 14:09:27.392070 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a272cc22-f1dc-48b7-89ef-e4578877aa78-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"a272cc22-f1dc-48b7-89ef-e4578877aa78\") " pod="openstack/nova-metadata-0" Oct 09 14:09:27 crc kubenswrapper[4902]: I1009 14:09:27.392998 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/a272cc22-f1dc-48b7-89ef-e4578877aa78-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"a272cc22-f1dc-48b7-89ef-e4578877aa78\") " pod="openstack/nova-metadata-0" Oct 09 14:09:27 crc kubenswrapper[4902]: I1009 14:09:27.396227 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a272cc22-f1dc-48b7-89ef-e4578877aa78-config-data\") pod \"nova-metadata-0\" (UID: \"a272cc22-f1dc-48b7-89ef-e4578877aa78\") " pod="openstack/nova-metadata-0" Oct 09 14:09:27 crc kubenswrapper[4902]: I1009 14:09:27.421957 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nlf7p\" (UniqueName: \"kubernetes.io/projected/a272cc22-f1dc-48b7-89ef-e4578877aa78-kube-api-access-nlf7p\") pod \"nova-metadata-0\" (UID: \"a272cc22-f1dc-48b7-89ef-e4578877aa78\") " pod="openstack/nova-metadata-0" Oct 09 14:09:27 crc kubenswrapper[4902]: I1009 14:09:27.525357 4902 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d52f0e92-a10a-4111-98fb-f83da483d1f1" path="/var/lib/kubelet/pods/d52f0e92-a10a-4111-98fb-f83da483d1f1/volumes" Oct 09 14:09:27 crc kubenswrapper[4902]: I1009 14:09:27.542613 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 09 14:09:27 crc kubenswrapper[4902]: I1009 14:09:27.976210 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 09 14:09:27 crc kubenswrapper[4902]: W1009 14:09:27.981514 4902 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda272cc22_f1dc_48b7_89ef_e4578877aa78.slice/crio-b8c8d8d775059ebf8db05268cba721e1334686d10bcac1412bad303839550e4d WatchSource:0}: Error finding container b8c8d8d775059ebf8db05268cba721e1334686d10bcac1412bad303839550e4d: Status 404 returned error can't find the container with id b8c8d8d775059ebf8db05268cba721e1334686d10bcac1412bad303839550e4d Oct 09 14:09:28 crc kubenswrapper[4902]: I1009 14:09:28.132973 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a272cc22-f1dc-48b7-89ef-e4578877aa78","Type":"ContainerStarted","Data":"b8c8d8d775059ebf8db05268cba721e1334686d10bcac1412bad303839550e4d"} Oct 09 14:09:29 crc kubenswrapper[4902]: I1009 14:09:29.145006 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a272cc22-f1dc-48b7-89ef-e4578877aa78","Type":"ContainerStarted","Data":"4193440684f5a2e35a5671b6e1352ab7295d3192db2d71d009da893e5072b8f5"} Oct 09 14:09:29 crc kubenswrapper[4902]: I1009 14:09:29.145449 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a272cc22-f1dc-48b7-89ef-e4578877aa78","Type":"ContainerStarted","Data":"e3a91c1e47c18a675c6f5564cf1c2ba1d7c1937c9ea7a48b3d96c8b1bfcdd737"} Oct 09 14:09:29 crc kubenswrapper[4902]: I1009 14:09:29.178248 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.178224338 podStartE2EDuration="2.178224338s" podCreationTimestamp="2025-10-09 14:09:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 14:09:29.168544641 +0000 UTC m=+1116.366403705" watchObservedRunningTime="2025-10-09 14:09:29.178224338 +0000 UTC m=+1116.376083422" Oct 09 14:09:30 crc kubenswrapper[4902]: I1009 14:09:30.492533 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 09 14:09:32 crc kubenswrapper[4902]: I1009 14:09:32.543253 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 09 14:09:32 crc kubenswrapper[4902]: I1009 14:09:32.544514 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 09 14:09:34 crc kubenswrapper[4902]: I1009 14:09:34.464664 4902 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 09 14:09:34 crc kubenswrapper[4902]: I1009 14:09:34.465637 4902 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 09 14:09:35 crc kubenswrapper[4902]: I1009 14:09:35.479486 4902 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="7a597676-c413-4919-a79b-ac49dd2671c2" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.205:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 09 14:09:35 crc kubenswrapper[4902]: I1009 14:09:35.479460 4902 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" 
podUID="7a597676-c413-4919-a79b-ac49dd2671c2" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.205:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 09 14:09:35 crc kubenswrapper[4902]: I1009 14:09:35.492974 4902 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Oct 09 14:09:35 crc kubenswrapper[4902]: I1009 14:09:35.524740 4902 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Oct 09 14:09:36 crc kubenswrapper[4902]: I1009 14:09:36.279795 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Oct 09 14:09:37 crc kubenswrapper[4902]: I1009 14:09:37.543350 4902 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 09 14:09:37 crc kubenswrapper[4902]: I1009 14:09:37.543452 4902 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 09 14:09:38 crc kubenswrapper[4902]: I1009 14:09:38.555749 4902 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="a272cc22-f1dc-48b7-89ef-e4578877aa78" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.207:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 09 14:09:38 crc kubenswrapper[4902]: I1009 14:09:38.555769 4902 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="a272cc22-f1dc-48b7-89ef-e4578877aa78" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.207:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 09 14:09:43 crc kubenswrapper[4902]: I1009 14:09:43.397900 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Oct 09 14:09:44 crc kubenswrapper[4902]: I1009 14:09:44.472305 4902 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 09 14:09:44 crc kubenswrapper[4902]: I1009 14:09:44.473094 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 09 14:09:44 crc kubenswrapper[4902]: I1009 14:09:44.475111 4902 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 09 14:09:44 crc kubenswrapper[4902]: I1009 14:09:44.480776 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 09 14:09:45 crc kubenswrapper[4902]: I1009 14:09:45.311773 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 09 14:09:45 crc kubenswrapper[4902]: I1009 14:09:45.321967 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 09 14:09:47 crc kubenswrapper[4902]: I1009 14:09:47.549090 4902 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 09 14:09:47 crc kubenswrapper[4902]: I1009 14:09:47.553352 4902 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 09 14:09:47 crc kubenswrapper[4902]: I1009 14:09:47.557645 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 09 14:09:48 crc kubenswrapper[4902]: I1009 14:09:48.342015 4902 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 09 14:09:50 crc kubenswrapper[4902]: I1009 14:09:50.078692 4902 patch_prober.go:28] interesting pod/machine-config-daemon-gbt7s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 14:09:50 crc kubenswrapper[4902]: I1009 14:09:50.079017 4902 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" podUID="6cfbac91-e798-4e5e-9f3c-f454ea6f457e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 14:09:56 crc kubenswrapper[4902]: I1009 14:09:56.763937 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 09 14:09:58 crc kubenswrapper[4902]: I1009 14:09:58.157362 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 09 14:10:00 crc kubenswrapper[4902]: I1009 14:10:00.824019 4902 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="c9c6af38-1605-4d47-bc0c-967053235667" containerName="rabbitmq" containerID="cri-o://b4d25f2956983a3a7abc8912c25271e17745a4725d4f8eac3fd07ba50e390c39" gracePeriod=604796 Oct 09 14:10:02 crc kubenswrapper[4902]: I1009 14:10:02.201527 4902 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="5cdfacc8-b636-448a-bdc9-30b7a851aa8f" containerName="rabbitmq" containerID="cri-o://d4e54c02be4739d4a0a45ec73f982c9c598bfcc7dbb11b79b2c4d20783e30929" gracePeriod=604796 Oct 09 14:10:04 crc kubenswrapper[4902]: I1009 14:10:04.610884 4902 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="5cdfacc8-b636-448a-bdc9-30b7a851aa8f" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.101:5671: connect: connection refused" Oct 09 14:10:04 crc kubenswrapper[4902]: I1009 14:10:04.942830 4902 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="c9c6af38-1605-4d47-bc0c-967053235667" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.102:5671: connect: connection refused" Oct 09 14:10:07 crc kubenswrapper[4902]: I1009 14:10:07.397540 4902 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 09 14:10:07 crc kubenswrapper[4902]: I1009 14:10:07.511281 4902 generic.go:334] "Generic (PLEG): container finished" podID="c9c6af38-1605-4d47-bc0c-967053235667" containerID="b4d25f2956983a3a7abc8912c25271e17745a4725d4f8eac3fd07ba50e390c39" exitCode=0 Oct 09 14:10:07 crc kubenswrapper[4902]: I1009 14:10:07.511333 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"c9c6af38-1605-4d47-bc0c-967053235667","Type":"ContainerDied","Data":"b4d25f2956983a3a7abc8912c25271e17745a4725d4f8eac3fd07ba50e390c39"} Oct 09 14:10:07 crc kubenswrapper[4902]: I1009 14:10:07.511360 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"c9c6af38-1605-4d47-bc0c-967053235667","Type":"ContainerDied","Data":"af4b54366c36b58e0338fb3bc8fbd62ae03622c593f03b9533fe5b96162932c4"} Oct 09 14:10:07 crc kubenswrapper[4902]: I1009 14:10:07.511380 4902 scope.go:117] "RemoveContainer" containerID="b4d25f2956983a3a7abc8912c25271e17745a4725d4f8eac3fd07ba50e390c39" Oct 09 14:10:07 crc kubenswrapper[4902]: I1009 14:10:07.511559 4902 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 09 14:10:07 crc kubenswrapper[4902]: I1009 14:10:07.528165 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c9c6af38-1605-4d47-bc0c-967053235667-server-conf\") pod \"c9c6af38-1605-4d47-bc0c-967053235667\" (UID: \"c9c6af38-1605-4d47-bc0c-967053235667\") " Oct 09 14:10:07 crc kubenswrapper[4902]: I1009 14:10:07.528226 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c9c6af38-1605-4d47-bc0c-967053235667-rabbitmq-erlang-cookie\") pod \"c9c6af38-1605-4d47-bc0c-967053235667\" (UID: \"c9c6af38-1605-4d47-bc0c-967053235667\") " Oct 09 14:10:07 crc kubenswrapper[4902]: I1009 14:10:07.528350 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5tj74\" (UniqueName: \"kubernetes.io/projected/c9c6af38-1605-4d47-bc0c-967053235667-kube-api-access-5tj74\") pod \"c9c6af38-1605-4d47-bc0c-967053235667\" (UID: \"c9c6af38-1605-4d47-bc0c-967053235667\") " Oct 09 14:10:07 crc kubenswrapper[4902]: I1009 14:10:07.528390 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c9c6af38-1605-4d47-bc0c-967053235667-rabbitmq-plugins\") pod \"c9c6af38-1605-4d47-bc0c-967053235667\" (UID: \"c9c6af38-1605-4d47-bc0c-967053235667\") " Oct 09 14:10:07 crc kubenswrapper[4902]: I1009 14:10:07.528463 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c9c6af38-1605-4d47-bc0c-967053235667-erlang-cookie-secret\") pod \"c9c6af38-1605-4d47-bc0c-967053235667\" (UID: \"c9c6af38-1605-4d47-bc0c-967053235667\") " Oct 09 14:10:07 crc kubenswrapper[4902]: I1009 14:10:07.528498 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c9c6af38-1605-4d47-bc0c-967053235667-pod-info\") pod \"c9c6af38-1605-4d47-bc0c-967053235667\" (UID: \"c9c6af38-1605-4d47-bc0c-967053235667\") " Oct 09 14:10:07 crc kubenswrapper[4902]: I1009 14:10:07.528525 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c9c6af38-1605-4d47-bc0c-967053235667-rabbitmq-confd\") pod \"c9c6af38-1605-4d47-bc0c-967053235667\" (UID: \"c9c6af38-1605-4d47-bc0c-967053235667\") " Oct 09 14:10:07 crc kubenswrapper[4902]: I1009 14:10:07.528548 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"c9c6af38-1605-4d47-bc0c-967053235667\" (UID: \"c9c6af38-1605-4d47-bc0c-967053235667\") " Oct 09 14:10:07 crc kubenswrapper[4902]: I1009 14:10:07.528659 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c9c6af38-1605-4d47-bc0c-967053235667-config-data\") pod \"c9c6af38-1605-4d47-bc0c-967053235667\" (UID: \"c9c6af38-1605-4d47-bc0c-967053235667\") " Oct 09 14:10:07 crc kubenswrapper[4902]: I1009 14:10:07.528774 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c9c6af38-1605-4d47-bc0c-967053235667-plugins-conf\") pod \"c9c6af38-1605-4d47-bc0c-967053235667\" (UID: \"c9c6af38-1605-4d47-bc0c-967053235667\") " Oct 09 14:10:07 crc kubenswrapper[4902]: I1009 14:10:07.528834 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c9c6af38-1605-4d47-bc0c-967053235667-rabbitmq-tls\") pod \"c9c6af38-1605-4d47-bc0c-967053235667\" (UID: \"c9c6af38-1605-4d47-bc0c-967053235667\") " Oct 09 14:10:07 crc kubenswrapper[4902]: I1009 14:10:07.531743 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c9c6af38-1605-4d47-bc0c-967053235667-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "c9c6af38-1605-4d47-bc0c-967053235667" (UID: "c9c6af38-1605-4d47-bc0c-967053235667"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 14:10:07 crc kubenswrapper[4902]: I1009 14:10:07.532064 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c9c6af38-1605-4d47-bc0c-967053235667-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "c9c6af38-1605-4d47-bc0c-967053235667" (UID: "c9c6af38-1605-4d47-bc0c-967053235667"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 14:10:07 crc kubenswrapper[4902]: I1009 14:10:07.535714 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c9c6af38-1605-4d47-bc0c-967053235667-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "c9c6af38-1605-4d47-bc0c-967053235667" (UID: "c9c6af38-1605-4d47-bc0c-967053235667"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 14:10:07 crc kubenswrapper[4902]: I1009 14:10:07.537121 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9c6af38-1605-4d47-bc0c-967053235667-kube-api-access-5tj74" (OuterVolumeSpecName: "kube-api-access-5tj74") pod "c9c6af38-1605-4d47-bc0c-967053235667" (UID: "c9c6af38-1605-4d47-bc0c-967053235667"). InnerVolumeSpecName "kube-api-access-5tj74". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 14:10:07 crc kubenswrapper[4902]: I1009 14:10:07.541265 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "persistence") pod "c9c6af38-1605-4d47-bc0c-967053235667" (UID: "c9c6af38-1605-4d47-bc0c-967053235667"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 09 14:10:07 crc kubenswrapper[4902]: I1009 14:10:07.541291 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/c9c6af38-1605-4d47-bc0c-967053235667-pod-info" (OuterVolumeSpecName: "pod-info") pod "c9c6af38-1605-4d47-bc0c-967053235667" (UID: "c9c6af38-1605-4d47-bc0c-967053235667"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Oct 09 14:10:07 crc kubenswrapper[4902]: I1009 14:10:07.541448 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9c6af38-1605-4d47-bc0c-967053235667-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "c9c6af38-1605-4d47-bc0c-967053235667" (UID: "c9c6af38-1605-4d47-bc0c-967053235667"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:10:07 crc kubenswrapper[4902]: I1009 14:10:07.555073 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9c6af38-1605-4d47-bc0c-967053235667-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "c9c6af38-1605-4d47-bc0c-967053235667" (UID: "c9c6af38-1605-4d47-bc0c-967053235667"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 14:10:07 crc kubenswrapper[4902]: I1009 14:10:07.566101 4902 scope.go:117] "RemoveContainer" containerID="c5bda7339b716d82f1eeabbe60fece1807c197c31bdcab9c371b064bef49bbab" Oct 09 14:10:07 crc kubenswrapper[4902]: I1009 14:10:07.609949 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c9c6af38-1605-4d47-bc0c-967053235667-config-data" (OuterVolumeSpecName: "config-data") pod "c9c6af38-1605-4d47-bc0c-967053235667" (UID: "c9c6af38-1605-4d47-bc0c-967053235667"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 14:10:07 crc kubenswrapper[4902]: I1009 14:10:07.618217 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c9c6af38-1605-4d47-bc0c-967053235667-server-conf" (OuterVolumeSpecName: "server-conf") pod "c9c6af38-1605-4d47-bc0c-967053235667" (UID: "c9c6af38-1605-4d47-bc0c-967053235667"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 14:10:07 crc kubenswrapper[4902]: I1009 14:10:07.631229 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5tj74\" (UniqueName: \"kubernetes.io/projected/c9c6af38-1605-4d47-bc0c-967053235667-kube-api-access-5tj74\") on node \"crc\" DevicePath \"\"" Oct 09 14:10:07 crc kubenswrapper[4902]: I1009 14:10:07.631262 4902 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c9c6af38-1605-4d47-bc0c-967053235667-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Oct 09 14:10:07 crc kubenswrapper[4902]: I1009 14:10:07.631272 4902 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c9c6af38-1605-4d47-bc0c-967053235667-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Oct 09 14:10:07 crc kubenswrapper[4902]: I1009 14:10:07.631280 4902 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c9c6af38-1605-4d47-bc0c-967053235667-pod-info\") on node \"crc\" DevicePath \"\"" Oct 09 14:10:07 crc kubenswrapper[4902]: I1009 14:10:07.631300 4902 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Oct 09 14:10:07 crc kubenswrapper[4902]: I1009 14:10:07.631308 4902 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c9c6af38-1605-4d47-bc0c-967053235667-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 14:10:07 crc kubenswrapper[4902]: I1009 14:10:07.631317 4902 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c9c6af38-1605-4d47-bc0c-967053235667-plugins-conf\") on node \"crc\" DevicePath \"\"" Oct 09 14:10:07 crc kubenswrapper[4902]: I1009 14:10:07.631327 4902 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c9c6af38-1605-4d47-bc0c-967053235667-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Oct 09 14:10:07 crc kubenswrapper[4902]: I1009 14:10:07.631335 4902 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c9c6af38-1605-4d47-bc0c-967053235667-server-conf\") on node \"crc\" DevicePath \"\"" Oct 09 14:10:07 crc kubenswrapper[4902]: I1009 14:10:07.631346 4902 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c9c6af38-1605-4d47-bc0c-967053235667-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Oct 09 14:10:07 crc kubenswrapper[4902]: I1009 14:10:07.679902 4902 scope.go:117] "RemoveContainer" containerID="b4d25f2956983a3a7abc8912c25271e17745a4725d4f8eac3fd07ba50e390c39" Oct 09 14:10:07 crc kubenswrapper[4902]: E1009 14:10:07.681056 4902 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b4d25f2956983a3a7abc8912c25271e17745a4725d4f8eac3fd07ba50e390c39\": container with ID starting with b4d25f2956983a3a7abc8912c25271e17745a4725d4f8eac3fd07ba50e390c39 not found: ID does not exist" containerID="b4d25f2956983a3a7abc8912c25271e17745a4725d4f8eac3fd07ba50e390c39" Oct 09 14:10:07 crc kubenswrapper[4902]: I1009 14:10:07.681112 4902 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"b4d25f2956983a3a7abc8912c25271e17745a4725d4f8eac3fd07ba50e390c39"} err="failed to get container status \"b4d25f2956983a3a7abc8912c25271e17745a4725d4f8eac3fd07ba50e390c39\": rpc error: code = NotFound desc = could not find container \"b4d25f2956983a3a7abc8912c25271e17745a4725d4f8eac3fd07ba50e390c39\": container with ID starting with b4d25f2956983a3a7abc8912c25271e17745a4725d4f8eac3fd07ba50e390c39 not found: ID does not exist" Oct 09 14:10:07 crc kubenswrapper[4902]: I1009 14:10:07.681142 4902 scope.go:117] "RemoveContainer" containerID="c5bda7339b716d82f1eeabbe60fece1807c197c31bdcab9c371b064bef49bbab" Oct 09 14:10:07 crc kubenswrapper[4902]: E1009 14:10:07.681649 4902 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c5bda7339b716d82f1eeabbe60fece1807c197c31bdcab9c371b064bef49bbab\": container with ID starting with c5bda7339b716d82f1eeabbe60fece1807c197c31bdcab9c371b064bef49bbab not found: ID does not exist" containerID="c5bda7339b716d82f1eeabbe60fece1807c197c31bdcab9c371b064bef49bbab" Oct 09 14:10:07 crc kubenswrapper[4902]: I1009 14:10:07.681685 4902 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c5bda7339b716d82f1eeabbe60fece1807c197c31bdcab9c371b064bef49bbab"} err="failed to get container status \"c5bda7339b716d82f1eeabbe60fece1807c197c31bdcab9c371b064bef49bbab\": rpc error: code = NotFound desc = could not find container \"c5bda7339b716d82f1eeabbe60fece1807c197c31bdcab9c371b064bef49bbab\": container with ID starting with c5bda7339b716d82f1eeabbe60fece1807c197c31bdcab9c371b064bef49bbab not found: ID does not exist" Oct 09 14:10:07 crc kubenswrapper[4902]: I1009 14:10:07.687230 4902 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Oct 09 14:10:07 crc kubenswrapper[4902]: I1009 14:10:07.708809 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9c6af38-1605-4d47-bc0c-967053235667-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "c9c6af38-1605-4d47-bc0c-967053235667" (UID: "c9c6af38-1605-4d47-bc0c-967053235667"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 14:10:07 crc kubenswrapper[4902]: I1009 14:10:07.732572 4902 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c9c6af38-1605-4d47-bc0c-967053235667-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Oct 09 14:10:07 crc kubenswrapper[4902]: I1009 14:10:07.732607 4902 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Oct 09 14:10:07 crc kubenswrapper[4902]: I1009 14:10:07.859312 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 09 14:10:07 crc kubenswrapper[4902]: I1009 14:10:07.868335 4902 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 09 14:10:07 crc kubenswrapper[4902]: I1009 14:10:07.884908 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Oct 09 14:10:07 crc kubenswrapper[4902]: E1009 14:10:07.885268 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9c6af38-1605-4d47-bc0c-967053235667" containerName="rabbitmq" Oct 09 14:10:07 crc kubenswrapper[4902]: I1009 14:10:07.885287 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9c6af38-1605-4d47-bc0c-967053235667" containerName="rabbitmq" Oct 09 14:10:07 crc kubenswrapper[4902]: E1009 14:10:07.885323 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9c6af38-1605-4d47-bc0c-967053235667" containerName="setup-container" Oct 09 14:10:07 crc kubenswrapper[4902]: I1009 14:10:07.885330 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9c6af38-1605-4d47-bc0c-967053235667" containerName="setup-container" Oct 09 14:10:07 crc kubenswrapper[4902]: I1009 14:10:07.885542 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9c6af38-1605-4d47-bc0c-967053235667" containerName="rabbitmq" Oct 09 14:10:07 crc kubenswrapper[4902]: I1009 14:10:07.886455 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 09 14:10:07 crc kubenswrapper[4902]: I1009 14:10:07.890629 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Oct 09 14:10:07 crc kubenswrapper[4902]: I1009 14:10:07.890909 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Oct 09 14:10:07 crc kubenswrapper[4902]: I1009 14:10:07.891021 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Oct 09 14:10:07 crc kubenswrapper[4902]: I1009 14:10:07.891060 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Oct 09 14:10:07 crc kubenswrapper[4902]: I1009 14:10:07.891068 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Oct 09 14:10:07 crc kubenswrapper[4902]: I1009 14:10:07.891235 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-6qng5" Oct 09 14:10:07 crc kubenswrapper[4902]: I1009 14:10:07.891352 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Oct 09 14:10:07 crc kubenswrapper[4902]: I1009 14:10:07.903149 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 09 14:10:08 crc kubenswrapper[4902]: I1009 14:10:08.038460 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f0f31142-f615-421c-a863-1603f1cb31a0-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"f0f31142-f615-421c-a863-1603f1cb31a0\") " pod="openstack/rabbitmq-server-0" Oct 09 14:10:08 crc kubenswrapper[4902]: I1009 14:10:08.038539 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f0f31142-f615-421c-a863-1603f1cb31a0-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"f0f31142-f615-421c-a863-1603f1cb31a0\") " pod="openstack/rabbitmq-server-0" Oct 09 14:10:08 crc kubenswrapper[4902]: I1009 14:10:08.038743 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f0f31142-f615-421c-a863-1603f1cb31a0-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"f0f31142-f615-421c-a863-1603f1cb31a0\") " pod="openstack/rabbitmq-server-0" Oct 09 14:10:08 crc kubenswrapper[4902]: I1009 14:10:08.038794 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f0f31142-f615-421c-a863-1603f1cb31a0-server-conf\") pod \"rabbitmq-server-0\" (UID: \"f0f31142-f615-421c-a863-1603f1cb31a0\") " pod="openstack/rabbitmq-server-0" Oct 09 14:10:08 crc kubenswrapper[4902]: I1009 14:10:08.038812 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/f0f31142-f615-421c-a863-1603f1cb31a0-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"f0f31142-f615-421c-a863-1603f1cb31a0\") " pod="openstack/rabbitmq-server-0" Oct 09 14:10:08 crc kubenswrapper[4902]: I1009 14:10:08.038833 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/f0f31142-f615-421c-a863-1603f1cb31a0-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"f0f31142-f615-421c-a863-1603f1cb31a0\") " pod="openstack/rabbitmq-server-0" Oct 09 14:10:08 crc kubenswrapper[4902]: I1009 14:10:08.038990 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f0f31142-f615-421c-a863-1603f1cb31a0-pod-info\") pod \"rabbitmq-server-0\" (UID: \"f0f31142-f615-421c-a863-1603f1cb31a0\") " pod="openstack/rabbitmq-server-0" Oct 09 14:10:08 crc kubenswrapper[4902]: I1009 14:10:08.039068 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wckv2\" (UniqueName: \"kubernetes.io/projected/f0f31142-f615-421c-a863-1603f1cb31a0-kube-api-access-wckv2\") pod \"rabbitmq-server-0\" (UID: \"f0f31142-f615-421c-a863-1603f1cb31a0\") " pod="openstack/rabbitmq-server-0" Oct 09 14:10:08 crc kubenswrapper[4902]: I1009 14:10:08.039107 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f0f31142-f615-421c-a863-1603f1cb31a0-config-data\") pod \"rabbitmq-server-0\" (UID: \"f0f31142-f615-421c-a863-1603f1cb31a0\") " pod="openstack/rabbitmq-server-0" Oct 09 14:10:08 crc kubenswrapper[4902]: I1009 14:10:08.039135 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f0f31142-f615-421c-a863-1603f1cb31a0-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"f0f31142-f615-421c-a863-1603f1cb31a0\") " pod="openstack/rabbitmq-server-0" Oct 09 14:10:08 crc kubenswrapper[4902]: I1009 14:10:08.039188 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"f0f31142-f615-421c-a863-1603f1cb31a0\") " pod="openstack/rabbitmq-server-0" Oct 09 14:10:08 crc kubenswrapper[4902]: I1009 14:10:08.140650 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f0f31142-f615-421c-a863-1603f1cb31a0-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"f0f31142-f615-421c-a863-1603f1cb31a0\") " pod="openstack/rabbitmq-server-0" Oct 09 14:10:08 crc kubenswrapper[4902]: I1009 14:10:08.140702 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f0f31142-f615-421c-a863-1603f1cb31a0-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"f0f31142-f615-421c-a863-1603f1cb31a0\") " pod="openstack/rabbitmq-server-0" Oct 09 14:10:08 crc kubenswrapper[4902]: I1009 14:10:08.140770 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f0f31142-f615-421c-a863-1603f1cb31a0-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"f0f31142-f615-421c-a863-1603f1cb31a0\") " pod="openstack/rabbitmq-server-0" Oct 09 14:10:08 crc kubenswrapper[4902]: I1009 14:10:08.140793 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f0f31142-f615-421c-a863-1603f1cb31a0-server-conf\") pod \"rabbitmq-server-0\" (UID: \"f0f31142-f615-421c-a863-1603f1cb31a0\") " 
pod="openstack/rabbitmq-server-0" Oct 09 14:10:08 crc kubenswrapper[4902]: I1009 14:10:08.140816 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/f0f31142-f615-421c-a863-1603f1cb31a0-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"f0f31142-f615-421c-a863-1603f1cb31a0\") " pod="openstack/rabbitmq-server-0" Oct 09 14:10:08 crc kubenswrapper[4902]: I1009 14:10:08.140840 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f0f31142-f615-421c-a863-1603f1cb31a0-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"f0f31142-f615-421c-a863-1603f1cb31a0\") " pod="openstack/rabbitmq-server-0" Oct 09 14:10:08 crc kubenswrapper[4902]: I1009 14:10:08.140885 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f0f31142-f615-421c-a863-1603f1cb31a0-pod-info\") pod \"rabbitmq-server-0\" (UID: \"f0f31142-f615-421c-a863-1603f1cb31a0\") " pod="openstack/rabbitmq-server-0" Oct 09 14:10:08 crc kubenswrapper[4902]: I1009 14:10:08.140920 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wckv2\" (UniqueName: \"kubernetes.io/projected/f0f31142-f615-421c-a863-1603f1cb31a0-kube-api-access-wckv2\") pod \"rabbitmq-server-0\" (UID: \"f0f31142-f615-421c-a863-1603f1cb31a0\") " pod="openstack/rabbitmq-server-0" Oct 09 14:10:08 crc kubenswrapper[4902]: I1009 14:10:08.140951 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f0f31142-f615-421c-a863-1603f1cb31a0-config-data\") pod \"rabbitmq-server-0\" (UID: \"f0f31142-f615-421c-a863-1603f1cb31a0\") " pod="openstack/rabbitmq-server-0" Oct 09 14:10:08 crc kubenswrapper[4902]: I1009 14:10:08.140979 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f0f31142-f615-421c-a863-1603f1cb31a0-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"f0f31142-f615-421c-a863-1603f1cb31a0\") " pod="openstack/rabbitmq-server-0" Oct 09 14:10:08 crc kubenswrapper[4902]: I1009 14:10:08.141014 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"f0f31142-f615-421c-a863-1603f1cb31a0\") " pod="openstack/rabbitmq-server-0" Oct 09 14:10:08 crc kubenswrapper[4902]: I1009 14:10:08.141237 4902 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"f0f31142-f615-421c-a863-1603f1cb31a0\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/rabbitmq-server-0" Oct 09 14:10:08 crc kubenswrapper[4902]: I1009 14:10:08.141521 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f0f31142-f615-421c-a863-1603f1cb31a0-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"f0f31142-f615-421c-a863-1603f1cb31a0\") " pod="openstack/rabbitmq-server-0" Oct 09 14:10:08 crc kubenswrapper[4902]: I1009 14:10:08.141759 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/f0f31142-f615-421c-a863-1603f1cb31a0-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"f0f31142-f615-421c-a863-1603f1cb31a0\") " pod="openstack/rabbitmq-server-0" Oct 09 14:10:08 crc kubenswrapper[4902]: I1009 14:10:08.141776 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f0f31142-f615-421c-a863-1603f1cb31a0-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"f0f31142-f615-421c-a863-1603f1cb31a0\") " pod="openstack/rabbitmq-server-0" Oct 09 14:10:08 crc kubenswrapper[4902]: I1009 14:10:08.142950 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f0f31142-f615-421c-a863-1603f1cb31a0-server-conf\") pod \"rabbitmq-server-0\" (UID: \"f0f31142-f615-421c-a863-1603f1cb31a0\") " pod="openstack/rabbitmq-server-0" Oct 09 14:10:08 crc kubenswrapper[4902]: I1009 14:10:08.143222 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f0f31142-f615-421c-a863-1603f1cb31a0-config-data\") pod \"rabbitmq-server-0\" (UID: \"f0f31142-f615-421c-a863-1603f1cb31a0\") " pod="openstack/rabbitmq-server-0" Oct 09 14:10:08 crc kubenswrapper[4902]: I1009 14:10:08.145139 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f0f31142-f615-421c-a863-1603f1cb31a0-pod-info\") pod \"rabbitmq-server-0\" (UID: \"f0f31142-f615-421c-a863-1603f1cb31a0\") " pod="openstack/rabbitmq-server-0" Oct 09 14:10:08 crc kubenswrapper[4902]: I1009 14:10:08.146072 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f0f31142-f615-421c-a863-1603f1cb31a0-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"f0f31142-f615-421c-a863-1603f1cb31a0\") " pod="openstack/rabbitmq-server-0" Oct 09 14:10:08 crc kubenswrapper[4902]: I1009 14:10:08.146205 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/f0f31142-f615-421c-a863-1603f1cb31a0-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"f0f31142-f615-421c-a863-1603f1cb31a0\") " pod="openstack/rabbitmq-server-0" Oct 09 14:10:08 crc kubenswrapper[4902]: I1009 14:10:08.146666 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f0f31142-f615-421c-a863-1603f1cb31a0-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"f0f31142-f615-421c-a863-1603f1cb31a0\") " pod="openstack/rabbitmq-server-0" Oct 09 14:10:08 crc kubenswrapper[4902]: I1009 14:10:08.164804 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wckv2\" (UniqueName: \"kubernetes.io/projected/f0f31142-f615-421c-a863-1603f1cb31a0-kube-api-access-wckv2\") pod \"rabbitmq-server-0\" (UID: \"f0f31142-f615-421c-a863-1603f1cb31a0\") " pod="openstack/rabbitmq-server-0" Oct 09 14:10:08 crc kubenswrapper[4902]: I1009 14:10:08.201473 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"f0f31142-f615-421c-a863-1603f1cb31a0\") " pod="openstack/rabbitmq-server-0" Oct 09 14:10:08 crc kubenswrapper[4902]: I1009 14:10:08.207960 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 09 14:10:08 crc kubenswrapper[4902]: I1009 14:10:08.522798 4902 generic.go:334] "Generic (PLEG): container finished" podID="5cdfacc8-b636-448a-bdc9-30b7a851aa8f" containerID="d4e54c02be4739d4a0a45ec73f982c9c598bfcc7dbb11b79b2c4d20783e30929" exitCode=0 Oct 09 14:10:08 crc kubenswrapper[4902]: I1009 14:10:08.523011 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"5cdfacc8-b636-448a-bdc9-30b7a851aa8f","Type":"ContainerDied","Data":"d4e54c02be4739d4a0a45ec73f982c9c598bfcc7dbb11b79b2c4d20783e30929"} Oct 09 14:10:08 crc kubenswrapper[4902]: I1009 14:10:08.673518 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 09 14:10:08 crc kubenswrapper[4902]: I1009 14:10:08.709049 4902 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 09 14:10:08 crc kubenswrapper[4902]: I1009 14:10:08.860036 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/5cdfacc8-b636-448a-bdc9-30b7a851aa8f-erlang-cookie-secret\") pod \"5cdfacc8-b636-448a-bdc9-30b7a851aa8f\" (UID: \"5cdfacc8-b636-448a-bdc9-30b7a851aa8f\") " Oct 09 14:10:08 crc kubenswrapper[4902]: I1009 14:10:08.860075 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/5cdfacc8-b636-448a-bdc9-30b7a851aa8f-pod-info\") pod \"5cdfacc8-b636-448a-bdc9-30b7a851aa8f\" (UID: \"5cdfacc8-b636-448a-bdc9-30b7a851aa8f\") " Oct 09 14:10:08 crc kubenswrapper[4902]: I1009 14:10:08.860097 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lvw9p\" (UniqueName: \"kubernetes.io/projected/5cdfacc8-b636-448a-bdc9-30b7a851aa8f-kube-api-access-lvw9p\") pod \"5cdfacc8-b636-448a-bdc9-30b7a851aa8f\" (UID: \"5cdfacc8-b636-448a-bdc9-30b7a851aa8f\") " Oct 09 14:10:08 crc kubenswrapper[4902]: I1009 14:10:08.860160 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/5cdfacc8-b636-448a-bdc9-30b7a851aa8f-rabbitmq-erlang-cookie\") pod \"5cdfacc8-b636-448a-bdc9-30b7a851aa8f\" (UID: \"5cdfacc8-b636-448a-bdc9-30b7a851aa8f\") " Oct 09 14:10:08 crc kubenswrapper[4902]: I1009 14:10:08.860180 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/5cdfacc8-b636-448a-bdc9-30b7a851aa8f-rabbitmq-confd\") pod \"5cdfacc8-b636-448a-bdc9-30b7a851aa8f\" (UID: \"5cdfacc8-b636-448a-bdc9-30b7a851aa8f\") " Oct 09 14:10:08 crc kubenswrapper[4902]: I1009 14:10:08.860287 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/5cdfacc8-b636-448a-bdc9-30b7a851aa8f-server-conf\") pod \"5cdfacc8-b636-448a-bdc9-30b7a851aa8f\" (UID: \"5cdfacc8-b636-448a-bdc9-30b7a851aa8f\") " Oct 09 14:10:08 crc kubenswrapper[4902]: I1009 14:10:08.860310 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/5cdfacc8-b636-448a-bdc9-30b7a851aa8f-rabbitmq-plugins\") pod \"5cdfacc8-b636-448a-bdc9-30b7a851aa8f\" (UID: \"5cdfacc8-b636-448a-bdc9-30b7a851aa8f\") " Oct 09 14:10:08 crc kubenswrapper[4902]: I1009 14:10:08.860335 4902 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"5cdfacc8-b636-448a-bdc9-30b7a851aa8f\" (UID: \"5cdfacc8-b636-448a-bdc9-30b7a851aa8f\") " Oct 09 14:10:08 crc kubenswrapper[4902]: I1009 14:10:08.860351 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/5cdfacc8-b636-448a-bdc9-30b7a851aa8f-rabbitmq-tls\") pod \"5cdfacc8-b636-448a-bdc9-30b7a851aa8f\" (UID: \"5cdfacc8-b636-448a-bdc9-30b7a851aa8f\") " Oct 09 14:10:08 crc kubenswrapper[4902]: I1009 14:10:08.860377 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/5cdfacc8-b636-448a-bdc9-30b7a851aa8f-plugins-conf\") pod \"5cdfacc8-b636-448a-bdc9-30b7a851aa8f\" (UID: \"5cdfacc8-b636-448a-bdc9-30b7a851aa8f\") " Oct 09 14:10:08 crc kubenswrapper[4902]: I1009 14:10:08.860468 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5cdfacc8-b636-448a-bdc9-30b7a851aa8f-config-data\") pod \"5cdfacc8-b636-448a-bdc9-30b7a851aa8f\" (UID: \"5cdfacc8-b636-448a-bdc9-30b7a851aa8f\") " Oct 09 14:10:08 crc kubenswrapper[4902]: I1009 14:10:08.861265 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5cdfacc8-b636-448a-bdc9-30b7a851aa8f-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "5cdfacc8-b636-448a-bdc9-30b7a851aa8f" (UID: "5cdfacc8-b636-448a-bdc9-30b7a851aa8f"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 14:10:08 crc kubenswrapper[4902]: I1009 14:10:08.861343 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5cdfacc8-b636-448a-bdc9-30b7a851aa8f-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "5cdfacc8-b636-448a-bdc9-30b7a851aa8f" (UID: "5cdfacc8-b636-448a-bdc9-30b7a851aa8f"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 14:10:08 crc kubenswrapper[4902]: I1009 14:10:08.861887 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5cdfacc8-b636-448a-bdc9-30b7a851aa8f-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "5cdfacc8-b636-448a-bdc9-30b7a851aa8f" (UID: "5cdfacc8-b636-448a-bdc9-30b7a851aa8f"). InnerVolumeSpecName "rabbitmq-erlang-cookie". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 14:10:08 crc kubenswrapper[4902]: I1009 14:10:08.862215 4902 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/5cdfacc8-b636-448a-bdc9-30b7a851aa8f-plugins-conf\") on node \"crc\" DevicePath \"\"" Oct 09 14:10:08 crc kubenswrapper[4902]: I1009 14:10:08.862232 4902 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/5cdfacc8-b636-448a-bdc9-30b7a851aa8f-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Oct 09 14:10:08 crc kubenswrapper[4902]: I1009 14:10:08.862246 4902 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/5cdfacc8-b636-448a-bdc9-30b7a851aa8f-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Oct 09 14:10:08 crc kubenswrapper[4902]: I1009 14:10:08.863474 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5cdfacc8-b636-448a-bdc9-30b7a851aa8f-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "5cdfacc8-b636-448a-bdc9-30b7a851aa8f" (UID: "5cdfacc8-b636-448a-bdc9-30b7a851aa8f"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:10:08 crc kubenswrapper[4902]: I1009 14:10:08.864891 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5cdfacc8-b636-448a-bdc9-30b7a851aa8f-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "5cdfacc8-b636-448a-bdc9-30b7a851aa8f" (UID: "5cdfacc8-b636-448a-bdc9-30b7a851aa8f"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 14:10:08 crc kubenswrapper[4902]: I1009 14:10:08.866996 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/5cdfacc8-b636-448a-bdc9-30b7a851aa8f-pod-info" (OuterVolumeSpecName: "pod-info") pod "5cdfacc8-b636-448a-bdc9-30b7a851aa8f" (UID: "5cdfacc8-b636-448a-bdc9-30b7a851aa8f"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Oct 09 14:10:08 crc kubenswrapper[4902]: I1009 14:10:08.867221 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "persistence") pod "5cdfacc8-b636-448a-bdc9-30b7a851aa8f" (UID: "5cdfacc8-b636-448a-bdc9-30b7a851aa8f"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 09 14:10:08 crc kubenswrapper[4902]: I1009 14:10:08.870183 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5cdfacc8-b636-448a-bdc9-30b7a851aa8f-kube-api-access-lvw9p" (OuterVolumeSpecName: "kube-api-access-lvw9p") pod "5cdfacc8-b636-448a-bdc9-30b7a851aa8f" (UID: "5cdfacc8-b636-448a-bdc9-30b7a851aa8f"). InnerVolumeSpecName "kube-api-access-lvw9p". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 14:10:08 crc kubenswrapper[4902]: I1009 14:10:08.903565 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5cdfacc8-b636-448a-bdc9-30b7a851aa8f-config-data" (OuterVolumeSpecName: "config-data") pod "5cdfacc8-b636-448a-bdc9-30b7a851aa8f" (UID: "5cdfacc8-b636-448a-bdc9-30b7a851aa8f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 14:10:08 crc kubenswrapper[4902]: I1009 14:10:08.920891 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5cdfacc8-b636-448a-bdc9-30b7a851aa8f-server-conf" (OuterVolumeSpecName: "server-conf") pod "5cdfacc8-b636-448a-bdc9-30b7a851aa8f" (UID: "5cdfacc8-b636-448a-bdc9-30b7a851aa8f"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 14:10:08 crc kubenswrapper[4902]: I1009 14:10:08.964626 4902 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/5cdfacc8-b636-448a-bdc9-30b7a851aa8f-server-conf\") on node \"crc\" DevicePath \"\"" Oct 09 14:10:08 crc kubenswrapper[4902]: I1009 14:10:08.964681 4902 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Oct 09 14:10:08 crc kubenswrapper[4902]: I1009 14:10:08.964692 4902 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/5cdfacc8-b636-448a-bdc9-30b7a851aa8f-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Oct 09 14:10:08 crc kubenswrapper[4902]: I1009 14:10:08.964700 4902 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5cdfacc8-b636-448a-bdc9-30b7a851aa8f-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 14:10:08 crc kubenswrapper[4902]: I1009 14:10:08.964709 4902 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/5cdfacc8-b636-448a-bdc9-30b7a851aa8f-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Oct 09 14:10:08 crc kubenswrapper[4902]: I1009 14:10:08.964718 4902 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/5cdfacc8-b636-448a-bdc9-30b7a851aa8f-pod-info\") on node \"crc\" DevicePath \"\"" Oct 09 14:10:08 crc kubenswrapper[4902]: I1009 14:10:08.964726 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lvw9p\" (UniqueName: \"kubernetes.io/projected/5cdfacc8-b636-448a-bdc9-30b7a851aa8f-kube-api-access-lvw9p\") on node \"crc\" DevicePath \"\"" Oct 09 14:10:08 crc kubenswrapper[4902]: I1009 14:10:08.985818 4902 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Oct 09 14:10:08 crc kubenswrapper[4902]: I1009 14:10:08.986750 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5cdfacc8-b636-448a-bdc9-30b7a851aa8f-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "5cdfacc8-b636-448a-bdc9-30b7a851aa8f" (UID: "5cdfacc8-b636-448a-bdc9-30b7a851aa8f"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 14:10:09 crc kubenswrapper[4902]: I1009 14:10:09.067359 4902 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Oct 09 14:10:09 crc kubenswrapper[4902]: I1009 14:10:09.067398 4902 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/5cdfacc8-b636-448a-bdc9-30b7a851aa8f-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Oct 09 14:10:09 crc kubenswrapper[4902]: I1009 14:10:09.525900 4902 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9c6af38-1605-4d47-bc0c-967053235667" path="/var/lib/kubelet/pods/c9c6af38-1605-4d47-bc0c-967053235667/volumes" Oct 09 14:10:09 crc kubenswrapper[4902]: I1009 14:10:09.539299 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"5cdfacc8-b636-448a-bdc9-30b7a851aa8f","Type":"ContainerDied","Data":"88b16d0cb0aa583376248d4f0c7c1364f71053d6d2971266163d1819d0305376"} Oct 09 14:10:09 crc kubenswrapper[4902]: I1009 14:10:09.539329 4902 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 09 14:10:09 crc kubenswrapper[4902]: I1009 14:10:09.539346 4902 scope.go:117] "RemoveContainer" containerID="d4e54c02be4739d4a0a45ec73f982c9c598bfcc7dbb11b79b2c4d20783e30929" Oct 09 14:10:09 crc kubenswrapper[4902]: I1009 14:10:09.541493 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"f0f31142-f615-421c-a863-1603f1cb31a0","Type":"ContainerStarted","Data":"1cf8c656d478b107e0e6701ff83600a4b0f6c54536424eece460360925fff605"} Oct 09 14:10:09 crc kubenswrapper[4902]: I1009 14:10:09.565779 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 09 14:10:09 crc kubenswrapper[4902]: I1009 14:10:09.569964 4902 scope.go:117] "RemoveContainer" containerID="804388511a83b9bbcfe911cb67ebf26bcf159b46ac05551143b014d995792cd7" Oct 09 14:10:09 crc kubenswrapper[4902]: I1009 14:10:09.574980 4902 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 09 14:10:09 crc kubenswrapper[4902]: I1009 14:10:09.626451 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 09 14:10:09 crc kubenswrapper[4902]: E1009 14:10:09.627161 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5cdfacc8-b636-448a-bdc9-30b7a851aa8f" containerName="rabbitmq" Oct 09 14:10:09 crc kubenswrapper[4902]: I1009 14:10:09.627179 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="5cdfacc8-b636-448a-bdc9-30b7a851aa8f" containerName="rabbitmq" Oct 09 14:10:09 crc kubenswrapper[4902]: E1009 14:10:09.627212 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5cdfacc8-b636-448a-bdc9-30b7a851aa8f" containerName="setup-container" Oct 09 14:10:09 crc kubenswrapper[4902]: I1009 14:10:09.627239 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="5cdfacc8-b636-448a-bdc9-30b7a851aa8f" containerName="setup-container" Oct 09 14:10:09 crc kubenswrapper[4902]: I1009 14:10:09.627506 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="5cdfacc8-b636-448a-bdc9-30b7a851aa8f" containerName="rabbitmq" Oct 09 14:10:09 crc kubenswrapper[4902]: I1009 14:10:09.629628 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 09 14:10:09 crc kubenswrapper[4902]: I1009 14:10:09.634979 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Oct 09 14:10:09 crc kubenswrapper[4902]: I1009 14:10:09.635037 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Oct 09 14:10:09 crc kubenswrapper[4902]: I1009 14:10:09.635096 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Oct 09 14:10:09 crc kubenswrapper[4902]: I1009 14:10:09.635103 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Oct 09 14:10:09 crc kubenswrapper[4902]: I1009 14:10:09.635220 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Oct 09 14:10:09 crc kubenswrapper[4902]: I1009 14:10:09.635363 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Oct 09 14:10:09 crc kubenswrapper[4902]: I1009 14:10:09.635480 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 09 14:10:09 crc kubenswrapper[4902]: I1009 14:10:09.635508 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-mbwcp" Oct 09 14:10:09 crc kubenswrapper[4902]: I1009 14:10:09.779113 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"ef20e2e8-fcf0-438a-80a3-fd50db544b6e\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 14:10:09 crc kubenswrapper[4902]: I1009 14:10:09.779151 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2glgk\" (UniqueName: \"kubernetes.io/projected/ef20e2e8-fcf0-438a-80a3-fd50db544b6e-kube-api-access-2glgk\") pod \"rabbitmq-cell1-server-0\" (UID: \"ef20e2e8-fcf0-438a-80a3-fd50db544b6e\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 14:10:09 crc kubenswrapper[4902]: I1009 14:10:09.779170 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ef20e2e8-fcf0-438a-80a3-fd50db544b6e-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"ef20e2e8-fcf0-438a-80a3-fd50db544b6e\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 14:10:09 crc kubenswrapper[4902]: I1009 14:10:09.779242 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ef20e2e8-fcf0-438a-80a3-fd50db544b6e-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"ef20e2e8-fcf0-438a-80a3-fd50db544b6e\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 14:10:09 crc kubenswrapper[4902]: I1009 14:10:09.779292 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ef20e2e8-fcf0-438a-80a3-fd50db544b6e-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"ef20e2e8-fcf0-438a-80a3-fd50db544b6e\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 14:10:09 crc kubenswrapper[4902]: I1009 14:10:09.779307 4902 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ef20e2e8-fcf0-438a-80a3-fd50db544b6e-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"ef20e2e8-fcf0-438a-80a3-fd50db544b6e\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 14:10:09 crc kubenswrapper[4902]: I1009 14:10:09.779329 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ef20e2e8-fcf0-438a-80a3-fd50db544b6e-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"ef20e2e8-fcf0-438a-80a3-fd50db544b6e\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 14:10:09 crc kubenswrapper[4902]: I1009 14:10:09.779385 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ef20e2e8-fcf0-438a-80a3-fd50db544b6e-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"ef20e2e8-fcf0-438a-80a3-fd50db544b6e\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 14:10:09 crc kubenswrapper[4902]: I1009 14:10:09.779456 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ef20e2e8-fcf0-438a-80a3-fd50db544b6e-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"ef20e2e8-fcf0-438a-80a3-fd50db544b6e\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 14:10:09 crc kubenswrapper[4902]: I1009 14:10:09.779484 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ef20e2e8-fcf0-438a-80a3-fd50db544b6e-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"ef20e2e8-fcf0-438a-80a3-fd50db544b6e\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 14:10:09 crc kubenswrapper[4902]: I1009 14:10:09.779513 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ef20e2e8-fcf0-438a-80a3-fd50db544b6e-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"ef20e2e8-fcf0-438a-80a3-fd50db544b6e\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 14:10:09 crc kubenswrapper[4902]: I1009 14:10:09.880712 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ef20e2e8-fcf0-438a-80a3-fd50db544b6e-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"ef20e2e8-fcf0-438a-80a3-fd50db544b6e\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 14:10:09 crc kubenswrapper[4902]: I1009 14:10:09.880753 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ef20e2e8-fcf0-438a-80a3-fd50db544b6e-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"ef20e2e8-fcf0-438a-80a3-fd50db544b6e\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 14:10:09 crc kubenswrapper[4902]: I1009 14:10:09.880781 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ef20e2e8-fcf0-438a-80a3-fd50db544b6e-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"ef20e2e8-fcf0-438a-80a3-fd50db544b6e\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 14:10:09 crc kubenswrapper[4902]: I1009 14:10:09.880826 4902 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"ef20e2e8-fcf0-438a-80a3-fd50db544b6e\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 14:10:09 crc kubenswrapper[4902]: I1009 14:10:09.880850 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2glgk\" (UniqueName: \"kubernetes.io/projected/ef20e2e8-fcf0-438a-80a3-fd50db544b6e-kube-api-access-2glgk\") pod \"rabbitmq-cell1-server-0\" (UID: \"ef20e2e8-fcf0-438a-80a3-fd50db544b6e\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 14:10:09 crc kubenswrapper[4902]: I1009 14:10:09.880875 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ef20e2e8-fcf0-438a-80a3-fd50db544b6e-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"ef20e2e8-fcf0-438a-80a3-fd50db544b6e\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 14:10:09 crc kubenswrapper[4902]: I1009 14:10:09.880985 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ef20e2e8-fcf0-438a-80a3-fd50db544b6e-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"ef20e2e8-fcf0-438a-80a3-fd50db544b6e\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 14:10:09 crc kubenswrapper[4902]: I1009 14:10:09.881025 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ef20e2e8-fcf0-438a-80a3-fd50db544b6e-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"ef20e2e8-fcf0-438a-80a3-fd50db544b6e\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 14:10:09 crc kubenswrapper[4902]: I1009 14:10:09.881047 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ef20e2e8-fcf0-438a-80a3-fd50db544b6e-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"ef20e2e8-fcf0-438a-80a3-fd50db544b6e\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 14:10:09 crc kubenswrapper[4902]: I1009 14:10:09.881078 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ef20e2e8-fcf0-438a-80a3-fd50db544b6e-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"ef20e2e8-fcf0-438a-80a3-fd50db544b6e\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 14:10:09 crc kubenswrapper[4902]: I1009 14:10:09.881129 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ef20e2e8-fcf0-438a-80a3-fd50db544b6e-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"ef20e2e8-fcf0-438a-80a3-fd50db544b6e\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 14:10:09 crc kubenswrapper[4902]: I1009 14:10:09.881383 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ef20e2e8-fcf0-438a-80a3-fd50db544b6e-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"ef20e2e8-fcf0-438a-80a3-fd50db544b6e\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 14:10:09 crc kubenswrapper[4902]: I1009 14:10:09.881756 4902 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"ef20e2e8-fcf0-438a-80a3-fd50db544b6e\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/rabbitmq-cell1-server-0" Oct 09 14:10:09 crc kubenswrapper[4902]: I1009 14:10:09.882164 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ef20e2e8-fcf0-438a-80a3-fd50db544b6e-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"ef20e2e8-fcf0-438a-80a3-fd50db544b6e\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 14:10:09 crc kubenswrapper[4902]: I1009 14:10:09.882188 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ef20e2e8-fcf0-438a-80a3-fd50db544b6e-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"ef20e2e8-fcf0-438a-80a3-fd50db544b6e\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 14:10:09 crc kubenswrapper[4902]: I1009 14:10:09.882254 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ef20e2e8-fcf0-438a-80a3-fd50db544b6e-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"ef20e2e8-fcf0-438a-80a3-fd50db544b6e\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 14:10:09 crc kubenswrapper[4902]: I1009 14:10:09.882617 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ef20e2e8-fcf0-438a-80a3-fd50db544b6e-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"ef20e2e8-fcf0-438a-80a3-fd50db544b6e\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 14:10:09 crc kubenswrapper[4902]: I1009 14:10:09.899977 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2glgk\" (UniqueName: \"kubernetes.io/projected/ef20e2e8-fcf0-438a-80a3-fd50db544b6e-kube-api-access-2glgk\") pod \"rabbitmq-cell1-server-0\" (UID: \"ef20e2e8-fcf0-438a-80a3-fd50db544b6e\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 14:10:09 crc kubenswrapper[4902]: I1009 14:10:09.911624 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"ef20e2e8-fcf0-438a-80a3-fd50db544b6e\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 14:10:09 crc kubenswrapper[4902]: I1009 14:10:09.915320 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ef20e2e8-fcf0-438a-80a3-fd50db544b6e-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"ef20e2e8-fcf0-438a-80a3-fd50db544b6e\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 14:10:09 crc kubenswrapper[4902]: I1009 14:10:09.915489 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ef20e2e8-fcf0-438a-80a3-fd50db544b6e-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"ef20e2e8-fcf0-438a-80a3-fd50db544b6e\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 14:10:09 crc kubenswrapper[4902]: I1009 14:10:09.915995 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ef20e2e8-fcf0-438a-80a3-fd50db544b6e-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"ef20e2e8-fcf0-438a-80a3-fd50db544b6e\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 14:10:09 crc kubenswrapper[4902]: I1009 14:10:09.921195 4902 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ef20e2e8-fcf0-438a-80a3-fd50db544b6e-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"ef20e2e8-fcf0-438a-80a3-fd50db544b6e\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 14:10:09 crc kubenswrapper[4902]: I1009 14:10:09.978369 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 09 14:10:10 crc kubenswrapper[4902]: I1009 14:10:10.554529 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"f0f31142-f615-421c-a863-1603f1cb31a0","Type":"ContainerStarted","Data":"924c3047bfef6697c22357d2bf2c5a2fcd21530765d6086875037db86dd60f68"} Oct 09 14:10:10 crc kubenswrapper[4902]: I1009 14:10:10.583921 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-67b789f86c-7pvrm"] Oct 09 14:10:10 crc kubenswrapper[4902]: I1009 14:10:10.585822 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67b789f86c-7pvrm" Oct 09 14:10:10 crc kubenswrapper[4902]: I1009 14:10:10.588664 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Oct 09 14:10:10 crc kubenswrapper[4902]: I1009 14:10:10.624676 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-67b789f86c-7pvrm"] Oct 09 14:10:10 crc kubenswrapper[4902]: W1009 14:10:10.680210 4902 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef20e2e8_fcf0_438a_80a3_fd50db544b6e.slice/crio-424089a1b1c259186d280916e8738cf010bda8ef65194a23e6576c1c62457c13 WatchSource:0}: Error finding container 424089a1b1c259186d280916e8738cf010bda8ef65194a23e6576c1c62457c13: Status 404 returned error can't find the container with id 424089a1b1c259186d280916e8738cf010bda8ef65194a23e6576c1c62457c13 Oct 09 14:10:10 crc kubenswrapper[4902]: I1009 14:10:10.680883 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 09 14:10:10 crc kubenswrapper[4902]: I1009 14:10:10.696575 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f2eeda14-398c-4a24-b25e-d7fd737e3e35-dns-svc\") pod \"dnsmasq-dns-67b789f86c-7pvrm\" (UID: \"f2eeda14-398c-4a24-b25e-d7fd737e3e35\") " pod="openstack/dnsmasq-dns-67b789f86c-7pvrm" Oct 09 14:10:10 crc kubenswrapper[4902]: I1009 14:10:10.696701 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/f2eeda14-398c-4a24-b25e-d7fd737e3e35-openstack-edpm-ipam\") pod \"dnsmasq-dns-67b789f86c-7pvrm\" (UID: \"f2eeda14-398c-4a24-b25e-d7fd737e3e35\") " pod="openstack/dnsmasq-dns-67b789f86c-7pvrm" Oct 09 14:10:10 crc kubenswrapper[4902]: I1009 14:10:10.696736 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f2eeda14-398c-4a24-b25e-d7fd737e3e35-dns-swift-storage-0\") pod \"dnsmasq-dns-67b789f86c-7pvrm\" (UID: \"f2eeda14-398c-4a24-b25e-d7fd737e3e35\") " pod="openstack/dnsmasq-dns-67b789f86c-7pvrm" Oct 09 14:10:10 crc kubenswrapper[4902]: I1009 14:10:10.696764 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/f2eeda14-398c-4a24-b25e-d7fd737e3e35-ovsdbserver-nb\") pod \"dnsmasq-dns-67b789f86c-7pvrm\" (UID: \"f2eeda14-398c-4a24-b25e-d7fd737e3e35\") " pod="openstack/dnsmasq-dns-67b789f86c-7pvrm" Oct 09 14:10:10 crc kubenswrapper[4902]: I1009 14:10:10.696782 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f2eeda14-398c-4a24-b25e-d7fd737e3e35-config\") pod \"dnsmasq-dns-67b789f86c-7pvrm\" (UID: \"f2eeda14-398c-4a24-b25e-d7fd737e3e35\") " pod="openstack/dnsmasq-dns-67b789f86c-7pvrm" Oct 09 14:10:10 crc kubenswrapper[4902]: I1009 14:10:10.696795 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f2eeda14-398c-4a24-b25e-d7fd737e3e35-ovsdbserver-sb\") pod \"dnsmasq-dns-67b789f86c-7pvrm\" (UID: \"f2eeda14-398c-4a24-b25e-d7fd737e3e35\") " pod="openstack/dnsmasq-dns-67b789f86c-7pvrm" Oct 09 14:10:10 crc kubenswrapper[4902]: I1009 14:10:10.696810 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fkmpc\" (UniqueName: \"kubernetes.io/projected/f2eeda14-398c-4a24-b25e-d7fd737e3e35-kube-api-access-fkmpc\") pod \"dnsmasq-dns-67b789f86c-7pvrm\" (UID: \"f2eeda14-398c-4a24-b25e-d7fd737e3e35\") " pod="openstack/dnsmasq-dns-67b789f86c-7pvrm" Oct 09 14:10:10 crc kubenswrapper[4902]: I1009 14:10:10.798255 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f2eeda14-398c-4a24-b25e-d7fd737e3e35-dns-svc\") pod \"dnsmasq-dns-67b789f86c-7pvrm\" (UID: \"f2eeda14-398c-4a24-b25e-d7fd737e3e35\") " pod="openstack/dnsmasq-dns-67b789f86c-7pvrm" Oct 09 14:10:10 crc kubenswrapper[4902]: I1009 14:10:10.798370 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/f2eeda14-398c-4a24-b25e-d7fd737e3e35-openstack-edpm-ipam\") pod \"dnsmasq-dns-67b789f86c-7pvrm\" (UID: \"f2eeda14-398c-4a24-b25e-d7fd737e3e35\") " pod="openstack/dnsmasq-dns-67b789f86c-7pvrm" Oct 09 14:10:10 crc kubenswrapper[4902]: I1009 14:10:10.798396 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f2eeda14-398c-4a24-b25e-d7fd737e3e35-dns-swift-storage-0\") pod \"dnsmasq-dns-67b789f86c-7pvrm\" (UID: \"f2eeda14-398c-4a24-b25e-d7fd737e3e35\") " pod="openstack/dnsmasq-dns-67b789f86c-7pvrm" Oct 09 14:10:10 crc kubenswrapper[4902]: I1009 14:10:10.798450 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f2eeda14-398c-4a24-b25e-d7fd737e3e35-ovsdbserver-nb\") pod \"dnsmasq-dns-67b789f86c-7pvrm\" (UID: \"f2eeda14-398c-4a24-b25e-d7fd737e3e35\") " pod="openstack/dnsmasq-dns-67b789f86c-7pvrm" Oct 09 14:10:10 crc kubenswrapper[4902]: I1009 14:10:10.798476 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f2eeda14-398c-4a24-b25e-d7fd737e3e35-config\") pod \"dnsmasq-dns-67b789f86c-7pvrm\" (UID: \"f2eeda14-398c-4a24-b25e-d7fd737e3e35\") " pod="openstack/dnsmasq-dns-67b789f86c-7pvrm" Oct 09 14:10:10 crc kubenswrapper[4902]: I1009 14:10:10.798497 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/f2eeda14-398c-4a24-b25e-d7fd737e3e35-ovsdbserver-sb\") pod \"dnsmasq-dns-67b789f86c-7pvrm\" (UID: \"f2eeda14-398c-4a24-b25e-d7fd737e3e35\") " pod="openstack/dnsmasq-dns-67b789f86c-7pvrm" Oct 09 14:10:10 crc kubenswrapper[4902]: I1009 14:10:10.798516 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fkmpc\" (UniqueName: \"kubernetes.io/projected/f2eeda14-398c-4a24-b25e-d7fd737e3e35-kube-api-access-fkmpc\") pod \"dnsmasq-dns-67b789f86c-7pvrm\" (UID: \"f2eeda14-398c-4a24-b25e-d7fd737e3e35\") " pod="openstack/dnsmasq-dns-67b789f86c-7pvrm" Oct 09 14:10:10 crc kubenswrapper[4902]: I1009 14:10:10.799327 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f2eeda14-398c-4a24-b25e-d7fd737e3e35-config\") pod \"dnsmasq-dns-67b789f86c-7pvrm\" (UID: \"f2eeda14-398c-4a24-b25e-d7fd737e3e35\") " pod="openstack/dnsmasq-dns-67b789f86c-7pvrm" Oct 09 14:10:10 crc kubenswrapper[4902]: I1009 14:10:10.799346 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f2eeda14-398c-4a24-b25e-d7fd737e3e35-dns-swift-storage-0\") pod \"dnsmasq-dns-67b789f86c-7pvrm\" (UID: \"f2eeda14-398c-4a24-b25e-d7fd737e3e35\") " pod="openstack/dnsmasq-dns-67b789f86c-7pvrm" Oct 09 14:10:10 crc kubenswrapper[4902]: I1009 14:10:10.799356 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f2eeda14-398c-4a24-b25e-d7fd737e3e35-ovsdbserver-nb\") pod \"dnsmasq-dns-67b789f86c-7pvrm\" (UID: \"f2eeda14-398c-4a24-b25e-d7fd737e3e35\") " pod="openstack/dnsmasq-dns-67b789f86c-7pvrm" Oct 09 14:10:10 crc kubenswrapper[4902]: I1009 14:10:10.799350 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f2eeda14-398c-4a24-b25e-d7fd737e3e35-dns-svc\") pod \"dnsmasq-dns-67b789f86c-7pvrm\" (UID: \"f2eeda14-398c-4a24-b25e-d7fd737e3e35\") " pod="openstack/dnsmasq-dns-67b789f86c-7pvrm" Oct 09 14:10:10 crc kubenswrapper[4902]: I1009 14:10:10.799430 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f2eeda14-398c-4a24-b25e-d7fd737e3e35-ovsdbserver-sb\") pod \"dnsmasq-dns-67b789f86c-7pvrm\" (UID: \"f2eeda14-398c-4a24-b25e-d7fd737e3e35\") " pod="openstack/dnsmasq-dns-67b789f86c-7pvrm" Oct 09 14:10:10 crc kubenswrapper[4902]: I1009 14:10:10.800037 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/f2eeda14-398c-4a24-b25e-d7fd737e3e35-openstack-edpm-ipam\") pod \"dnsmasq-dns-67b789f86c-7pvrm\" (UID: \"f2eeda14-398c-4a24-b25e-d7fd737e3e35\") " pod="openstack/dnsmasq-dns-67b789f86c-7pvrm" Oct 09 14:10:10 crc kubenswrapper[4902]: I1009 14:10:10.818615 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fkmpc\" (UniqueName: \"kubernetes.io/projected/f2eeda14-398c-4a24-b25e-d7fd737e3e35-kube-api-access-fkmpc\") pod \"dnsmasq-dns-67b789f86c-7pvrm\" (UID: \"f2eeda14-398c-4a24-b25e-d7fd737e3e35\") " pod="openstack/dnsmasq-dns-67b789f86c-7pvrm" Oct 09 14:10:10 crc kubenswrapper[4902]: I1009 14:10:10.929051 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-67b789f86c-7pvrm" Oct 09 14:10:11 crc kubenswrapper[4902]: W1009 14:10:11.354860 4902 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf2eeda14_398c_4a24_b25e_d7fd737e3e35.slice/crio-0865aabb9e0fa6a6d3b8e9762404e05ad2b32173e55a41f6f3488342c6b3e787 WatchSource:0}: Error finding container 0865aabb9e0fa6a6d3b8e9762404e05ad2b32173e55a41f6f3488342c6b3e787: Status 404 returned error can't find the container with id 0865aabb9e0fa6a6d3b8e9762404e05ad2b32173e55a41f6f3488342c6b3e787 Oct 09 14:10:11 crc kubenswrapper[4902]: I1009 14:10:11.358391 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-67b789f86c-7pvrm"] Oct 09 14:10:11 crc kubenswrapper[4902]: I1009 14:10:11.527745 4902 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5cdfacc8-b636-448a-bdc9-30b7a851aa8f" path="/var/lib/kubelet/pods/5cdfacc8-b636-448a-bdc9-30b7a851aa8f/volumes" Oct 09 14:10:11 crc kubenswrapper[4902]: I1009 14:10:11.571418 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67b789f86c-7pvrm" event={"ID":"f2eeda14-398c-4a24-b25e-d7fd737e3e35","Type":"ContainerStarted","Data":"0865aabb9e0fa6a6d3b8e9762404e05ad2b32173e55a41f6f3488342c6b3e787"} Oct 09 14:10:11 crc kubenswrapper[4902]: I1009 14:10:11.572997 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"ef20e2e8-fcf0-438a-80a3-fd50db544b6e","Type":"ContainerStarted","Data":"424089a1b1c259186d280916e8738cf010bda8ef65194a23e6576c1c62457c13"} Oct 09 14:10:12 crc kubenswrapper[4902]: I1009 14:10:12.585827 4902 generic.go:334] "Generic (PLEG): container finished" podID="f2eeda14-398c-4a24-b25e-d7fd737e3e35" containerID="078ec507ce30283a0485a177e2cb5be3d3a2c1d53fd7eb04c7ac29c18047f218" exitCode=0 Oct 09 14:10:12 crc kubenswrapper[4902]: I1009 14:10:12.586026 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67b789f86c-7pvrm" event={"ID":"f2eeda14-398c-4a24-b25e-d7fd737e3e35","Type":"ContainerDied","Data":"078ec507ce30283a0485a177e2cb5be3d3a2c1d53fd7eb04c7ac29c18047f218"} Oct 09 14:10:12 crc kubenswrapper[4902]: I1009 14:10:12.588491 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"ef20e2e8-fcf0-438a-80a3-fd50db544b6e","Type":"ContainerStarted","Data":"c8c9d998d8d814a3300d12375610aba02055cfbd7f3c5ecb91490a90a0f2319c"} Oct 09 14:10:13 crc kubenswrapper[4902]: I1009 14:10:13.601165 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67b789f86c-7pvrm" event={"ID":"f2eeda14-398c-4a24-b25e-d7fd737e3e35","Type":"ContainerStarted","Data":"6aa4aabccc024501f5f0faac141f9778b7d0c301a35d08585bc38760c253883e"} Oct 09 14:10:13 crc kubenswrapper[4902]: I1009 14:10:13.627247 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-67b789f86c-7pvrm" podStartSLOduration=3.627219893 podStartE2EDuration="3.627219893s" podCreationTimestamp="2025-10-09 14:10:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 14:10:13.623964587 +0000 UTC m=+1160.821823721" watchObservedRunningTime="2025-10-09 14:10:13.627219893 +0000 UTC m=+1160.825078987" Oct 09 14:10:14 crc kubenswrapper[4902]: I1009 14:10:14.610586 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/dnsmasq-dns-67b789f86c-7pvrm" Oct 09 14:10:20 crc kubenswrapper[4902]: I1009 14:10:20.078152 4902 patch_prober.go:28] interesting pod/machine-config-daemon-gbt7s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 14:10:20 crc kubenswrapper[4902]: I1009 14:10:20.078660 4902 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" podUID="6cfbac91-e798-4e5e-9f3c-f454ea6f457e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 14:10:20 crc kubenswrapper[4902]: I1009 14:10:20.078701 4902 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" Oct 09 14:10:20 crc kubenswrapper[4902]: I1009 14:10:20.079236 4902 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"269c5fb340ffd90e9e77aaecfbd73abe66b2736b0d8ca2d63ca9d236f4e7c4a7"} pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 09 14:10:20 crc kubenswrapper[4902]: I1009 14:10:20.079287 4902 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" podUID="6cfbac91-e798-4e5e-9f3c-f454ea6f457e" containerName="machine-config-daemon" containerID="cri-o://269c5fb340ffd90e9e77aaecfbd73abe66b2736b0d8ca2d63ca9d236f4e7c4a7" gracePeriod=600 Oct 09 14:10:20 crc kubenswrapper[4902]: I1009 14:10:20.671996 4902 generic.go:334] "Generic (PLEG): container finished" podID="6cfbac91-e798-4e5e-9f3c-f454ea6f457e" containerID="269c5fb340ffd90e9e77aaecfbd73abe66b2736b0d8ca2d63ca9d236f4e7c4a7" exitCode=0 Oct 09 14:10:20 crc kubenswrapper[4902]: I1009 14:10:20.672060 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" event={"ID":"6cfbac91-e798-4e5e-9f3c-f454ea6f457e","Type":"ContainerDied","Data":"269c5fb340ffd90e9e77aaecfbd73abe66b2736b0d8ca2d63ca9d236f4e7c4a7"} Oct 09 14:10:20 crc kubenswrapper[4902]: I1009 14:10:20.672568 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" event={"ID":"6cfbac91-e798-4e5e-9f3c-f454ea6f457e","Type":"ContainerStarted","Data":"810d12ae911af2f4b29c9130d86bd6cc100568c44cc3f31fb981a87e8efd1049"} Oct 09 14:10:20 crc kubenswrapper[4902]: I1009 14:10:20.672599 4902 scope.go:117] "RemoveContainer" containerID="5284f8d311bb4c3e2f0e528d6bcb33bd4828ef1536e55afe39f7116e6e98c726" Oct 09 14:10:20 crc kubenswrapper[4902]: I1009 14:10:20.931589 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-67b789f86c-7pvrm" Oct 09 14:10:20 crc kubenswrapper[4902]: I1009 14:10:20.988872 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-59cf4bdb65-985vj"] Oct 09 14:10:20 crc kubenswrapper[4902]: I1009 14:10:20.989174 4902 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-59cf4bdb65-985vj" podUID="7e21b8a0-0cbd-4672-9aab-e15ed5b44309" containerName="dnsmasq-dns" 
containerID="cri-o://3881f757369f3cc3e6b3099a1a7ca2d9455656b0bd5c249e88a2980eed4e7f46" gracePeriod=10 Oct 09 14:10:21 crc kubenswrapper[4902]: I1009 14:10:21.095450 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-cb6ffcf87-8b4gd"] Oct 09 14:10:21 crc kubenswrapper[4902]: I1009 14:10:21.097401 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-cb6ffcf87-8b4gd" Oct 09 14:10:21 crc kubenswrapper[4902]: I1009 14:10:21.144284 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-cb6ffcf87-8b4gd"] Oct 09 14:10:21 crc kubenswrapper[4902]: I1009 14:10:21.201253 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jwg7x\" (UniqueName: \"kubernetes.io/projected/bdf337ce-e7d5-4de9-acb8-a98a481a8ab3-kube-api-access-jwg7x\") pod \"dnsmasq-dns-cb6ffcf87-8b4gd\" (UID: \"bdf337ce-e7d5-4de9-acb8-a98a481a8ab3\") " pod="openstack/dnsmasq-dns-cb6ffcf87-8b4gd" Oct 09 14:10:21 crc kubenswrapper[4902]: I1009 14:10:21.201313 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bdf337ce-e7d5-4de9-acb8-a98a481a8ab3-ovsdbserver-nb\") pod \"dnsmasq-dns-cb6ffcf87-8b4gd\" (UID: \"bdf337ce-e7d5-4de9-acb8-a98a481a8ab3\") " pod="openstack/dnsmasq-dns-cb6ffcf87-8b4gd" Oct 09 14:10:21 crc kubenswrapper[4902]: I1009 14:10:21.201339 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/bdf337ce-e7d5-4de9-acb8-a98a481a8ab3-openstack-edpm-ipam\") pod \"dnsmasq-dns-cb6ffcf87-8b4gd\" (UID: \"bdf337ce-e7d5-4de9-acb8-a98a481a8ab3\") " pod="openstack/dnsmasq-dns-cb6ffcf87-8b4gd" Oct 09 14:10:21 crc kubenswrapper[4902]: I1009 14:10:21.201591 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bdf337ce-e7d5-4de9-acb8-a98a481a8ab3-config\") pod \"dnsmasq-dns-cb6ffcf87-8b4gd\" (UID: \"bdf337ce-e7d5-4de9-acb8-a98a481a8ab3\") " pod="openstack/dnsmasq-dns-cb6ffcf87-8b4gd" Oct 09 14:10:21 crc kubenswrapper[4902]: I1009 14:10:21.201772 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bdf337ce-e7d5-4de9-acb8-a98a481a8ab3-dns-swift-storage-0\") pod \"dnsmasq-dns-cb6ffcf87-8b4gd\" (UID: \"bdf337ce-e7d5-4de9-acb8-a98a481a8ab3\") " pod="openstack/dnsmasq-dns-cb6ffcf87-8b4gd" Oct 09 14:10:21 crc kubenswrapper[4902]: I1009 14:10:21.201966 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bdf337ce-e7d5-4de9-acb8-a98a481a8ab3-dns-svc\") pod \"dnsmasq-dns-cb6ffcf87-8b4gd\" (UID: \"bdf337ce-e7d5-4de9-acb8-a98a481a8ab3\") " pod="openstack/dnsmasq-dns-cb6ffcf87-8b4gd" Oct 09 14:10:21 crc kubenswrapper[4902]: I1009 14:10:21.201988 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bdf337ce-e7d5-4de9-acb8-a98a481a8ab3-ovsdbserver-sb\") pod \"dnsmasq-dns-cb6ffcf87-8b4gd\" (UID: \"bdf337ce-e7d5-4de9-acb8-a98a481a8ab3\") " pod="openstack/dnsmasq-dns-cb6ffcf87-8b4gd" Oct 09 14:10:21 crc kubenswrapper[4902]: I1009 14:10:21.304354 4902 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bdf337ce-e7d5-4de9-acb8-a98a481a8ab3-dns-swift-storage-0\") pod \"dnsmasq-dns-cb6ffcf87-8b4gd\" (UID: \"bdf337ce-e7d5-4de9-acb8-a98a481a8ab3\") " pod="openstack/dnsmasq-dns-cb6ffcf87-8b4gd" Oct 09 14:10:21 crc kubenswrapper[4902]: I1009 14:10:21.304482 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bdf337ce-e7d5-4de9-acb8-a98a481a8ab3-dns-svc\") pod \"dnsmasq-dns-cb6ffcf87-8b4gd\" (UID: \"bdf337ce-e7d5-4de9-acb8-a98a481a8ab3\") " pod="openstack/dnsmasq-dns-cb6ffcf87-8b4gd" Oct 09 14:10:21 crc kubenswrapper[4902]: I1009 14:10:21.304501 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bdf337ce-e7d5-4de9-acb8-a98a481a8ab3-ovsdbserver-sb\") pod \"dnsmasq-dns-cb6ffcf87-8b4gd\" (UID: \"bdf337ce-e7d5-4de9-acb8-a98a481a8ab3\") " pod="openstack/dnsmasq-dns-cb6ffcf87-8b4gd" Oct 09 14:10:21 crc kubenswrapper[4902]: I1009 14:10:21.304542 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jwg7x\" (UniqueName: \"kubernetes.io/projected/bdf337ce-e7d5-4de9-acb8-a98a481a8ab3-kube-api-access-jwg7x\") pod \"dnsmasq-dns-cb6ffcf87-8b4gd\" (UID: \"bdf337ce-e7d5-4de9-acb8-a98a481a8ab3\") " pod="openstack/dnsmasq-dns-cb6ffcf87-8b4gd" Oct 09 14:10:21 crc kubenswrapper[4902]: I1009 14:10:21.304568 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bdf337ce-e7d5-4de9-acb8-a98a481a8ab3-ovsdbserver-nb\") pod \"dnsmasq-dns-cb6ffcf87-8b4gd\" (UID: \"bdf337ce-e7d5-4de9-acb8-a98a481a8ab3\") " pod="openstack/dnsmasq-dns-cb6ffcf87-8b4gd" Oct 09 14:10:21 crc kubenswrapper[4902]: I1009 14:10:21.304590 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/bdf337ce-e7d5-4de9-acb8-a98a481a8ab3-openstack-edpm-ipam\") pod \"dnsmasq-dns-cb6ffcf87-8b4gd\" (UID: \"bdf337ce-e7d5-4de9-acb8-a98a481a8ab3\") " pod="openstack/dnsmasq-dns-cb6ffcf87-8b4gd" Oct 09 14:10:21 crc kubenswrapper[4902]: I1009 14:10:21.304704 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bdf337ce-e7d5-4de9-acb8-a98a481a8ab3-config\") pod \"dnsmasq-dns-cb6ffcf87-8b4gd\" (UID: \"bdf337ce-e7d5-4de9-acb8-a98a481a8ab3\") " pod="openstack/dnsmasq-dns-cb6ffcf87-8b4gd" Oct 09 14:10:21 crc kubenswrapper[4902]: I1009 14:10:21.305856 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bdf337ce-e7d5-4de9-acb8-a98a481a8ab3-config\") pod \"dnsmasq-dns-cb6ffcf87-8b4gd\" (UID: \"bdf337ce-e7d5-4de9-acb8-a98a481a8ab3\") " pod="openstack/dnsmasq-dns-cb6ffcf87-8b4gd" Oct 09 14:10:21 crc kubenswrapper[4902]: I1009 14:10:21.305894 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bdf337ce-e7d5-4de9-acb8-a98a481a8ab3-dns-swift-storage-0\") pod \"dnsmasq-dns-cb6ffcf87-8b4gd\" (UID: \"bdf337ce-e7d5-4de9-acb8-a98a481a8ab3\") " pod="openstack/dnsmasq-dns-cb6ffcf87-8b4gd" Oct 09 14:10:21 crc kubenswrapper[4902]: I1009 14:10:21.306107 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/bdf337ce-e7d5-4de9-acb8-a98a481a8ab3-dns-svc\") pod \"dnsmasq-dns-cb6ffcf87-8b4gd\" (UID: \"bdf337ce-e7d5-4de9-acb8-a98a481a8ab3\") " pod="openstack/dnsmasq-dns-cb6ffcf87-8b4gd" Oct 09 14:10:21 crc kubenswrapper[4902]: I1009 14:10:21.306630 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bdf337ce-e7d5-4de9-acb8-a98a481a8ab3-ovsdbserver-nb\") pod \"dnsmasq-dns-cb6ffcf87-8b4gd\" (UID: \"bdf337ce-e7d5-4de9-acb8-a98a481a8ab3\") " pod="openstack/dnsmasq-dns-cb6ffcf87-8b4gd" Oct 09 14:10:21 crc kubenswrapper[4902]: I1009 14:10:21.306933 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/bdf337ce-e7d5-4de9-acb8-a98a481a8ab3-openstack-edpm-ipam\") pod \"dnsmasq-dns-cb6ffcf87-8b4gd\" (UID: \"bdf337ce-e7d5-4de9-acb8-a98a481a8ab3\") " pod="openstack/dnsmasq-dns-cb6ffcf87-8b4gd" Oct 09 14:10:21 crc kubenswrapper[4902]: I1009 14:10:21.307235 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bdf337ce-e7d5-4de9-acb8-a98a481a8ab3-ovsdbserver-sb\") pod \"dnsmasq-dns-cb6ffcf87-8b4gd\" (UID: \"bdf337ce-e7d5-4de9-acb8-a98a481a8ab3\") " pod="openstack/dnsmasq-dns-cb6ffcf87-8b4gd" Oct 09 14:10:21 crc kubenswrapper[4902]: I1009 14:10:21.328378 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jwg7x\" (UniqueName: \"kubernetes.io/projected/bdf337ce-e7d5-4de9-acb8-a98a481a8ab3-kube-api-access-jwg7x\") pod \"dnsmasq-dns-cb6ffcf87-8b4gd\" (UID: \"bdf337ce-e7d5-4de9-acb8-a98a481a8ab3\") " pod="openstack/dnsmasq-dns-cb6ffcf87-8b4gd" Oct 09 14:10:21 crc kubenswrapper[4902]: I1009 14:10:21.476801 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-cb6ffcf87-8b4gd" Oct 09 14:10:21 crc kubenswrapper[4902]: I1009 14:10:21.488001 4902 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-59cf4bdb65-985vj" Oct 09 14:10:21 crc kubenswrapper[4902]: I1009 14:10:21.610392 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7e21b8a0-0cbd-4672-9aab-e15ed5b44309-ovsdbserver-nb\") pod \"7e21b8a0-0cbd-4672-9aab-e15ed5b44309\" (UID: \"7e21b8a0-0cbd-4672-9aab-e15ed5b44309\") " Oct 09 14:10:21 crc kubenswrapper[4902]: I1009 14:10:21.610634 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7e21b8a0-0cbd-4672-9aab-e15ed5b44309-dns-swift-storage-0\") pod \"7e21b8a0-0cbd-4672-9aab-e15ed5b44309\" (UID: \"7e21b8a0-0cbd-4672-9aab-e15ed5b44309\") " Oct 09 14:10:21 crc kubenswrapper[4902]: I1009 14:10:21.610693 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d92ls\" (UniqueName: \"kubernetes.io/projected/7e21b8a0-0cbd-4672-9aab-e15ed5b44309-kube-api-access-d92ls\") pod \"7e21b8a0-0cbd-4672-9aab-e15ed5b44309\" (UID: \"7e21b8a0-0cbd-4672-9aab-e15ed5b44309\") " Oct 09 14:10:21 crc kubenswrapper[4902]: I1009 14:10:21.610817 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7e21b8a0-0cbd-4672-9aab-e15ed5b44309-ovsdbserver-sb\") pod \"7e21b8a0-0cbd-4672-9aab-e15ed5b44309\" (UID: \"7e21b8a0-0cbd-4672-9aab-e15ed5b44309\") " Oct 09 14:10:21 crc kubenswrapper[4902]: I1009 14:10:21.610842 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e21b8a0-0cbd-4672-9aab-e15ed5b44309-config\") pod \"7e21b8a0-0cbd-4672-9aab-e15ed5b44309\" (UID: \"7e21b8a0-0cbd-4672-9aab-e15ed5b44309\") " Oct 09 14:10:21 crc kubenswrapper[4902]: I1009 14:10:21.610898 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7e21b8a0-0cbd-4672-9aab-e15ed5b44309-dns-svc\") pod \"7e21b8a0-0cbd-4672-9aab-e15ed5b44309\" (UID: \"7e21b8a0-0cbd-4672-9aab-e15ed5b44309\") " Oct 09 14:10:21 crc kubenswrapper[4902]: I1009 14:10:21.620729 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e21b8a0-0cbd-4672-9aab-e15ed5b44309-kube-api-access-d92ls" (OuterVolumeSpecName: "kube-api-access-d92ls") pod "7e21b8a0-0cbd-4672-9aab-e15ed5b44309" (UID: "7e21b8a0-0cbd-4672-9aab-e15ed5b44309"). InnerVolumeSpecName "kube-api-access-d92ls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 14:10:21 crc kubenswrapper[4902]: I1009 14:10:21.666831 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e21b8a0-0cbd-4672-9aab-e15ed5b44309-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "7e21b8a0-0cbd-4672-9aab-e15ed5b44309" (UID: "7e21b8a0-0cbd-4672-9aab-e15ed5b44309"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 14:10:21 crc kubenswrapper[4902]: I1009 14:10:21.675064 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e21b8a0-0cbd-4672-9aab-e15ed5b44309-config" (OuterVolumeSpecName: "config") pod "7e21b8a0-0cbd-4672-9aab-e15ed5b44309" (UID: "7e21b8a0-0cbd-4672-9aab-e15ed5b44309"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 14:10:21 crc kubenswrapper[4902]: I1009 14:10:21.678147 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e21b8a0-0cbd-4672-9aab-e15ed5b44309-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "7e21b8a0-0cbd-4672-9aab-e15ed5b44309" (UID: "7e21b8a0-0cbd-4672-9aab-e15ed5b44309"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 14:10:21 crc kubenswrapper[4902]: I1009 14:10:21.679056 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e21b8a0-0cbd-4672-9aab-e15ed5b44309-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "7e21b8a0-0cbd-4672-9aab-e15ed5b44309" (UID: "7e21b8a0-0cbd-4672-9aab-e15ed5b44309"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 14:10:21 crc kubenswrapper[4902]: I1009 14:10:21.691127 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e21b8a0-0cbd-4672-9aab-e15ed5b44309-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "7e21b8a0-0cbd-4672-9aab-e15ed5b44309" (UID: "7e21b8a0-0cbd-4672-9aab-e15ed5b44309"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 14:10:21 crc kubenswrapper[4902]: I1009 14:10:21.705873 4902 generic.go:334] "Generic (PLEG): container finished" podID="7e21b8a0-0cbd-4672-9aab-e15ed5b44309" containerID="3881f757369f3cc3e6b3099a1a7ca2d9455656b0bd5c249e88a2980eed4e7f46" exitCode=0 Oct 09 14:10:21 crc kubenswrapper[4902]: I1009 14:10:21.705909 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59cf4bdb65-985vj" event={"ID":"7e21b8a0-0cbd-4672-9aab-e15ed5b44309","Type":"ContainerDied","Data":"3881f757369f3cc3e6b3099a1a7ca2d9455656b0bd5c249e88a2980eed4e7f46"} Oct 09 14:10:21 crc kubenswrapper[4902]: I1009 14:10:21.705931 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59cf4bdb65-985vj" event={"ID":"7e21b8a0-0cbd-4672-9aab-e15ed5b44309","Type":"ContainerDied","Data":"2d931e4e9e2a3cd7d154857e10ab610015dc3ca7065c43b559d6526cc0118af8"} Oct 09 14:10:21 crc kubenswrapper[4902]: I1009 14:10:21.705955 4902 scope.go:117] "RemoveContainer" containerID="3881f757369f3cc3e6b3099a1a7ca2d9455656b0bd5c249e88a2980eed4e7f46" Oct 09 14:10:21 crc kubenswrapper[4902]: I1009 14:10:21.705966 4902 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-59cf4bdb65-985vj" Oct 09 14:10:21 crc kubenswrapper[4902]: I1009 14:10:21.713181 4902 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7e21b8a0-0cbd-4672-9aab-e15ed5b44309-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 09 14:10:21 crc kubenswrapper[4902]: I1009 14:10:21.713213 4902 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7e21b8a0-0cbd-4672-9aab-e15ed5b44309-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 09 14:10:21 crc kubenswrapper[4902]: I1009 14:10:21.713228 4902 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7e21b8a0-0cbd-4672-9aab-e15ed5b44309-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 09 14:10:21 crc kubenswrapper[4902]: I1009 14:10:21.713241 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d92ls\" (UniqueName: \"kubernetes.io/projected/7e21b8a0-0cbd-4672-9aab-e15ed5b44309-kube-api-access-d92ls\") on node \"crc\" DevicePath \"\"" Oct 09 14:10:21 crc kubenswrapper[4902]: I1009 14:10:21.713256 4902 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7e21b8a0-0cbd-4672-9aab-e15ed5b44309-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 09 14:10:21 crc kubenswrapper[4902]: I1009 14:10:21.713268 4902 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e21b8a0-0cbd-4672-9aab-e15ed5b44309-config\") on node \"crc\" DevicePath \"\"" Oct 09 14:10:21 crc kubenswrapper[4902]: I1009 14:10:21.728679 4902 scope.go:117] "RemoveContainer" containerID="e7a51b489f1b340ee23bcbea1c11898692d213374d0bae5bd4e8a2f41172ef76" Oct 09 14:10:21 crc kubenswrapper[4902]: I1009 14:10:21.747489 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-59cf4bdb65-985vj"] Oct 09 14:10:21 crc kubenswrapper[4902]: I1009 14:10:21.754816 4902 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-59cf4bdb65-985vj"] Oct 09 14:10:21 crc kubenswrapper[4902]: I1009 14:10:21.776048 4902 scope.go:117] "RemoveContainer" containerID="3881f757369f3cc3e6b3099a1a7ca2d9455656b0bd5c249e88a2980eed4e7f46" Oct 09 14:10:21 crc kubenswrapper[4902]: E1009 14:10:21.777266 4902 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3881f757369f3cc3e6b3099a1a7ca2d9455656b0bd5c249e88a2980eed4e7f46\": container with ID starting with 3881f757369f3cc3e6b3099a1a7ca2d9455656b0bd5c249e88a2980eed4e7f46 not found: ID does not exist" containerID="3881f757369f3cc3e6b3099a1a7ca2d9455656b0bd5c249e88a2980eed4e7f46" Oct 09 14:10:21 crc kubenswrapper[4902]: I1009 14:10:21.777308 4902 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3881f757369f3cc3e6b3099a1a7ca2d9455656b0bd5c249e88a2980eed4e7f46"} err="failed to get container status \"3881f757369f3cc3e6b3099a1a7ca2d9455656b0bd5c249e88a2980eed4e7f46\": rpc error: code = NotFound desc = could not find container \"3881f757369f3cc3e6b3099a1a7ca2d9455656b0bd5c249e88a2980eed4e7f46\": container with ID starting with 3881f757369f3cc3e6b3099a1a7ca2d9455656b0bd5c249e88a2980eed4e7f46 not found: ID does not exist" Oct 09 14:10:21 crc kubenswrapper[4902]: I1009 14:10:21.777329 4902 scope.go:117] "RemoveContainer" 
containerID="e7a51b489f1b340ee23bcbea1c11898692d213374d0bae5bd4e8a2f41172ef76" Oct 09 14:10:21 crc kubenswrapper[4902]: E1009 14:10:21.777697 4902 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e7a51b489f1b340ee23bcbea1c11898692d213374d0bae5bd4e8a2f41172ef76\": container with ID starting with e7a51b489f1b340ee23bcbea1c11898692d213374d0bae5bd4e8a2f41172ef76 not found: ID does not exist" containerID="e7a51b489f1b340ee23bcbea1c11898692d213374d0bae5bd4e8a2f41172ef76" Oct 09 14:10:21 crc kubenswrapper[4902]: I1009 14:10:21.777740 4902 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e7a51b489f1b340ee23bcbea1c11898692d213374d0bae5bd4e8a2f41172ef76"} err="failed to get container status \"e7a51b489f1b340ee23bcbea1c11898692d213374d0bae5bd4e8a2f41172ef76\": rpc error: code = NotFound desc = could not find container \"e7a51b489f1b340ee23bcbea1c11898692d213374d0bae5bd4e8a2f41172ef76\": container with ID starting with e7a51b489f1b340ee23bcbea1c11898692d213374d0bae5bd4e8a2f41172ef76 not found: ID does not exist" Oct 09 14:10:21 crc kubenswrapper[4902]: I1009 14:10:21.940868 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-cb6ffcf87-8b4gd"] Oct 09 14:10:21 crc kubenswrapper[4902]: W1009 14:10:21.942020 4902 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbdf337ce_e7d5_4de9_acb8_a98a481a8ab3.slice/crio-b36f46b9e2610a0a263001a2c93164a1d35396779ba130b5c3bdee191c780803 WatchSource:0}: Error finding container b36f46b9e2610a0a263001a2c93164a1d35396779ba130b5c3bdee191c780803: Status 404 returned error can't find the container with id b36f46b9e2610a0a263001a2c93164a1d35396779ba130b5c3bdee191c780803 Oct 09 14:10:22 crc kubenswrapper[4902]: I1009 14:10:22.717239 4902 generic.go:334] "Generic (PLEG): container finished" podID="bdf337ce-e7d5-4de9-acb8-a98a481a8ab3" containerID="3fccd857a54a3673f034fd03e7e73b8a5313fb02826c39f1c4bf04b655c0b090" exitCode=0 Oct 09 14:10:22 crc kubenswrapper[4902]: I1009 14:10:22.717306 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cb6ffcf87-8b4gd" event={"ID":"bdf337ce-e7d5-4de9-acb8-a98a481a8ab3","Type":"ContainerDied","Data":"3fccd857a54a3673f034fd03e7e73b8a5313fb02826c39f1c4bf04b655c0b090"} Oct 09 14:10:22 crc kubenswrapper[4902]: I1009 14:10:22.717531 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cb6ffcf87-8b4gd" event={"ID":"bdf337ce-e7d5-4de9-acb8-a98a481a8ab3","Type":"ContainerStarted","Data":"b36f46b9e2610a0a263001a2c93164a1d35396779ba130b5c3bdee191c780803"} Oct 09 14:10:23 crc kubenswrapper[4902]: I1009 14:10:23.523734 4902 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e21b8a0-0cbd-4672-9aab-e15ed5b44309" path="/var/lib/kubelet/pods/7e21b8a0-0cbd-4672-9aab-e15ed5b44309/volumes" Oct 09 14:10:23 crc kubenswrapper[4902]: I1009 14:10:23.729310 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cb6ffcf87-8b4gd" event={"ID":"bdf337ce-e7d5-4de9-acb8-a98a481a8ab3","Type":"ContainerStarted","Data":"8288f90a06770b912cb58da703dcb04e79d9a57db195772be9c5a4ec21bf20ca"} Oct 09 14:10:23 crc kubenswrapper[4902]: I1009 14:10:23.729597 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-cb6ffcf87-8b4gd" Oct 09 14:10:23 crc kubenswrapper[4902]: I1009 14:10:23.757177 4902 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-cb6ffcf87-8b4gd" podStartSLOduration=2.7571632729999997 podStartE2EDuration="2.757163273s" podCreationTimestamp="2025-10-09 14:10:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 14:10:23.751923827 +0000 UTC m=+1170.949782881" watchObservedRunningTime="2025-10-09 14:10:23.757163273 +0000 UTC m=+1170.955022337" Oct 09 14:10:26 crc kubenswrapper[4902]: I1009 14:10:26.368123 4902 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-59cf4bdb65-985vj" podUID="7e21b8a0-0cbd-4672-9aab-e15ed5b44309" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.201:5353: i/o timeout" Oct 09 14:10:31 crc kubenswrapper[4902]: I1009 14:10:31.478521 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-cb6ffcf87-8b4gd" Oct 09 14:10:31 crc kubenswrapper[4902]: I1009 14:10:31.552310 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-67b789f86c-7pvrm"] Oct 09 14:10:31 crc kubenswrapper[4902]: I1009 14:10:31.552562 4902 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-67b789f86c-7pvrm" podUID="f2eeda14-398c-4a24-b25e-d7fd737e3e35" containerName="dnsmasq-dns" containerID="cri-o://6aa4aabccc024501f5f0faac141f9778b7d0c301a35d08585bc38760c253883e" gracePeriod=10 Oct 09 14:10:31 crc kubenswrapper[4902]: I1009 14:10:31.809973 4902 generic.go:334] "Generic (PLEG): container finished" podID="f2eeda14-398c-4a24-b25e-d7fd737e3e35" containerID="6aa4aabccc024501f5f0faac141f9778b7d0c301a35d08585bc38760c253883e" exitCode=0 Oct 09 14:10:31 crc kubenswrapper[4902]: I1009 14:10:31.810186 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67b789f86c-7pvrm" event={"ID":"f2eeda14-398c-4a24-b25e-d7fd737e3e35","Type":"ContainerDied","Data":"6aa4aabccc024501f5f0faac141f9778b7d0c301a35d08585bc38760c253883e"} Oct 09 14:10:32 crc kubenswrapper[4902]: I1009 14:10:32.034819 4902 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-67b789f86c-7pvrm" Oct 09 14:10:32 crc kubenswrapper[4902]: I1009 14:10:32.204498 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f2eeda14-398c-4a24-b25e-d7fd737e3e35-dns-swift-storage-0\") pod \"f2eeda14-398c-4a24-b25e-d7fd737e3e35\" (UID: \"f2eeda14-398c-4a24-b25e-d7fd737e3e35\") " Oct 09 14:10:32 crc kubenswrapper[4902]: I1009 14:10:32.204577 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/f2eeda14-398c-4a24-b25e-d7fd737e3e35-openstack-edpm-ipam\") pod \"f2eeda14-398c-4a24-b25e-d7fd737e3e35\" (UID: \"f2eeda14-398c-4a24-b25e-d7fd737e3e35\") " Oct 09 14:10:32 crc kubenswrapper[4902]: I1009 14:10:32.204611 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f2eeda14-398c-4a24-b25e-d7fd737e3e35-ovsdbserver-nb\") pod \"f2eeda14-398c-4a24-b25e-d7fd737e3e35\" (UID: \"f2eeda14-398c-4a24-b25e-d7fd737e3e35\") " Oct 09 14:10:32 crc kubenswrapper[4902]: I1009 14:10:32.204688 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f2eeda14-398c-4a24-b25e-d7fd737e3e35-ovsdbserver-sb\") pod \"f2eeda14-398c-4a24-b25e-d7fd737e3e35\" (UID: \"f2eeda14-398c-4a24-b25e-d7fd737e3e35\") " Oct 09 14:10:32 crc kubenswrapper[4902]: I1009 14:10:32.204783 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fkmpc\" (UniqueName: \"kubernetes.io/projected/f2eeda14-398c-4a24-b25e-d7fd737e3e35-kube-api-access-fkmpc\") pod \"f2eeda14-398c-4a24-b25e-d7fd737e3e35\" (UID: \"f2eeda14-398c-4a24-b25e-d7fd737e3e35\") " Oct 09 14:10:32 crc kubenswrapper[4902]: I1009 14:10:32.204823 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f2eeda14-398c-4a24-b25e-d7fd737e3e35-config\") pod \"f2eeda14-398c-4a24-b25e-d7fd737e3e35\" (UID: \"f2eeda14-398c-4a24-b25e-d7fd737e3e35\") " Oct 09 14:10:32 crc kubenswrapper[4902]: I1009 14:10:32.204927 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f2eeda14-398c-4a24-b25e-d7fd737e3e35-dns-svc\") pod \"f2eeda14-398c-4a24-b25e-d7fd737e3e35\" (UID: \"f2eeda14-398c-4a24-b25e-d7fd737e3e35\") " Oct 09 14:10:32 crc kubenswrapper[4902]: I1009 14:10:32.211885 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f2eeda14-398c-4a24-b25e-d7fd737e3e35-kube-api-access-fkmpc" (OuterVolumeSpecName: "kube-api-access-fkmpc") pod "f2eeda14-398c-4a24-b25e-d7fd737e3e35" (UID: "f2eeda14-398c-4a24-b25e-d7fd737e3e35"). InnerVolumeSpecName "kube-api-access-fkmpc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 14:10:32 crc kubenswrapper[4902]: I1009 14:10:32.268077 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f2eeda14-398c-4a24-b25e-d7fd737e3e35-config" (OuterVolumeSpecName: "config") pod "f2eeda14-398c-4a24-b25e-d7fd737e3e35" (UID: "f2eeda14-398c-4a24-b25e-d7fd737e3e35"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 14:10:32 crc kubenswrapper[4902]: I1009 14:10:32.270212 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f2eeda14-398c-4a24-b25e-d7fd737e3e35-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f2eeda14-398c-4a24-b25e-d7fd737e3e35" (UID: "f2eeda14-398c-4a24-b25e-d7fd737e3e35"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 14:10:32 crc kubenswrapper[4902]: I1009 14:10:32.273291 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f2eeda14-398c-4a24-b25e-d7fd737e3e35-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f2eeda14-398c-4a24-b25e-d7fd737e3e35" (UID: "f2eeda14-398c-4a24-b25e-d7fd737e3e35"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 14:10:32 crc kubenswrapper[4902]: I1009 14:10:32.275494 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f2eeda14-398c-4a24-b25e-d7fd737e3e35-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "f2eeda14-398c-4a24-b25e-d7fd737e3e35" (UID: "f2eeda14-398c-4a24-b25e-d7fd737e3e35"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 14:10:32 crc kubenswrapper[4902]: I1009 14:10:32.276015 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f2eeda14-398c-4a24-b25e-d7fd737e3e35-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "f2eeda14-398c-4a24-b25e-d7fd737e3e35" (UID: "f2eeda14-398c-4a24-b25e-d7fd737e3e35"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 14:10:32 crc kubenswrapper[4902]: I1009 14:10:32.285526 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f2eeda14-398c-4a24-b25e-d7fd737e3e35-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f2eeda14-398c-4a24-b25e-d7fd737e3e35" (UID: "f2eeda14-398c-4a24-b25e-d7fd737e3e35"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 14:10:32 crc kubenswrapper[4902]: I1009 14:10:32.308775 4902 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f2eeda14-398c-4a24-b25e-d7fd737e3e35-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 09 14:10:32 crc kubenswrapper[4902]: I1009 14:10:32.308824 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fkmpc\" (UniqueName: \"kubernetes.io/projected/f2eeda14-398c-4a24-b25e-d7fd737e3e35-kube-api-access-fkmpc\") on node \"crc\" DevicePath \"\"" Oct 09 14:10:32 crc kubenswrapper[4902]: I1009 14:10:32.308883 4902 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f2eeda14-398c-4a24-b25e-d7fd737e3e35-config\") on node \"crc\" DevicePath \"\"" Oct 09 14:10:32 crc kubenswrapper[4902]: I1009 14:10:32.308901 4902 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f2eeda14-398c-4a24-b25e-d7fd737e3e35-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 09 14:10:32 crc kubenswrapper[4902]: I1009 14:10:32.308913 4902 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f2eeda14-398c-4a24-b25e-d7fd737e3e35-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 09 14:10:32 crc kubenswrapper[4902]: I1009 14:10:32.308923 4902 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/f2eeda14-398c-4a24-b25e-d7fd737e3e35-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Oct 09 14:10:32 crc kubenswrapper[4902]: I1009 14:10:32.308934 4902 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f2eeda14-398c-4a24-b25e-d7fd737e3e35-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 09 14:10:32 crc kubenswrapper[4902]: I1009 14:10:32.820925 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67b789f86c-7pvrm" event={"ID":"f2eeda14-398c-4a24-b25e-d7fd737e3e35","Type":"ContainerDied","Data":"0865aabb9e0fa6a6d3b8e9762404e05ad2b32173e55a41f6f3488342c6b3e787"} Oct 09 14:10:32 crc kubenswrapper[4902]: I1009 14:10:32.820985 4902 scope.go:117] "RemoveContainer" containerID="6aa4aabccc024501f5f0faac141f9778b7d0c301a35d08585bc38760c253883e" Oct 09 14:10:32 crc kubenswrapper[4902]: I1009 14:10:32.822209 4902 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-67b789f86c-7pvrm" Oct 09 14:10:32 crc kubenswrapper[4902]: I1009 14:10:32.843374 4902 scope.go:117] "RemoveContainer" containerID="078ec507ce30283a0485a177e2cb5be3d3a2c1d53fd7eb04c7ac29c18047f218" Oct 09 14:10:32 crc kubenswrapper[4902]: I1009 14:10:32.872456 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-67b789f86c-7pvrm"] Oct 09 14:10:32 crc kubenswrapper[4902]: I1009 14:10:32.884906 4902 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-67b789f86c-7pvrm"] Oct 09 14:10:33 crc kubenswrapper[4902]: I1009 14:10:33.524669 4902 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f2eeda14-398c-4a24-b25e-d7fd737e3e35" path="/var/lib/kubelet/pods/f2eeda14-398c-4a24-b25e-d7fd737e3e35/volumes" Oct 09 14:10:42 crc kubenswrapper[4902]: I1009 14:10:42.917629 4902 generic.go:334] "Generic (PLEG): container finished" podID="f0f31142-f615-421c-a863-1603f1cb31a0" containerID="924c3047bfef6697c22357d2bf2c5a2fcd21530765d6086875037db86dd60f68" exitCode=0 Oct 09 14:10:42 crc kubenswrapper[4902]: I1009 14:10:42.917751 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"f0f31142-f615-421c-a863-1603f1cb31a0","Type":"ContainerDied","Data":"924c3047bfef6697c22357d2bf2c5a2fcd21530765d6086875037db86dd60f68"} Oct 09 14:10:43 crc kubenswrapper[4902]: I1009 14:10:43.928787 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"f0f31142-f615-421c-a863-1603f1cb31a0","Type":"ContainerStarted","Data":"26ef37c82028e38a8664db695200ac468bffe8b38b2cd16c329430d427914512"} Oct 09 14:10:43 crc kubenswrapper[4902]: I1009 14:10:43.930284 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Oct 09 14:10:43 crc kubenswrapper[4902]: I1009 14:10:43.961893 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=36.961872138 podStartE2EDuration="36.961872138s" podCreationTimestamp="2025-10-09 14:10:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 14:10:43.955036796 +0000 UTC m=+1191.152895860" watchObservedRunningTime="2025-10-09 14:10:43.961872138 +0000 UTC m=+1191.159731192" Oct 09 14:10:44 crc kubenswrapper[4902]: I1009 14:10:44.697889 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-h4wt2"] Oct 09 14:10:44 crc kubenswrapper[4902]: E1009 14:10:44.698240 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2eeda14-398c-4a24-b25e-d7fd737e3e35" containerName="init" Oct 09 14:10:44 crc kubenswrapper[4902]: I1009 14:10:44.698256 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2eeda14-398c-4a24-b25e-d7fd737e3e35" containerName="init" Oct 09 14:10:44 crc kubenswrapper[4902]: E1009 14:10:44.698270 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e21b8a0-0cbd-4672-9aab-e15ed5b44309" containerName="init" Oct 09 14:10:44 crc kubenswrapper[4902]: I1009 14:10:44.698277 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e21b8a0-0cbd-4672-9aab-e15ed5b44309" containerName="init" Oct 09 14:10:44 crc kubenswrapper[4902]: E1009 14:10:44.698295 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e21b8a0-0cbd-4672-9aab-e15ed5b44309" containerName="dnsmasq-dns" Oct 
09 14:10:44 crc kubenswrapper[4902]: I1009 14:10:44.698302 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e21b8a0-0cbd-4672-9aab-e15ed5b44309" containerName="dnsmasq-dns" Oct 09 14:10:44 crc kubenswrapper[4902]: E1009 14:10:44.698317 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2eeda14-398c-4a24-b25e-d7fd737e3e35" containerName="dnsmasq-dns" Oct 09 14:10:44 crc kubenswrapper[4902]: I1009 14:10:44.698323 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2eeda14-398c-4a24-b25e-d7fd737e3e35" containerName="dnsmasq-dns" Oct 09 14:10:44 crc kubenswrapper[4902]: I1009 14:10:44.698538 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="f2eeda14-398c-4a24-b25e-d7fd737e3e35" containerName="dnsmasq-dns" Oct 09 14:10:44 crc kubenswrapper[4902]: I1009 14:10:44.698569 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e21b8a0-0cbd-4672-9aab-e15ed5b44309" containerName="dnsmasq-dns" Oct 09 14:10:44 crc kubenswrapper[4902]: I1009 14:10:44.699138 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-h4wt2" Oct 09 14:10:44 crc kubenswrapper[4902]: I1009 14:10:44.701185 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 09 14:10:44 crc kubenswrapper[4902]: I1009 14:10:44.702288 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 09 14:10:44 crc kubenswrapper[4902]: I1009 14:10:44.702604 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-kjff2" Oct 09 14:10:44 crc kubenswrapper[4902]: I1009 14:10:44.702947 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 09 14:10:44 crc kubenswrapper[4902]: I1009 14:10:44.733518 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-h4wt2"] Oct 09 14:10:44 crc kubenswrapper[4902]: I1009 14:10:44.850914 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/dc688ac8-1f96-4a97-adf2-151b28cca357-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-h4wt2\" (UID: \"dc688ac8-1f96-4a97-adf2-151b28cca357\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-h4wt2" Oct 09 14:10:44 crc kubenswrapper[4902]: I1009 14:10:44.850989 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc688ac8-1f96-4a97-adf2-151b28cca357-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-h4wt2\" (UID: \"dc688ac8-1f96-4a97-adf2-151b28cca357\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-h4wt2" Oct 09 14:10:44 crc kubenswrapper[4902]: I1009 14:10:44.851276 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dc688ac8-1f96-4a97-adf2-151b28cca357-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-h4wt2\" (UID: \"dc688ac8-1f96-4a97-adf2-151b28cca357\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-h4wt2" Oct 09 14:10:44 crc kubenswrapper[4902]: I1009 14:10:44.851452 4902 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-frd2z\" (UniqueName: \"kubernetes.io/projected/dc688ac8-1f96-4a97-adf2-151b28cca357-kube-api-access-frd2z\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-h4wt2\" (UID: \"dc688ac8-1f96-4a97-adf2-151b28cca357\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-h4wt2" Oct 09 14:10:44 crc kubenswrapper[4902]: I1009 14:10:44.938602 4902 generic.go:334] "Generic (PLEG): container finished" podID="ef20e2e8-fcf0-438a-80a3-fd50db544b6e" containerID="c8c9d998d8d814a3300d12375610aba02055cfbd7f3c5ecb91490a90a0f2319c" exitCode=0 Oct 09 14:10:44 crc kubenswrapper[4902]: I1009 14:10:44.938706 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"ef20e2e8-fcf0-438a-80a3-fd50db544b6e","Type":"ContainerDied","Data":"c8c9d998d8d814a3300d12375610aba02055cfbd7f3c5ecb91490a90a0f2319c"} Oct 09 14:10:44 crc kubenswrapper[4902]: I1009 14:10:44.953227 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dc688ac8-1f96-4a97-adf2-151b28cca357-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-h4wt2\" (UID: \"dc688ac8-1f96-4a97-adf2-151b28cca357\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-h4wt2" Oct 09 14:10:44 crc kubenswrapper[4902]: I1009 14:10:44.953310 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-frd2z\" (UniqueName: \"kubernetes.io/projected/dc688ac8-1f96-4a97-adf2-151b28cca357-kube-api-access-frd2z\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-h4wt2\" (UID: \"dc688ac8-1f96-4a97-adf2-151b28cca357\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-h4wt2" Oct 09 14:10:44 crc kubenswrapper[4902]: I1009 14:10:44.953447 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/dc688ac8-1f96-4a97-adf2-151b28cca357-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-h4wt2\" (UID: \"dc688ac8-1f96-4a97-adf2-151b28cca357\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-h4wt2" Oct 09 14:10:44 crc kubenswrapper[4902]: I1009 14:10:44.953493 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc688ac8-1f96-4a97-adf2-151b28cca357-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-h4wt2\" (UID: \"dc688ac8-1f96-4a97-adf2-151b28cca357\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-h4wt2" Oct 09 14:10:44 crc kubenswrapper[4902]: I1009 14:10:44.957082 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc688ac8-1f96-4a97-adf2-151b28cca357-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-h4wt2\" (UID: \"dc688ac8-1f96-4a97-adf2-151b28cca357\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-h4wt2" Oct 09 14:10:44 crc kubenswrapper[4902]: I1009 14:10:44.960216 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dc688ac8-1f96-4a97-adf2-151b28cca357-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-h4wt2\" (UID: \"dc688ac8-1f96-4a97-adf2-151b28cca357\") " 
pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-h4wt2" Oct 09 14:10:44 crc kubenswrapper[4902]: I1009 14:10:44.961801 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/dc688ac8-1f96-4a97-adf2-151b28cca357-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-h4wt2\" (UID: \"dc688ac8-1f96-4a97-adf2-151b28cca357\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-h4wt2" Oct 09 14:10:44 crc kubenswrapper[4902]: I1009 14:10:44.975884 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-frd2z\" (UniqueName: \"kubernetes.io/projected/dc688ac8-1f96-4a97-adf2-151b28cca357-kube-api-access-frd2z\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-h4wt2\" (UID: \"dc688ac8-1f96-4a97-adf2-151b28cca357\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-h4wt2" Oct 09 14:10:45 crc kubenswrapper[4902]: I1009 14:10:45.017892 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-h4wt2" Oct 09 14:10:45 crc kubenswrapper[4902]: I1009 14:10:45.643638 4902 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 09 14:10:45 crc kubenswrapper[4902]: I1009 14:10:45.659804 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-h4wt2"] Oct 09 14:10:45 crc kubenswrapper[4902]: I1009 14:10:45.949991 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"ef20e2e8-fcf0-438a-80a3-fd50db544b6e","Type":"ContainerStarted","Data":"54c2b9226eccd987fa80fc2e61de16df57e83925e0d6735ef512a0b5a6809790"} Oct 09 14:10:45 crc kubenswrapper[4902]: I1009 14:10:45.950247 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Oct 09 14:10:45 crc kubenswrapper[4902]: I1009 14:10:45.954673 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-h4wt2" event={"ID":"dc688ac8-1f96-4a97-adf2-151b28cca357","Type":"ContainerStarted","Data":"7df1de57911bb3df8548eaf3318e756d4eb1df879e77edcae2e8aa1bb8f03eec"} Oct 09 14:10:45 crc kubenswrapper[4902]: I1009 14:10:45.984804 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=36.984777895 podStartE2EDuration="36.984777895s" podCreationTimestamp="2025-10-09 14:10:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 14:10:45.971859202 +0000 UTC m=+1193.169718286" watchObservedRunningTime="2025-10-09 14:10:45.984777895 +0000 UTC m=+1193.182636969" Oct 09 14:10:54 crc kubenswrapper[4902]: I1009 14:10:54.895942 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 09 14:10:56 crc kubenswrapper[4902]: I1009 14:10:56.059503 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-h4wt2" event={"ID":"dc688ac8-1f96-4a97-adf2-151b28cca357","Type":"ContainerStarted","Data":"5e4a6fb2afd5a59c347c66c614f3ec577fab549d816da6a51978eeebf0755d9b"} Oct 09 14:10:56 crc kubenswrapper[4902]: I1009 14:10:56.078247 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-h4wt2" podStartSLOduration=2.827636313 podStartE2EDuration="12.078227354s" podCreationTimestamp="2025-10-09 14:10:44 +0000 UTC" firstStartedPulling="2025-10-09 14:10:45.643366288 +0000 UTC m=+1192.841225352" lastFinishedPulling="2025-10-09 14:10:54.893957329 +0000 UTC m=+1202.091816393" observedRunningTime="2025-10-09 14:10:56.07405368 +0000 UTC m=+1203.271912754" watchObservedRunningTime="2025-10-09 14:10:56.078227354 +0000 UTC m=+1203.276086418" Oct 09 14:10:58 crc kubenswrapper[4902]: I1009 14:10:58.211642 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Oct 09 14:10:59 crc kubenswrapper[4902]: I1009 14:10:59.980904 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Oct 09 14:11:07 crc kubenswrapper[4902]: I1009 14:11:07.169300 4902 generic.go:334] "Generic (PLEG): container finished" podID="dc688ac8-1f96-4a97-adf2-151b28cca357" containerID="5e4a6fb2afd5a59c347c66c614f3ec577fab549d816da6a51978eeebf0755d9b" exitCode=0 Oct 09 14:11:07 crc kubenswrapper[4902]: I1009 14:11:07.169559 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-h4wt2" event={"ID":"dc688ac8-1f96-4a97-adf2-151b28cca357","Type":"ContainerDied","Data":"5e4a6fb2afd5a59c347c66c614f3ec577fab549d816da6a51978eeebf0755d9b"} Oct 09 14:11:08 crc kubenswrapper[4902]: I1009 14:11:08.562620 4902 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-h4wt2" Oct 09 14:11:08 crc kubenswrapper[4902]: I1009 14:11:08.632602 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dc688ac8-1f96-4a97-adf2-151b28cca357-inventory\") pod \"dc688ac8-1f96-4a97-adf2-151b28cca357\" (UID: \"dc688ac8-1f96-4a97-adf2-151b28cca357\") " Oct 09 14:11:08 crc kubenswrapper[4902]: I1009 14:11:08.632660 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc688ac8-1f96-4a97-adf2-151b28cca357-repo-setup-combined-ca-bundle\") pod \"dc688ac8-1f96-4a97-adf2-151b28cca357\" (UID: \"dc688ac8-1f96-4a97-adf2-151b28cca357\") " Oct 09 14:11:08 crc kubenswrapper[4902]: I1009 14:11:08.632747 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-frd2z\" (UniqueName: \"kubernetes.io/projected/dc688ac8-1f96-4a97-adf2-151b28cca357-kube-api-access-frd2z\") pod \"dc688ac8-1f96-4a97-adf2-151b28cca357\" (UID: \"dc688ac8-1f96-4a97-adf2-151b28cca357\") " Oct 09 14:11:08 crc kubenswrapper[4902]: I1009 14:11:08.632890 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/dc688ac8-1f96-4a97-adf2-151b28cca357-ssh-key\") pod \"dc688ac8-1f96-4a97-adf2-151b28cca357\" (UID: \"dc688ac8-1f96-4a97-adf2-151b28cca357\") " Oct 09 14:11:08 crc kubenswrapper[4902]: I1009 14:11:08.642245 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc688ac8-1f96-4a97-adf2-151b28cca357-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "dc688ac8-1f96-4a97-adf2-151b28cca357" (UID: "dc688ac8-1f96-4a97-adf2-151b28cca357"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:11:08 crc kubenswrapper[4902]: I1009 14:11:08.674066 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc688ac8-1f96-4a97-adf2-151b28cca357-kube-api-access-frd2z" (OuterVolumeSpecName: "kube-api-access-frd2z") pod "dc688ac8-1f96-4a97-adf2-151b28cca357" (UID: "dc688ac8-1f96-4a97-adf2-151b28cca357"). InnerVolumeSpecName "kube-api-access-frd2z". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 14:11:08 crc kubenswrapper[4902]: I1009 14:11:08.701597 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc688ac8-1f96-4a97-adf2-151b28cca357-inventory" (OuterVolumeSpecName: "inventory") pod "dc688ac8-1f96-4a97-adf2-151b28cca357" (UID: "dc688ac8-1f96-4a97-adf2-151b28cca357"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:11:08 crc kubenswrapper[4902]: I1009 14:11:08.735736 4902 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dc688ac8-1f96-4a97-adf2-151b28cca357-inventory\") on node \"crc\" DevicePath \"\"" Oct 09 14:11:08 crc kubenswrapper[4902]: I1009 14:11:08.735791 4902 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc688ac8-1f96-4a97-adf2-151b28cca357-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 14:11:08 crc kubenswrapper[4902]: I1009 14:11:08.735807 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-frd2z\" (UniqueName: \"kubernetes.io/projected/dc688ac8-1f96-4a97-adf2-151b28cca357-kube-api-access-frd2z\") on node \"crc\" DevicePath \"\"" Oct 09 14:11:08 crc kubenswrapper[4902]: I1009 14:11:08.798835 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc688ac8-1f96-4a97-adf2-151b28cca357-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "dc688ac8-1f96-4a97-adf2-151b28cca357" (UID: "dc688ac8-1f96-4a97-adf2-151b28cca357"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:11:08 crc kubenswrapper[4902]: I1009 14:11:08.837952 4902 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/dc688ac8-1f96-4a97-adf2-151b28cca357-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 09 14:11:09 crc kubenswrapper[4902]: I1009 14:11:09.193490 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-h4wt2" event={"ID":"dc688ac8-1f96-4a97-adf2-151b28cca357","Type":"ContainerDied","Data":"7df1de57911bb3df8548eaf3318e756d4eb1df879e77edcae2e8aa1bb8f03eec"} Oct 09 14:11:09 crc kubenswrapper[4902]: I1009 14:11:09.193792 4902 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7df1de57911bb3df8548eaf3318e756d4eb1df879e77edcae2e8aa1bb8f03eec" Oct 09 14:11:09 crc kubenswrapper[4902]: I1009 14:11:09.193558 4902 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-h4wt2" Oct 09 14:11:09 crc kubenswrapper[4902]: I1009 14:11:09.273589 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-ghc9w"] Oct 09 14:11:09 crc kubenswrapper[4902]: E1009 14:11:09.274078 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc688ac8-1f96-4a97-adf2-151b28cca357" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Oct 09 14:11:09 crc kubenswrapper[4902]: I1009 14:11:09.274101 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc688ac8-1f96-4a97-adf2-151b28cca357" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Oct 09 14:11:09 crc kubenswrapper[4902]: I1009 14:11:09.274342 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc688ac8-1f96-4a97-adf2-151b28cca357" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Oct 09 14:11:09 crc kubenswrapper[4902]: I1009 14:11:09.275084 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-ghc9w" Oct 09 14:11:09 crc kubenswrapper[4902]: I1009 14:11:09.277366 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-kjff2" Oct 09 14:11:09 crc kubenswrapper[4902]: I1009 14:11:09.277688 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 09 14:11:09 crc kubenswrapper[4902]: I1009 14:11:09.277755 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 09 14:11:09 crc kubenswrapper[4902]: I1009 14:11:09.282173 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 09 14:11:09 crc kubenswrapper[4902]: I1009 14:11:09.294420 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-ghc9w"] Oct 09 14:11:09 crc kubenswrapper[4902]: I1009 14:11:09.345562 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2cfa6eb9-6c46-4420-a165-d7a1c4d7713a-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-ghc9w\" (UID: \"2cfa6eb9-6c46-4420-a165-d7a1c4d7713a\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-ghc9w" Oct 09 14:11:09 crc kubenswrapper[4902]: I1009 14:11:09.345693 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l6prj\" (UniqueName: \"kubernetes.io/projected/2cfa6eb9-6c46-4420-a165-d7a1c4d7713a-kube-api-access-l6prj\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-ghc9w\" (UID: \"2cfa6eb9-6c46-4420-a165-d7a1c4d7713a\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-ghc9w" Oct 09 14:11:09 crc kubenswrapper[4902]: I1009 14:11:09.345758 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2cfa6eb9-6c46-4420-a165-d7a1c4d7713a-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-ghc9w\" (UID: \"2cfa6eb9-6c46-4420-a165-d7a1c4d7713a\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-ghc9w" Oct 09 14:11:09 crc kubenswrapper[4902]: I1009 14:11:09.447502 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-l6prj\" (UniqueName: \"kubernetes.io/projected/2cfa6eb9-6c46-4420-a165-d7a1c4d7713a-kube-api-access-l6prj\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-ghc9w\" (UID: \"2cfa6eb9-6c46-4420-a165-d7a1c4d7713a\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-ghc9w" Oct 09 14:11:09 crc kubenswrapper[4902]: I1009 14:11:09.447583 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2cfa6eb9-6c46-4420-a165-d7a1c4d7713a-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-ghc9w\" (UID: \"2cfa6eb9-6c46-4420-a165-d7a1c4d7713a\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-ghc9w" Oct 09 14:11:09 crc kubenswrapper[4902]: I1009 14:11:09.447653 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2cfa6eb9-6c46-4420-a165-d7a1c4d7713a-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-ghc9w\" (UID: \"2cfa6eb9-6c46-4420-a165-d7a1c4d7713a\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-ghc9w" Oct 09 14:11:09 crc kubenswrapper[4902]: I1009 14:11:09.451484 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2cfa6eb9-6c46-4420-a165-d7a1c4d7713a-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-ghc9w\" (UID: \"2cfa6eb9-6c46-4420-a165-d7a1c4d7713a\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-ghc9w" Oct 09 14:11:09 crc kubenswrapper[4902]: I1009 14:11:09.451756 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2cfa6eb9-6c46-4420-a165-d7a1c4d7713a-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-ghc9w\" (UID: \"2cfa6eb9-6c46-4420-a165-d7a1c4d7713a\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-ghc9w" Oct 09 14:11:09 crc kubenswrapper[4902]: I1009 14:11:09.463620 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l6prj\" (UniqueName: \"kubernetes.io/projected/2cfa6eb9-6c46-4420-a165-d7a1c4d7713a-kube-api-access-l6prj\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-ghc9w\" (UID: \"2cfa6eb9-6c46-4420-a165-d7a1c4d7713a\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-ghc9w" Oct 09 14:11:09 crc kubenswrapper[4902]: I1009 14:11:09.599794 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-ghc9w" Oct 09 14:11:10 crc kubenswrapper[4902]: I1009 14:11:10.116914 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-ghc9w"] Oct 09 14:11:10 crc kubenswrapper[4902]: I1009 14:11:10.203028 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-ghc9w" event={"ID":"2cfa6eb9-6c46-4420-a165-d7a1c4d7713a","Type":"ContainerStarted","Data":"be11c62c767e3f6d217caaadf6baa52a4d45cc8e04dfa28c75d97e2ff18fad91"} Oct 09 14:11:11 crc kubenswrapper[4902]: I1009 14:11:11.214153 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-ghc9w" event={"ID":"2cfa6eb9-6c46-4420-a165-d7a1c4d7713a","Type":"ContainerStarted","Data":"84debeade34f6494d525147abf7c1ec410a31db8f38c8be5a42756c89b3fbdb5"} Oct 09 14:11:11 crc kubenswrapper[4902]: I1009 14:11:11.231547 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-ghc9w" podStartSLOduration=1.7130342509999998 podStartE2EDuration="2.231528737s" podCreationTimestamp="2025-10-09 14:11:09 +0000 UTC" firstStartedPulling="2025-10-09 14:11:10.121083639 +0000 UTC m=+1217.318942723" lastFinishedPulling="2025-10-09 14:11:10.639578145 +0000 UTC m=+1217.837437209" observedRunningTime="2025-10-09 14:11:11.230776824 +0000 UTC m=+1218.428635898" watchObservedRunningTime="2025-10-09 14:11:11.231528737 +0000 UTC m=+1218.429387791" Oct 09 14:11:14 crc kubenswrapper[4902]: I1009 14:11:14.242959 4902 generic.go:334] "Generic (PLEG): container finished" podID="2cfa6eb9-6c46-4420-a165-d7a1c4d7713a" containerID="84debeade34f6494d525147abf7c1ec410a31db8f38c8be5a42756c89b3fbdb5" exitCode=0 Oct 09 14:11:14 crc kubenswrapper[4902]: I1009 14:11:14.243031 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-ghc9w" event={"ID":"2cfa6eb9-6c46-4420-a165-d7a1c4d7713a","Type":"ContainerDied","Data":"84debeade34f6494d525147abf7c1ec410a31db8f38c8be5a42756c89b3fbdb5"} Oct 09 14:11:15 crc kubenswrapper[4902]: I1009 14:11:15.629473 4902 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-ghc9w" Oct 09 14:11:15 crc kubenswrapper[4902]: I1009 14:11:15.664166 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l6prj\" (UniqueName: \"kubernetes.io/projected/2cfa6eb9-6c46-4420-a165-d7a1c4d7713a-kube-api-access-l6prj\") pod \"2cfa6eb9-6c46-4420-a165-d7a1c4d7713a\" (UID: \"2cfa6eb9-6c46-4420-a165-d7a1c4d7713a\") " Oct 09 14:11:15 crc kubenswrapper[4902]: I1009 14:11:15.664788 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2cfa6eb9-6c46-4420-a165-d7a1c4d7713a-inventory\") pod \"2cfa6eb9-6c46-4420-a165-d7a1c4d7713a\" (UID: \"2cfa6eb9-6c46-4420-a165-d7a1c4d7713a\") " Oct 09 14:11:15 crc kubenswrapper[4902]: I1009 14:11:15.664822 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2cfa6eb9-6c46-4420-a165-d7a1c4d7713a-ssh-key\") pod \"2cfa6eb9-6c46-4420-a165-d7a1c4d7713a\" (UID: \"2cfa6eb9-6c46-4420-a165-d7a1c4d7713a\") " Oct 09 14:11:15 crc kubenswrapper[4902]: I1009 14:11:15.670932 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2cfa6eb9-6c46-4420-a165-d7a1c4d7713a-kube-api-access-l6prj" (OuterVolumeSpecName: "kube-api-access-l6prj") pod "2cfa6eb9-6c46-4420-a165-d7a1c4d7713a" (UID: "2cfa6eb9-6c46-4420-a165-d7a1c4d7713a"). InnerVolumeSpecName "kube-api-access-l6prj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 14:11:15 crc kubenswrapper[4902]: I1009 14:11:15.701458 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2cfa6eb9-6c46-4420-a165-d7a1c4d7713a-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "2cfa6eb9-6c46-4420-a165-d7a1c4d7713a" (UID: "2cfa6eb9-6c46-4420-a165-d7a1c4d7713a"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:11:15 crc kubenswrapper[4902]: I1009 14:11:15.711137 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2cfa6eb9-6c46-4420-a165-d7a1c4d7713a-inventory" (OuterVolumeSpecName: "inventory") pod "2cfa6eb9-6c46-4420-a165-d7a1c4d7713a" (UID: "2cfa6eb9-6c46-4420-a165-d7a1c4d7713a"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:11:15 crc kubenswrapper[4902]: I1009 14:11:15.767269 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l6prj\" (UniqueName: \"kubernetes.io/projected/2cfa6eb9-6c46-4420-a165-d7a1c4d7713a-kube-api-access-l6prj\") on node \"crc\" DevicePath \"\"" Oct 09 14:11:15 crc kubenswrapper[4902]: I1009 14:11:15.767298 4902 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2cfa6eb9-6c46-4420-a165-d7a1c4d7713a-inventory\") on node \"crc\" DevicePath \"\"" Oct 09 14:11:15 crc kubenswrapper[4902]: I1009 14:11:15.767308 4902 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2cfa6eb9-6c46-4420-a165-d7a1c4d7713a-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 09 14:11:16 crc kubenswrapper[4902]: I1009 14:11:16.263912 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-ghc9w" event={"ID":"2cfa6eb9-6c46-4420-a165-d7a1c4d7713a","Type":"ContainerDied","Data":"be11c62c767e3f6d217caaadf6baa52a4d45cc8e04dfa28c75d97e2ff18fad91"} Oct 09 14:11:16 crc kubenswrapper[4902]: I1009 14:11:16.263967 4902 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="be11c62c767e3f6d217caaadf6baa52a4d45cc8e04dfa28c75d97e2ff18fad91" Oct 09 14:11:16 crc kubenswrapper[4902]: I1009 14:11:16.263974 4902 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-ghc9w" Oct 09 14:11:16 crc kubenswrapper[4902]: I1009 14:11:16.340165 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7x26n"] Oct 09 14:11:16 crc kubenswrapper[4902]: E1009 14:11:16.340674 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2cfa6eb9-6c46-4420-a165-d7a1c4d7713a" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Oct 09 14:11:16 crc kubenswrapper[4902]: I1009 14:11:16.340702 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="2cfa6eb9-6c46-4420-a165-d7a1c4d7713a" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Oct 09 14:11:16 crc kubenswrapper[4902]: I1009 14:11:16.341008 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="2cfa6eb9-6c46-4420-a165-d7a1c4d7713a" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Oct 09 14:11:16 crc kubenswrapper[4902]: I1009 14:11:16.341814 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7x26n" Oct 09 14:11:16 crc kubenswrapper[4902]: I1009 14:11:16.343663 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-kjff2" Oct 09 14:11:16 crc kubenswrapper[4902]: I1009 14:11:16.344183 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 09 14:11:16 crc kubenswrapper[4902]: I1009 14:11:16.344705 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 09 14:11:16 crc kubenswrapper[4902]: I1009 14:11:16.345349 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 09 14:11:16 crc kubenswrapper[4902]: I1009 14:11:16.352045 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7x26n"] Oct 09 14:11:16 crc kubenswrapper[4902]: I1009 14:11:16.378281 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/705cf92b-1b0d-4706-bf30-03fb1a9728cd-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-7x26n\" (UID: \"705cf92b-1b0d-4706-bf30-03fb1a9728cd\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7x26n" Oct 09 14:11:16 crc kubenswrapper[4902]: I1009 14:11:16.378334 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/705cf92b-1b0d-4706-bf30-03fb1a9728cd-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-7x26n\" (UID: \"705cf92b-1b0d-4706-bf30-03fb1a9728cd\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7x26n" Oct 09 14:11:16 crc kubenswrapper[4902]: I1009 14:11:16.378485 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/705cf92b-1b0d-4706-bf30-03fb1a9728cd-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-7x26n\" (UID: \"705cf92b-1b0d-4706-bf30-03fb1a9728cd\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7x26n" Oct 09 14:11:16 crc kubenswrapper[4902]: I1009 14:11:16.378561 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8m9gz\" (UniqueName: \"kubernetes.io/projected/705cf92b-1b0d-4706-bf30-03fb1a9728cd-kube-api-access-8m9gz\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-7x26n\" (UID: \"705cf92b-1b0d-4706-bf30-03fb1a9728cd\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7x26n" Oct 09 14:11:16 crc kubenswrapper[4902]: I1009 14:11:16.480070 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/705cf92b-1b0d-4706-bf30-03fb1a9728cd-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-7x26n\" (UID: \"705cf92b-1b0d-4706-bf30-03fb1a9728cd\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7x26n" Oct 09 14:11:16 crc kubenswrapper[4902]: I1009 14:11:16.480178 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8m9gz\" (UniqueName: \"kubernetes.io/projected/705cf92b-1b0d-4706-bf30-03fb1a9728cd-kube-api-access-8m9gz\") 
pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-7x26n\" (UID: \"705cf92b-1b0d-4706-bf30-03fb1a9728cd\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7x26n" Oct 09 14:11:16 crc kubenswrapper[4902]: I1009 14:11:16.480246 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/705cf92b-1b0d-4706-bf30-03fb1a9728cd-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-7x26n\" (UID: \"705cf92b-1b0d-4706-bf30-03fb1a9728cd\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7x26n" Oct 09 14:11:16 crc kubenswrapper[4902]: I1009 14:11:16.480269 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/705cf92b-1b0d-4706-bf30-03fb1a9728cd-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-7x26n\" (UID: \"705cf92b-1b0d-4706-bf30-03fb1a9728cd\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7x26n" Oct 09 14:11:16 crc kubenswrapper[4902]: I1009 14:11:16.485842 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/705cf92b-1b0d-4706-bf30-03fb1a9728cd-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-7x26n\" (UID: \"705cf92b-1b0d-4706-bf30-03fb1a9728cd\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7x26n" Oct 09 14:11:16 crc kubenswrapper[4902]: I1009 14:11:16.493010 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/705cf92b-1b0d-4706-bf30-03fb1a9728cd-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-7x26n\" (UID: \"705cf92b-1b0d-4706-bf30-03fb1a9728cd\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7x26n" Oct 09 14:11:16 crc kubenswrapper[4902]: I1009 14:11:16.493281 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/705cf92b-1b0d-4706-bf30-03fb1a9728cd-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-7x26n\" (UID: \"705cf92b-1b0d-4706-bf30-03fb1a9728cd\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7x26n" Oct 09 14:11:16 crc kubenswrapper[4902]: I1009 14:11:16.500989 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8m9gz\" (UniqueName: \"kubernetes.io/projected/705cf92b-1b0d-4706-bf30-03fb1a9728cd-kube-api-access-8m9gz\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-7x26n\" (UID: \"705cf92b-1b0d-4706-bf30-03fb1a9728cd\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7x26n" Oct 09 14:11:16 crc kubenswrapper[4902]: I1009 14:11:16.661424 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7x26n" Oct 09 14:11:17 crc kubenswrapper[4902]: I1009 14:11:17.170933 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7x26n"] Oct 09 14:11:17 crc kubenswrapper[4902]: I1009 14:11:17.291365 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7x26n" event={"ID":"705cf92b-1b0d-4706-bf30-03fb1a9728cd","Type":"ContainerStarted","Data":"de72b74efcbfdfc125bb4179d6252d279537382234ef9145bb9c503b3d5c6480"} Oct 09 14:11:18 crc kubenswrapper[4902]: I1009 14:11:18.301879 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7x26n" event={"ID":"705cf92b-1b0d-4706-bf30-03fb1a9728cd","Type":"ContainerStarted","Data":"5c6b29ad409a3941e111c97a7abb45330c29133de24fa21b684d7e8ed2764c9e"} Oct 09 14:11:18 crc kubenswrapper[4902]: I1009 14:11:18.336384 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7x26n" podStartSLOduration=1.886879951 podStartE2EDuration="2.336360521s" podCreationTimestamp="2025-10-09 14:11:16 +0000 UTC" firstStartedPulling="2025-10-09 14:11:17.177700785 +0000 UTC m=+1224.375559849" lastFinishedPulling="2025-10-09 14:11:17.627181345 +0000 UTC m=+1224.825040419" observedRunningTime="2025-10-09 14:11:18.321632605 +0000 UTC m=+1225.519491679" watchObservedRunningTime="2025-10-09 14:11:18.336360521 +0000 UTC m=+1225.534219605" Oct 09 14:12:02 crc kubenswrapper[4902]: I1009 14:12:02.091398 4902 scope.go:117] "RemoveContainer" containerID="3d9139f332643896c9a5dfb4564992a3f14e8a4a98d85e66efe7c26be95cfe1f" Oct 09 14:12:02 crc kubenswrapper[4902]: I1009 14:12:02.114319 4902 scope.go:117] "RemoveContainer" containerID="94da1deaed671e28bce767f1e58bd75e41d9ad66cb40063d63651bf975d4bfb3" Oct 09 14:12:20 crc kubenswrapper[4902]: I1009 14:12:20.077945 4902 patch_prober.go:28] interesting pod/machine-config-daemon-gbt7s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 14:12:20 crc kubenswrapper[4902]: I1009 14:12:20.078584 4902 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" podUID="6cfbac91-e798-4e5e-9f3c-f454ea6f457e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 14:12:50 crc kubenswrapper[4902]: I1009 14:12:50.078313 4902 patch_prober.go:28] interesting pod/machine-config-daemon-gbt7s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 14:12:50 crc kubenswrapper[4902]: I1009 14:12:50.078886 4902 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" podUID="6cfbac91-e798-4e5e-9f3c-f454ea6f457e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 14:12:58 crc kubenswrapper[4902]: I1009 14:12:58.741466 4902 kubelet.go:2421] "SyncLoop 
ADD" source="api" pods=["openshift-marketplace/community-operators-h5s8j"] Oct 09 14:12:58 crc kubenswrapper[4902]: I1009 14:12:58.744966 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-h5s8j" Oct 09 14:12:58 crc kubenswrapper[4902]: I1009 14:12:58.755766 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-h5s8j"] Oct 09 14:12:58 crc kubenswrapper[4902]: I1009 14:12:58.816492 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jttzr\" (UniqueName: \"kubernetes.io/projected/e6584d0b-96a9-441c-a14a-04d929731e99-kube-api-access-jttzr\") pod \"community-operators-h5s8j\" (UID: \"e6584d0b-96a9-441c-a14a-04d929731e99\") " pod="openshift-marketplace/community-operators-h5s8j" Oct 09 14:12:58 crc kubenswrapper[4902]: I1009 14:12:58.816568 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6584d0b-96a9-441c-a14a-04d929731e99-catalog-content\") pod \"community-operators-h5s8j\" (UID: \"e6584d0b-96a9-441c-a14a-04d929731e99\") " pod="openshift-marketplace/community-operators-h5s8j" Oct 09 14:12:58 crc kubenswrapper[4902]: I1009 14:12:58.816653 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6584d0b-96a9-441c-a14a-04d929731e99-utilities\") pod \"community-operators-h5s8j\" (UID: \"e6584d0b-96a9-441c-a14a-04d929731e99\") " pod="openshift-marketplace/community-operators-h5s8j" Oct 09 14:12:58 crc kubenswrapper[4902]: I1009 14:12:58.917709 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6584d0b-96a9-441c-a14a-04d929731e99-catalog-content\") pod \"community-operators-h5s8j\" (UID: \"e6584d0b-96a9-441c-a14a-04d929731e99\") " pod="openshift-marketplace/community-operators-h5s8j" Oct 09 14:12:58 crc kubenswrapper[4902]: I1009 14:12:58.917787 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6584d0b-96a9-441c-a14a-04d929731e99-utilities\") pod \"community-operators-h5s8j\" (UID: \"e6584d0b-96a9-441c-a14a-04d929731e99\") " pod="openshift-marketplace/community-operators-h5s8j" Oct 09 14:12:58 crc kubenswrapper[4902]: I1009 14:12:58.917895 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jttzr\" (UniqueName: \"kubernetes.io/projected/e6584d0b-96a9-441c-a14a-04d929731e99-kube-api-access-jttzr\") pod \"community-operators-h5s8j\" (UID: \"e6584d0b-96a9-441c-a14a-04d929731e99\") " pod="openshift-marketplace/community-operators-h5s8j" Oct 09 14:12:58 crc kubenswrapper[4902]: I1009 14:12:58.918261 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6584d0b-96a9-441c-a14a-04d929731e99-catalog-content\") pod \"community-operators-h5s8j\" (UID: \"e6584d0b-96a9-441c-a14a-04d929731e99\") " pod="openshift-marketplace/community-operators-h5s8j" Oct 09 14:12:58 crc kubenswrapper[4902]: I1009 14:12:58.918298 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6584d0b-96a9-441c-a14a-04d929731e99-utilities\") pod \"community-operators-h5s8j\" (UID: 
\"e6584d0b-96a9-441c-a14a-04d929731e99\") " pod="openshift-marketplace/community-operators-h5s8j" Oct 09 14:12:58 crc kubenswrapper[4902]: I1009 14:12:58.938355 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jttzr\" (UniqueName: \"kubernetes.io/projected/e6584d0b-96a9-441c-a14a-04d929731e99-kube-api-access-jttzr\") pod \"community-operators-h5s8j\" (UID: \"e6584d0b-96a9-441c-a14a-04d929731e99\") " pod="openshift-marketplace/community-operators-h5s8j" Oct 09 14:12:59 crc kubenswrapper[4902]: I1009 14:12:59.126545 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-h5s8j" Oct 09 14:12:59 crc kubenswrapper[4902]: I1009 14:12:59.603537 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-h5s8j"] Oct 09 14:13:00 crc kubenswrapper[4902]: I1009 14:13:00.235368 4902 generic.go:334] "Generic (PLEG): container finished" podID="e6584d0b-96a9-441c-a14a-04d929731e99" containerID="e2356ffb4248bcc79f270e30cbba8a45b6b459aaf75dfe3f19f55e5e7111fcca" exitCode=0 Oct 09 14:13:00 crc kubenswrapper[4902]: I1009 14:13:00.235504 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h5s8j" event={"ID":"e6584d0b-96a9-441c-a14a-04d929731e99","Type":"ContainerDied","Data":"e2356ffb4248bcc79f270e30cbba8a45b6b459aaf75dfe3f19f55e5e7111fcca"} Oct 09 14:13:00 crc kubenswrapper[4902]: I1009 14:13:00.235678 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h5s8j" event={"ID":"e6584d0b-96a9-441c-a14a-04d929731e99","Type":"ContainerStarted","Data":"25ea537b9e84c72a2fb123b190ba0525d4c636d2425ba88a70c596d57e6b11e1"} Oct 09 14:13:02 crc kubenswrapper[4902]: I1009 14:13:02.190363 4902 scope.go:117] "RemoveContainer" containerID="f2275601f7eb23d0a2ed3839e6ea247d201a99601b631a0bde47632a7411154b" Oct 09 14:13:02 crc kubenswrapper[4902]: I1009 14:13:02.222403 4902 scope.go:117] "RemoveContainer" containerID="09bbb399871ad74fe356399ef81c235f3d3e879f2efbc76a83d8ffb879158359" Oct 09 14:13:02 crc kubenswrapper[4902]: I1009 14:13:02.255493 4902 scope.go:117] "RemoveContainer" containerID="112d547199273642f0137b5c314d1495df860c7a46c5089daf5caaba0a916bdd" Oct 09 14:13:02 crc kubenswrapper[4902]: I1009 14:13:02.257776 4902 generic.go:334] "Generic (PLEG): container finished" podID="e6584d0b-96a9-441c-a14a-04d929731e99" containerID="bbb2ddad941d46b5786d18330242d5640a5c042f6515fdf99a7c5bf5a797b78e" exitCode=0 Oct 09 14:13:02 crc kubenswrapper[4902]: I1009 14:13:02.257813 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h5s8j" event={"ID":"e6584d0b-96a9-441c-a14a-04d929731e99","Type":"ContainerDied","Data":"bbb2ddad941d46b5786d18330242d5640a5c042f6515fdf99a7c5bf5a797b78e"} Oct 09 14:13:02 crc kubenswrapper[4902]: I1009 14:13:02.417587 4902 scope.go:117] "RemoveContainer" containerID="7648107539a17a6742111598981f92afad08f59daebf801ec90d14ee396c2bc4" Oct 09 14:13:03 crc kubenswrapper[4902]: I1009 14:13:03.271037 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h5s8j" event={"ID":"e6584d0b-96a9-441c-a14a-04d929731e99","Type":"ContainerStarted","Data":"e94112b6c448815478ea136a926190977c48ce8c528a81f80dbc04b4223691f9"} Oct 09 14:13:03 crc kubenswrapper[4902]: I1009 14:13:03.296431 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/community-operators-h5s8j" podStartSLOduration=2.738129602 podStartE2EDuration="5.296398274s" podCreationTimestamp="2025-10-09 14:12:58 +0000 UTC" firstStartedPulling="2025-10-09 14:13:00.236960378 +0000 UTC m=+1327.434819442" lastFinishedPulling="2025-10-09 14:13:02.79522905 +0000 UTC m=+1329.993088114" observedRunningTime="2025-10-09 14:13:03.28836234 +0000 UTC m=+1330.486221404" watchObservedRunningTime="2025-10-09 14:13:03.296398274 +0000 UTC m=+1330.494257338" Oct 09 14:13:09 crc kubenswrapper[4902]: I1009 14:13:09.127823 4902 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-h5s8j" Oct 09 14:13:09 crc kubenswrapper[4902]: I1009 14:13:09.128442 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-h5s8j" Oct 09 14:13:09 crc kubenswrapper[4902]: I1009 14:13:09.192275 4902 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-h5s8j" Oct 09 14:13:09 crc kubenswrapper[4902]: I1009 14:13:09.370860 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-h5s8j" Oct 09 14:13:09 crc kubenswrapper[4902]: I1009 14:13:09.434121 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-h5s8j"] Oct 09 14:13:11 crc kubenswrapper[4902]: I1009 14:13:11.351659 4902 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-h5s8j" podUID="e6584d0b-96a9-441c-a14a-04d929731e99" containerName="registry-server" containerID="cri-o://e94112b6c448815478ea136a926190977c48ce8c528a81f80dbc04b4223691f9" gracePeriod=2 Oct 09 14:13:11 crc kubenswrapper[4902]: I1009 14:13:11.823886 4902 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-h5s8j" Oct 09 14:13:11 crc kubenswrapper[4902]: I1009 14:13:11.960319 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6584d0b-96a9-441c-a14a-04d929731e99-utilities\") pod \"e6584d0b-96a9-441c-a14a-04d929731e99\" (UID: \"e6584d0b-96a9-441c-a14a-04d929731e99\") " Oct 09 14:13:11 crc kubenswrapper[4902]: I1009 14:13:11.960568 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jttzr\" (UniqueName: \"kubernetes.io/projected/e6584d0b-96a9-441c-a14a-04d929731e99-kube-api-access-jttzr\") pod \"e6584d0b-96a9-441c-a14a-04d929731e99\" (UID: \"e6584d0b-96a9-441c-a14a-04d929731e99\") " Oct 09 14:13:11 crc kubenswrapper[4902]: I1009 14:13:11.960740 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6584d0b-96a9-441c-a14a-04d929731e99-catalog-content\") pod \"e6584d0b-96a9-441c-a14a-04d929731e99\" (UID: \"e6584d0b-96a9-441c-a14a-04d929731e99\") " Oct 09 14:13:11 crc kubenswrapper[4902]: I1009 14:13:11.961203 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e6584d0b-96a9-441c-a14a-04d929731e99-utilities" (OuterVolumeSpecName: "utilities") pod "e6584d0b-96a9-441c-a14a-04d929731e99" (UID: "e6584d0b-96a9-441c-a14a-04d929731e99"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 14:13:11 crc kubenswrapper[4902]: I1009 14:13:11.961495 4902 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6584d0b-96a9-441c-a14a-04d929731e99-utilities\") on node \"crc\" DevicePath \"\"" Oct 09 14:13:11 crc kubenswrapper[4902]: I1009 14:13:11.971900 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6584d0b-96a9-441c-a14a-04d929731e99-kube-api-access-jttzr" (OuterVolumeSpecName: "kube-api-access-jttzr") pod "e6584d0b-96a9-441c-a14a-04d929731e99" (UID: "e6584d0b-96a9-441c-a14a-04d929731e99"). InnerVolumeSpecName "kube-api-access-jttzr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 14:13:12 crc kubenswrapper[4902]: I1009 14:13:12.016266 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e6584d0b-96a9-441c-a14a-04d929731e99-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e6584d0b-96a9-441c-a14a-04d929731e99" (UID: "e6584d0b-96a9-441c-a14a-04d929731e99"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 14:13:12 crc kubenswrapper[4902]: I1009 14:13:12.063451 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jttzr\" (UniqueName: \"kubernetes.io/projected/e6584d0b-96a9-441c-a14a-04d929731e99-kube-api-access-jttzr\") on node \"crc\" DevicePath \"\"" Oct 09 14:13:12 crc kubenswrapper[4902]: I1009 14:13:12.063503 4902 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6584d0b-96a9-441c-a14a-04d929731e99-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 09 14:13:12 crc kubenswrapper[4902]: I1009 14:13:12.362635 4902 generic.go:334] "Generic (PLEG): container finished" podID="e6584d0b-96a9-441c-a14a-04d929731e99" containerID="e94112b6c448815478ea136a926190977c48ce8c528a81f80dbc04b4223691f9" exitCode=0 Oct 09 14:13:12 crc kubenswrapper[4902]: I1009 14:13:12.362683 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h5s8j" event={"ID":"e6584d0b-96a9-441c-a14a-04d929731e99","Type":"ContainerDied","Data":"e94112b6c448815478ea136a926190977c48ce8c528a81f80dbc04b4223691f9"} Oct 09 14:13:12 crc kubenswrapper[4902]: I1009 14:13:12.362719 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h5s8j" event={"ID":"e6584d0b-96a9-441c-a14a-04d929731e99","Type":"ContainerDied","Data":"25ea537b9e84c72a2fb123b190ba0525d4c636d2425ba88a70c596d57e6b11e1"} Oct 09 14:13:12 crc kubenswrapper[4902]: I1009 14:13:12.362749 4902 scope.go:117] "RemoveContainer" containerID="e94112b6c448815478ea136a926190977c48ce8c528a81f80dbc04b4223691f9" Oct 09 14:13:12 crc kubenswrapper[4902]: I1009 14:13:12.363651 4902 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-h5s8j" Oct 09 14:13:12 crc kubenswrapper[4902]: I1009 14:13:12.385500 4902 scope.go:117] "RemoveContainer" containerID="bbb2ddad941d46b5786d18330242d5640a5c042f6515fdf99a7c5bf5a797b78e" Oct 09 14:13:12 crc kubenswrapper[4902]: I1009 14:13:12.402624 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-h5s8j"] Oct 09 14:13:12 crc kubenswrapper[4902]: I1009 14:13:12.416886 4902 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-h5s8j"] Oct 09 14:13:12 crc kubenswrapper[4902]: I1009 14:13:12.426065 4902 scope.go:117] "RemoveContainer" containerID="e2356ffb4248bcc79f270e30cbba8a45b6b459aaf75dfe3f19f55e5e7111fcca" Oct 09 14:13:12 crc kubenswrapper[4902]: I1009 14:13:12.450471 4902 scope.go:117] "RemoveContainer" containerID="e94112b6c448815478ea136a926190977c48ce8c528a81f80dbc04b4223691f9" Oct 09 14:13:12 crc kubenswrapper[4902]: E1009 14:13:12.451315 4902 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e94112b6c448815478ea136a926190977c48ce8c528a81f80dbc04b4223691f9\": container with ID starting with e94112b6c448815478ea136a926190977c48ce8c528a81f80dbc04b4223691f9 not found: ID does not exist" containerID="e94112b6c448815478ea136a926190977c48ce8c528a81f80dbc04b4223691f9" Oct 09 14:13:12 crc kubenswrapper[4902]: I1009 14:13:12.451387 4902 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e94112b6c448815478ea136a926190977c48ce8c528a81f80dbc04b4223691f9"} err="failed to get container status \"e94112b6c448815478ea136a926190977c48ce8c528a81f80dbc04b4223691f9\": rpc error: code = NotFound desc = could not find container \"e94112b6c448815478ea136a926190977c48ce8c528a81f80dbc04b4223691f9\": container with ID starting with e94112b6c448815478ea136a926190977c48ce8c528a81f80dbc04b4223691f9 not found: ID does not exist" Oct 09 14:13:12 crc kubenswrapper[4902]: I1009 14:13:12.451451 4902 scope.go:117] "RemoveContainer" containerID="bbb2ddad941d46b5786d18330242d5640a5c042f6515fdf99a7c5bf5a797b78e" Oct 09 14:13:12 crc kubenswrapper[4902]: E1009 14:13:12.451952 4902 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bbb2ddad941d46b5786d18330242d5640a5c042f6515fdf99a7c5bf5a797b78e\": container with ID starting with bbb2ddad941d46b5786d18330242d5640a5c042f6515fdf99a7c5bf5a797b78e not found: ID does not exist" containerID="bbb2ddad941d46b5786d18330242d5640a5c042f6515fdf99a7c5bf5a797b78e" Oct 09 14:13:12 crc kubenswrapper[4902]: I1009 14:13:12.452004 4902 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bbb2ddad941d46b5786d18330242d5640a5c042f6515fdf99a7c5bf5a797b78e"} err="failed to get container status \"bbb2ddad941d46b5786d18330242d5640a5c042f6515fdf99a7c5bf5a797b78e\": rpc error: code = NotFound desc = could not find container \"bbb2ddad941d46b5786d18330242d5640a5c042f6515fdf99a7c5bf5a797b78e\": container with ID starting with bbb2ddad941d46b5786d18330242d5640a5c042f6515fdf99a7c5bf5a797b78e not found: ID does not exist" Oct 09 14:13:12 crc kubenswrapper[4902]: I1009 14:13:12.452043 4902 scope.go:117] "RemoveContainer" containerID="e2356ffb4248bcc79f270e30cbba8a45b6b459aaf75dfe3f19f55e5e7111fcca" Oct 09 14:13:12 crc kubenswrapper[4902]: E1009 14:13:12.452544 4902 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"e2356ffb4248bcc79f270e30cbba8a45b6b459aaf75dfe3f19f55e5e7111fcca\": container with ID starting with e2356ffb4248bcc79f270e30cbba8a45b6b459aaf75dfe3f19f55e5e7111fcca not found: ID does not exist" containerID="e2356ffb4248bcc79f270e30cbba8a45b6b459aaf75dfe3f19f55e5e7111fcca" Oct 09 14:13:12 crc kubenswrapper[4902]: I1009 14:13:12.452629 4902 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e2356ffb4248bcc79f270e30cbba8a45b6b459aaf75dfe3f19f55e5e7111fcca"} err="failed to get container status \"e2356ffb4248bcc79f270e30cbba8a45b6b459aaf75dfe3f19f55e5e7111fcca\": rpc error: code = NotFound desc = could not find container \"e2356ffb4248bcc79f270e30cbba8a45b6b459aaf75dfe3f19f55e5e7111fcca\": container with ID starting with e2356ffb4248bcc79f270e30cbba8a45b6b459aaf75dfe3f19f55e5e7111fcca not found: ID does not exist" Oct 09 14:13:13 crc kubenswrapper[4902]: I1009 14:13:13.523595 4902 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e6584d0b-96a9-441c-a14a-04d929731e99" path="/var/lib/kubelet/pods/e6584d0b-96a9-441c-a14a-04d929731e99/volumes" Oct 09 14:13:20 crc kubenswrapper[4902]: I1009 14:13:20.078925 4902 patch_prober.go:28] interesting pod/machine-config-daemon-gbt7s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 14:13:20 crc kubenswrapper[4902]: I1009 14:13:20.079832 4902 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" podUID="6cfbac91-e798-4e5e-9f3c-f454ea6f457e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 14:13:20 crc kubenswrapper[4902]: I1009 14:13:20.079899 4902 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" Oct 09 14:13:20 crc kubenswrapper[4902]: I1009 14:13:20.081126 4902 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"810d12ae911af2f4b29c9130d86bd6cc100568c44cc3f31fb981a87e8efd1049"} pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 09 14:13:20 crc kubenswrapper[4902]: I1009 14:13:20.081247 4902 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" podUID="6cfbac91-e798-4e5e-9f3c-f454ea6f457e" containerName="machine-config-daemon" containerID="cri-o://810d12ae911af2f4b29c9130d86bd6cc100568c44cc3f31fb981a87e8efd1049" gracePeriod=600 Oct 09 14:13:20 crc kubenswrapper[4902]: I1009 14:13:20.450920 4902 generic.go:334] "Generic (PLEG): container finished" podID="6cfbac91-e798-4e5e-9f3c-f454ea6f457e" containerID="810d12ae911af2f4b29c9130d86bd6cc100568c44cc3f31fb981a87e8efd1049" exitCode=0 Oct 09 14:13:20 crc kubenswrapper[4902]: I1009 14:13:20.451183 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" 
event={"ID":"6cfbac91-e798-4e5e-9f3c-f454ea6f457e","Type":"ContainerDied","Data":"810d12ae911af2f4b29c9130d86bd6cc100568c44cc3f31fb981a87e8efd1049"} Oct 09 14:13:20 crc kubenswrapper[4902]: I1009 14:13:20.451330 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" event={"ID":"6cfbac91-e798-4e5e-9f3c-f454ea6f457e","Type":"ContainerStarted","Data":"03a73372b10bf2278335edb0134be424dfca37b6e17ab0c52bfb42dc57295c45"} Oct 09 14:13:20 crc kubenswrapper[4902]: I1009 14:13:20.451366 4902 scope.go:117] "RemoveContainer" containerID="269c5fb340ffd90e9e77aaecfbd73abe66b2736b0d8ca2d63ca9d236f4e7c4a7" Oct 09 14:13:28 crc kubenswrapper[4902]: I1009 14:13:28.132551 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-74d5b"] Oct 09 14:13:28 crc kubenswrapper[4902]: E1009 14:13:28.133710 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6584d0b-96a9-441c-a14a-04d929731e99" containerName="extract-utilities" Oct 09 14:13:28 crc kubenswrapper[4902]: I1009 14:13:28.133729 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6584d0b-96a9-441c-a14a-04d929731e99" containerName="extract-utilities" Oct 09 14:13:28 crc kubenswrapper[4902]: E1009 14:13:28.133769 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6584d0b-96a9-441c-a14a-04d929731e99" containerName="registry-server" Oct 09 14:13:28 crc kubenswrapper[4902]: I1009 14:13:28.133777 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6584d0b-96a9-441c-a14a-04d929731e99" containerName="registry-server" Oct 09 14:13:28 crc kubenswrapper[4902]: E1009 14:13:28.133798 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6584d0b-96a9-441c-a14a-04d929731e99" containerName="extract-content" Oct 09 14:13:28 crc kubenswrapper[4902]: I1009 14:13:28.133807 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6584d0b-96a9-441c-a14a-04d929731e99" containerName="extract-content" Oct 09 14:13:28 crc kubenswrapper[4902]: I1009 14:13:28.134040 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6584d0b-96a9-441c-a14a-04d929731e99" containerName="registry-server" Oct 09 14:13:28 crc kubenswrapper[4902]: I1009 14:13:28.135558 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-74d5b" Oct 09 14:13:28 crc kubenswrapper[4902]: I1009 14:13:28.143566 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-74d5b"] Oct 09 14:13:28 crc kubenswrapper[4902]: I1009 14:13:28.322769 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cdc870f9-048c-40f8-902b-cd20b49b584c-catalog-content\") pod \"certified-operators-74d5b\" (UID: \"cdc870f9-048c-40f8-902b-cd20b49b584c\") " pod="openshift-marketplace/certified-operators-74d5b" Oct 09 14:13:28 crc kubenswrapper[4902]: I1009 14:13:28.322847 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cdc870f9-048c-40f8-902b-cd20b49b584c-utilities\") pod \"certified-operators-74d5b\" (UID: \"cdc870f9-048c-40f8-902b-cd20b49b584c\") " pod="openshift-marketplace/certified-operators-74d5b" Oct 09 14:13:28 crc kubenswrapper[4902]: I1009 14:13:28.322893 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s86ln\" (UniqueName: \"kubernetes.io/projected/cdc870f9-048c-40f8-902b-cd20b49b584c-kube-api-access-s86ln\") pod \"certified-operators-74d5b\" (UID: \"cdc870f9-048c-40f8-902b-cd20b49b584c\") " pod="openshift-marketplace/certified-operators-74d5b" Oct 09 14:13:28 crc kubenswrapper[4902]: I1009 14:13:28.424572 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cdc870f9-048c-40f8-902b-cd20b49b584c-catalog-content\") pod \"certified-operators-74d5b\" (UID: \"cdc870f9-048c-40f8-902b-cd20b49b584c\") " pod="openshift-marketplace/certified-operators-74d5b" Oct 09 14:13:28 crc kubenswrapper[4902]: I1009 14:13:28.424635 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cdc870f9-048c-40f8-902b-cd20b49b584c-utilities\") pod \"certified-operators-74d5b\" (UID: \"cdc870f9-048c-40f8-902b-cd20b49b584c\") " pod="openshift-marketplace/certified-operators-74d5b" Oct 09 14:13:28 crc kubenswrapper[4902]: I1009 14:13:28.424675 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s86ln\" (UniqueName: \"kubernetes.io/projected/cdc870f9-048c-40f8-902b-cd20b49b584c-kube-api-access-s86ln\") pod \"certified-operators-74d5b\" (UID: \"cdc870f9-048c-40f8-902b-cd20b49b584c\") " pod="openshift-marketplace/certified-operators-74d5b" Oct 09 14:13:28 crc kubenswrapper[4902]: I1009 14:13:28.425679 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cdc870f9-048c-40f8-902b-cd20b49b584c-catalog-content\") pod \"certified-operators-74d5b\" (UID: \"cdc870f9-048c-40f8-902b-cd20b49b584c\") " pod="openshift-marketplace/certified-operators-74d5b" Oct 09 14:13:28 crc kubenswrapper[4902]: I1009 14:13:28.425942 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cdc870f9-048c-40f8-902b-cd20b49b584c-utilities\") pod \"certified-operators-74d5b\" (UID: \"cdc870f9-048c-40f8-902b-cd20b49b584c\") " pod="openshift-marketplace/certified-operators-74d5b" Oct 09 14:13:28 crc kubenswrapper[4902]: I1009 14:13:28.447933 4902 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-s86ln\" (UniqueName: \"kubernetes.io/projected/cdc870f9-048c-40f8-902b-cd20b49b584c-kube-api-access-s86ln\") pod \"certified-operators-74d5b\" (UID: \"cdc870f9-048c-40f8-902b-cd20b49b584c\") " pod="openshift-marketplace/certified-operators-74d5b" Oct 09 14:13:28 crc kubenswrapper[4902]: I1009 14:13:28.472083 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-74d5b" Oct 09 14:13:29 crc kubenswrapper[4902]: I1009 14:13:29.016004 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-74d5b"] Oct 09 14:13:29 crc kubenswrapper[4902]: I1009 14:13:29.539772 4902 generic.go:334] "Generic (PLEG): container finished" podID="cdc870f9-048c-40f8-902b-cd20b49b584c" containerID="c615c9200759acdf168fa53e674c2bf235823ac599e300fee13d691a4713f74e" exitCode=0 Oct 09 14:13:29 crc kubenswrapper[4902]: I1009 14:13:29.539821 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-74d5b" event={"ID":"cdc870f9-048c-40f8-902b-cd20b49b584c","Type":"ContainerDied","Data":"c615c9200759acdf168fa53e674c2bf235823ac599e300fee13d691a4713f74e"} Oct 09 14:13:29 crc kubenswrapper[4902]: I1009 14:13:29.540074 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-74d5b" event={"ID":"cdc870f9-048c-40f8-902b-cd20b49b584c","Type":"ContainerStarted","Data":"051c0c8b3fc064ca581ea556151ba9581ea3505ae67e4b415aeba7a38487c9d5"} Oct 09 14:13:30 crc kubenswrapper[4902]: I1009 14:13:30.555104 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-74d5b" event={"ID":"cdc870f9-048c-40f8-902b-cd20b49b584c","Type":"ContainerStarted","Data":"9f232dfe14da165dca40a4e6796be4f2c9f665795656b16aaef9d7345c055418"} Oct 09 14:13:31 crc kubenswrapper[4902]: I1009 14:13:31.571815 4902 generic.go:334] "Generic (PLEG): container finished" podID="cdc870f9-048c-40f8-902b-cd20b49b584c" containerID="9f232dfe14da165dca40a4e6796be4f2c9f665795656b16aaef9d7345c055418" exitCode=0 Oct 09 14:13:31 crc kubenswrapper[4902]: I1009 14:13:31.571918 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-74d5b" event={"ID":"cdc870f9-048c-40f8-902b-cd20b49b584c","Type":"ContainerDied","Data":"9f232dfe14da165dca40a4e6796be4f2c9f665795656b16aaef9d7345c055418"} Oct 09 14:13:32 crc kubenswrapper[4902]: I1009 14:13:32.583338 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-74d5b" event={"ID":"cdc870f9-048c-40f8-902b-cd20b49b584c","Type":"ContainerStarted","Data":"a2a1106dd4d502824d85ad619dd268c468c2a69d7b2d23c616f3f979401913d0"} Oct 09 14:13:32 crc kubenswrapper[4902]: I1009 14:13:32.603110 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-74d5b" podStartSLOduration=2.089139293 podStartE2EDuration="4.603091907s" podCreationTimestamp="2025-10-09 14:13:28 +0000 UTC" firstStartedPulling="2025-10-09 14:13:29.54259608 +0000 UTC m=+1356.740455144" lastFinishedPulling="2025-10-09 14:13:32.056548694 +0000 UTC m=+1359.254407758" observedRunningTime="2025-10-09 14:13:32.59767781 +0000 UTC m=+1359.795536874" watchObservedRunningTime="2025-10-09 14:13:32.603091907 +0000 UTC m=+1359.800950971" Oct 09 14:13:34 crc kubenswrapper[4902]: I1009 14:13:34.503042 4902 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/redhat-marketplace-pgt9b"] Oct 09 14:13:34 crc kubenswrapper[4902]: I1009 14:13:34.509509 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pgt9b" Oct 09 14:13:34 crc kubenswrapper[4902]: I1009 14:13:34.517896 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pgt9b"] Oct 09 14:13:34 crc kubenswrapper[4902]: I1009 14:13:34.655940 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4f28701-71fd-49c4-8c07-760371aea50f-utilities\") pod \"redhat-marketplace-pgt9b\" (UID: \"b4f28701-71fd-49c4-8c07-760371aea50f\") " pod="openshift-marketplace/redhat-marketplace-pgt9b" Oct 09 14:13:34 crc kubenswrapper[4902]: I1009 14:13:34.656004 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4f28701-71fd-49c4-8c07-760371aea50f-catalog-content\") pod \"redhat-marketplace-pgt9b\" (UID: \"b4f28701-71fd-49c4-8c07-760371aea50f\") " pod="openshift-marketplace/redhat-marketplace-pgt9b" Oct 09 14:13:34 crc kubenswrapper[4902]: I1009 14:13:34.656143 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6qgqg\" (UniqueName: \"kubernetes.io/projected/b4f28701-71fd-49c4-8c07-760371aea50f-kube-api-access-6qgqg\") pod \"redhat-marketplace-pgt9b\" (UID: \"b4f28701-71fd-49c4-8c07-760371aea50f\") " pod="openshift-marketplace/redhat-marketplace-pgt9b" Oct 09 14:13:34 crc kubenswrapper[4902]: I1009 14:13:34.758263 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4f28701-71fd-49c4-8c07-760371aea50f-utilities\") pod \"redhat-marketplace-pgt9b\" (UID: \"b4f28701-71fd-49c4-8c07-760371aea50f\") " pod="openshift-marketplace/redhat-marketplace-pgt9b" Oct 09 14:13:34 crc kubenswrapper[4902]: I1009 14:13:34.758320 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4f28701-71fd-49c4-8c07-760371aea50f-catalog-content\") pod \"redhat-marketplace-pgt9b\" (UID: \"b4f28701-71fd-49c4-8c07-760371aea50f\") " pod="openshift-marketplace/redhat-marketplace-pgt9b" Oct 09 14:13:34 crc kubenswrapper[4902]: I1009 14:13:34.758398 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6qgqg\" (UniqueName: \"kubernetes.io/projected/b4f28701-71fd-49c4-8c07-760371aea50f-kube-api-access-6qgqg\") pod \"redhat-marketplace-pgt9b\" (UID: \"b4f28701-71fd-49c4-8c07-760371aea50f\") " pod="openshift-marketplace/redhat-marketplace-pgt9b" Oct 09 14:13:34 crc kubenswrapper[4902]: I1009 14:13:34.758792 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4f28701-71fd-49c4-8c07-760371aea50f-utilities\") pod \"redhat-marketplace-pgt9b\" (UID: \"b4f28701-71fd-49c4-8c07-760371aea50f\") " pod="openshift-marketplace/redhat-marketplace-pgt9b" Oct 09 14:13:34 crc kubenswrapper[4902]: I1009 14:13:34.758922 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4f28701-71fd-49c4-8c07-760371aea50f-catalog-content\") pod \"redhat-marketplace-pgt9b\" (UID: \"b4f28701-71fd-49c4-8c07-760371aea50f\") 
" pod="openshift-marketplace/redhat-marketplace-pgt9b" Oct 09 14:13:34 crc kubenswrapper[4902]: I1009 14:13:34.776752 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6qgqg\" (UniqueName: \"kubernetes.io/projected/b4f28701-71fd-49c4-8c07-760371aea50f-kube-api-access-6qgqg\") pod \"redhat-marketplace-pgt9b\" (UID: \"b4f28701-71fd-49c4-8c07-760371aea50f\") " pod="openshift-marketplace/redhat-marketplace-pgt9b" Oct 09 14:13:34 crc kubenswrapper[4902]: I1009 14:13:34.841606 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pgt9b" Oct 09 14:13:35 crc kubenswrapper[4902]: I1009 14:13:35.289364 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pgt9b"] Oct 09 14:13:35 crc kubenswrapper[4902]: W1009 14:13:35.302688 4902 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb4f28701_71fd_49c4_8c07_760371aea50f.slice/crio-15f555b1055ad7062b3059b3abca093e3b6f89aece1a25cf48ae5dbd0d06e497 WatchSource:0}: Error finding container 15f555b1055ad7062b3059b3abca093e3b6f89aece1a25cf48ae5dbd0d06e497: Status 404 returned error can't find the container with id 15f555b1055ad7062b3059b3abca093e3b6f89aece1a25cf48ae5dbd0d06e497 Oct 09 14:13:35 crc kubenswrapper[4902]: I1009 14:13:35.610806 4902 generic.go:334] "Generic (PLEG): container finished" podID="b4f28701-71fd-49c4-8c07-760371aea50f" containerID="2a7c0651ee3c68003e11920397e918f1d23388005543970c5b12c1b670527974" exitCode=0 Oct 09 14:13:35 crc kubenswrapper[4902]: I1009 14:13:35.611044 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pgt9b" event={"ID":"b4f28701-71fd-49c4-8c07-760371aea50f","Type":"ContainerDied","Data":"2a7c0651ee3c68003e11920397e918f1d23388005543970c5b12c1b670527974"} Oct 09 14:13:35 crc kubenswrapper[4902]: I1009 14:13:35.611079 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pgt9b" event={"ID":"b4f28701-71fd-49c4-8c07-760371aea50f","Type":"ContainerStarted","Data":"15f555b1055ad7062b3059b3abca093e3b6f89aece1a25cf48ae5dbd0d06e497"} Oct 09 14:13:36 crc kubenswrapper[4902]: I1009 14:13:36.708255 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-vb5rd"] Oct 09 14:13:36 crc kubenswrapper[4902]: I1009 14:13:36.711712 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-vb5rd" Oct 09 14:13:36 crc kubenswrapper[4902]: I1009 14:13:36.716853 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vb5rd"] Oct 09 14:13:36 crc kubenswrapper[4902]: I1009 14:13:36.802958 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vzgp5\" (UniqueName: \"kubernetes.io/projected/e57fe1c8-a794-46a8-97a8-ed76be62f12a-kube-api-access-vzgp5\") pod \"redhat-operators-vb5rd\" (UID: \"e57fe1c8-a794-46a8-97a8-ed76be62f12a\") " pod="openshift-marketplace/redhat-operators-vb5rd" Oct 09 14:13:36 crc kubenswrapper[4902]: I1009 14:13:36.803258 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e57fe1c8-a794-46a8-97a8-ed76be62f12a-catalog-content\") pod \"redhat-operators-vb5rd\" (UID: \"e57fe1c8-a794-46a8-97a8-ed76be62f12a\") " pod="openshift-marketplace/redhat-operators-vb5rd" Oct 09 14:13:36 crc kubenswrapper[4902]: I1009 14:13:36.803501 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e57fe1c8-a794-46a8-97a8-ed76be62f12a-utilities\") pod \"redhat-operators-vb5rd\" (UID: \"e57fe1c8-a794-46a8-97a8-ed76be62f12a\") " pod="openshift-marketplace/redhat-operators-vb5rd" Oct 09 14:13:36 crc kubenswrapper[4902]: I1009 14:13:36.905166 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e57fe1c8-a794-46a8-97a8-ed76be62f12a-catalog-content\") pod \"redhat-operators-vb5rd\" (UID: \"e57fe1c8-a794-46a8-97a8-ed76be62f12a\") " pod="openshift-marketplace/redhat-operators-vb5rd" Oct 09 14:13:36 crc kubenswrapper[4902]: I1009 14:13:36.905777 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e57fe1c8-a794-46a8-97a8-ed76be62f12a-utilities\") pod \"redhat-operators-vb5rd\" (UID: \"e57fe1c8-a794-46a8-97a8-ed76be62f12a\") " pod="openshift-marketplace/redhat-operators-vb5rd" Oct 09 14:13:36 crc kubenswrapper[4902]: I1009 14:13:36.905877 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vzgp5\" (UniqueName: \"kubernetes.io/projected/e57fe1c8-a794-46a8-97a8-ed76be62f12a-kube-api-access-vzgp5\") pod \"redhat-operators-vb5rd\" (UID: \"e57fe1c8-a794-46a8-97a8-ed76be62f12a\") " pod="openshift-marketplace/redhat-operators-vb5rd" Oct 09 14:13:36 crc kubenswrapper[4902]: I1009 14:13:36.905654 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e57fe1c8-a794-46a8-97a8-ed76be62f12a-catalog-content\") pod \"redhat-operators-vb5rd\" (UID: \"e57fe1c8-a794-46a8-97a8-ed76be62f12a\") " pod="openshift-marketplace/redhat-operators-vb5rd" Oct 09 14:13:36 crc kubenswrapper[4902]: I1009 14:13:36.906011 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e57fe1c8-a794-46a8-97a8-ed76be62f12a-utilities\") pod \"redhat-operators-vb5rd\" (UID: \"e57fe1c8-a794-46a8-97a8-ed76be62f12a\") " pod="openshift-marketplace/redhat-operators-vb5rd" Oct 09 14:13:36 crc kubenswrapper[4902]: I1009 14:13:36.926643 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-vzgp5\" (UniqueName: \"kubernetes.io/projected/e57fe1c8-a794-46a8-97a8-ed76be62f12a-kube-api-access-vzgp5\") pod \"redhat-operators-vb5rd\" (UID: \"e57fe1c8-a794-46a8-97a8-ed76be62f12a\") " pod="openshift-marketplace/redhat-operators-vb5rd" Oct 09 14:13:37 crc kubenswrapper[4902]: I1009 14:13:37.094753 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vb5rd" Oct 09 14:13:37 crc kubenswrapper[4902]: W1009 14:13:37.574053 4902 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode57fe1c8_a794_46a8_97a8_ed76be62f12a.slice/crio-113cc6c5b96c22bda75e0c76a706dde17f91f92b17cf24f064be6485d55d3658 WatchSource:0}: Error finding container 113cc6c5b96c22bda75e0c76a706dde17f91f92b17cf24f064be6485d55d3658: Status 404 returned error can't find the container with id 113cc6c5b96c22bda75e0c76a706dde17f91f92b17cf24f064be6485d55d3658 Oct 09 14:13:37 crc kubenswrapper[4902]: I1009 14:13:37.578474 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vb5rd"] Oct 09 14:13:37 crc kubenswrapper[4902]: I1009 14:13:37.630663 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vb5rd" event={"ID":"e57fe1c8-a794-46a8-97a8-ed76be62f12a","Type":"ContainerStarted","Data":"113cc6c5b96c22bda75e0c76a706dde17f91f92b17cf24f064be6485d55d3658"} Oct 09 14:13:37 crc kubenswrapper[4902]: I1009 14:13:37.636533 4902 generic.go:334] "Generic (PLEG): container finished" podID="b4f28701-71fd-49c4-8c07-760371aea50f" containerID="36a2d1947e469f58144860043420d26bcb9425ab9e4e3e2d990ea06c73eed187" exitCode=0 Oct 09 14:13:37 crc kubenswrapper[4902]: I1009 14:13:37.636570 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pgt9b" event={"ID":"b4f28701-71fd-49c4-8c07-760371aea50f","Type":"ContainerDied","Data":"36a2d1947e469f58144860043420d26bcb9425ab9e4e3e2d990ea06c73eed187"} Oct 09 14:13:38 crc kubenswrapper[4902]: I1009 14:13:38.473002 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-74d5b" Oct 09 14:13:38 crc kubenswrapper[4902]: I1009 14:13:38.473598 4902 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-74d5b" Oct 09 14:13:38 crc kubenswrapper[4902]: I1009 14:13:38.551247 4902 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-74d5b" Oct 09 14:13:38 crc kubenswrapper[4902]: I1009 14:13:38.649154 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pgt9b" event={"ID":"b4f28701-71fd-49c4-8c07-760371aea50f","Type":"ContainerStarted","Data":"9d1e099b5da28faa6f652899f589897eec0e4ec81d89add01ada1df8308ef6ac"} Oct 09 14:13:38 crc kubenswrapper[4902]: I1009 14:13:38.653345 4902 generic.go:334] "Generic (PLEG): container finished" podID="e57fe1c8-a794-46a8-97a8-ed76be62f12a" containerID="ee90804f70bbf32b62197e529cf525dedbb3263f9f921bccd9f9c9ee5f498176" exitCode=0 Oct 09 14:13:38 crc kubenswrapper[4902]: I1009 14:13:38.654456 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vb5rd" event={"ID":"e57fe1c8-a794-46a8-97a8-ed76be62f12a","Type":"ContainerDied","Data":"ee90804f70bbf32b62197e529cf525dedbb3263f9f921bccd9f9c9ee5f498176"} Oct 09 14:13:38 crc 
kubenswrapper[4902]: I1009 14:13:38.677910 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-pgt9b" podStartSLOduration=2.170910215 podStartE2EDuration="4.67788807s" podCreationTimestamp="2025-10-09 14:13:34 +0000 UTC" firstStartedPulling="2025-10-09 14:13:35.615271881 +0000 UTC m=+1362.813130945" lastFinishedPulling="2025-10-09 14:13:38.122249736 +0000 UTC m=+1365.320108800" observedRunningTime="2025-10-09 14:13:38.66811404 +0000 UTC m=+1365.865973104" watchObservedRunningTime="2025-10-09 14:13:38.67788807 +0000 UTC m=+1365.875747134" Oct 09 14:13:38 crc kubenswrapper[4902]: I1009 14:13:38.707649 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-74d5b" Oct 09 14:13:40 crc kubenswrapper[4902]: I1009 14:13:40.692349 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vb5rd" event={"ID":"e57fe1c8-a794-46a8-97a8-ed76be62f12a","Type":"ContainerStarted","Data":"40e97616bfa4f3805b22a74b6461e3b79fc2baf9e9e13b79dcdb273088aa2b5d"} Oct 09 14:13:41 crc kubenswrapper[4902]: I1009 14:13:41.098436 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-74d5b"] Oct 09 14:13:41 crc kubenswrapper[4902]: I1009 14:13:41.099042 4902 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-74d5b" podUID="cdc870f9-048c-40f8-902b-cd20b49b584c" containerName="registry-server" containerID="cri-o://a2a1106dd4d502824d85ad619dd268c468c2a69d7b2d23c616f3f979401913d0" gracePeriod=2 Oct 09 14:13:42 crc kubenswrapper[4902]: I1009 14:13:42.711691 4902 generic.go:334] "Generic (PLEG): container finished" podID="e57fe1c8-a794-46a8-97a8-ed76be62f12a" containerID="40e97616bfa4f3805b22a74b6461e3b79fc2baf9e9e13b79dcdb273088aa2b5d" exitCode=0 Oct 09 14:13:42 crc kubenswrapper[4902]: I1009 14:13:42.711778 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vb5rd" event={"ID":"e57fe1c8-a794-46a8-97a8-ed76be62f12a","Type":"ContainerDied","Data":"40e97616bfa4f3805b22a74b6461e3b79fc2baf9e9e13b79dcdb273088aa2b5d"} Oct 09 14:13:43 crc kubenswrapper[4902]: I1009 14:13:43.728175 4902 generic.go:334] "Generic (PLEG): container finished" podID="cdc870f9-048c-40f8-902b-cd20b49b584c" containerID="a2a1106dd4d502824d85ad619dd268c468c2a69d7b2d23c616f3f979401913d0" exitCode=0 Oct 09 14:13:43 crc kubenswrapper[4902]: I1009 14:13:43.728387 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-74d5b" event={"ID":"cdc870f9-048c-40f8-902b-cd20b49b584c","Type":"ContainerDied","Data":"a2a1106dd4d502824d85ad619dd268c468c2a69d7b2d23c616f3f979401913d0"} Oct 09 14:13:44 crc kubenswrapper[4902]: I1009 14:13:44.263198 4902 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-74d5b" Oct 09 14:13:44 crc kubenswrapper[4902]: I1009 14:13:44.349464 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cdc870f9-048c-40f8-902b-cd20b49b584c-utilities\") pod \"cdc870f9-048c-40f8-902b-cd20b49b584c\" (UID: \"cdc870f9-048c-40f8-902b-cd20b49b584c\") " Oct 09 14:13:44 crc kubenswrapper[4902]: I1009 14:13:44.349584 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s86ln\" (UniqueName: \"kubernetes.io/projected/cdc870f9-048c-40f8-902b-cd20b49b584c-kube-api-access-s86ln\") pod \"cdc870f9-048c-40f8-902b-cd20b49b584c\" (UID: \"cdc870f9-048c-40f8-902b-cd20b49b584c\") " Oct 09 14:13:44 crc kubenswrapper[4902]: I1009 14:13:44.349608 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cdc870f9-048c-40f8-902b-cd20b49b584c-catalog-content\") pod \"cdc870f9-048c-40f8-902b-cd20b49b584c\" (UID: \"cdc870f9-048c-40f8-902b-cd20b49b584c\") " Oct 09 14:13:44 crc kubenswrapper[4902]: I1009 14:13:44.350309 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cdc870f9-048c-40f8-902b-cd20b49b584c-utilities" (OuterVolumeSpecName: "utilities") pod "cdc870f9-048c-40f8-902b-cd20b49b584c" (UID: "cdc870f9-048c-40f8-902b-cd20b49b584c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 14:13:44 crc kubenswrapper[4902]: I1009 14:13:44.358693 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cdc870f9-048c-40f8-902b-cd20b49b584c-kube-api-access-s86ln" (OuterVolumeSpecName: "kube-api-access-s86ln") pod "cdc870f9-048c-40f8-902b-cd20b49b584c" (UID: "cdc870f9-048c-40f8-902b-cd20b49b584c"). InnerVolumeSpecName "kube-api-access-s86ln". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 14:13:44 crc kubenswrapper[4902]: I1009 14:13:44.400866 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cdc870f9-048c-40f8-902b-cd20b49b584c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cdc870f9-048c-40f8-902b-cd20b49b584c" (UID: "cdc870f9-048c-40f8-902b-cd20b49b584c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 14:13:44 crc kubenswrapper[4902]: I1009 14:13:44.452067 4902 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cdc870f9-048c-40f8-902b-cd20b49b584c-utilities\") on node \"crc\" DevicePath \"\"" Oct 09 14:13:44 crc kubenswrapper[4902]: I1009 14:13:44.452128 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s86ln\" (UniqueName: \"kubernetes.io/projected/cdc870f9-048c-40f8-902b-cd20b49b584c-kube-api-access-s86ln\") on node \"crc\" DevicePath \"\"" Oct 09 14:13:44 crc kubenswrapper[4902]: I1009 14:13:44.452141 4902 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cdc870f9-048c-40f8-902b-cd20b49b584c-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 09 14:13:44 crc kubenswrapper[4902]: I1009 14:13:44.740911 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vb5rd" event={"ID":"e57fe1c8-a794-46a8-97a8-ed76be62f12a","Type":"ContainerStarted","Data":"63842b0056c958f5377fa7f254cd1cdda39b64d5dc8282a94da3e5b4a3cca46d"} Oct 09 14:13:44 crc kubenswrapper[4902]: I1009 14:13:44.744613 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-74d5b" event={"ID":"cdc870f9-048c-40f8-902b-cd20b49b584c","Type":"ContainerDied","Data":"051c0c8b3fc064ca581ea556151ba9581ea3505ae67e4b415aeba7a38487c9d5"} Oct 09 14:13:44 crc kubenswrapper[4902]: I1009 14:13:44.744658 4902 scope.go:117] "RemoveContainer" containerID="a2a1106dd4d502824d85ad619dd268c468c2a69d7b2d23c616f3f979401913d0" Oct 09 14:13:44 crc kubenswrapper[4902]: I1009 14:13:44.744796 4902 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-74d5b" Oct 09 14:13:44 crc kubenswrapper[4902]: I1009 14:13:44.777013 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-vb5rd" podStartSLOduration=3.901349589 podStartE2EDuration="8.776987651s" podCreationTimestamp="2025-10-09 14:13:36 +0000 UTC" firstStartedPulling="2025-10-09 14:13:38.657536578 +0000 UTC m=+1365.855395642" lastFinishedPulling="2025-10-09 14:13:43.53317464 +0000 UTC m=+1370.731033704" observedRunningTime="2025-10-09 14:13:44.765296337 +0000 UTC m=+1371.963155411" watchObservedRunningTime="2025-10-09 14:13:44.776987651 +0000 UTC m=+1371.974846715" Oct 09 14:13:44 crc kubenswrapper[4902]: I1009 14:13:44.784229 4902 scope.go:117] "RemoveContainer" containerID="9f232dfe14da165dca40a4e6796be4f2c9f665795656b16aaef9d7345c055418" Oct 09 14:13:44 crc kubenswrapper[4902]: I1009 14:13:44.792307 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-74d5b"] Oct 09 14:13:44 crc kubenswrapper[4902]: I1009 14:13:44.802058 4902 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-74d5b"] Oct 09 14:13:44 crc kubenswrapper[4902]: I1009 14:13:44.815382 4902 scope.go:117] "RemoveContainer" containerID="c615c9200759acdf168fa53e674c2bf235823ac599e300fee13d691a4713f74e" Oct 09 14:13:44 crc kubenswrapper[4902]: I1009 14:13:44.842374 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-pgt9b" Oct 09 14:13:44 crc kubenswrapper[4902]: I1009 14:13:44.842849 4902 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-pgt9b" Oct 09 14:13:44 crc kubenswrapper[4902]: I1009 14:13:44.902491 4902 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-pgt9b" Oct 09 14:13:45 crc kubenswrapper[4902]: I1009 14:13:45.524390 4902 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cdc870f9-048c-40f8-902b-cd20b49b584c" path="/var/lib/kubelet/pods/cdc870f9-048c-40f8-902b-cd20b49b584c/volumes" Oct 09 14:13:45 crc kubenswrapper[4902]: I1009 14:13:45.800696 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-pgt9b" Oct 09 14:13:47 crc kubenswrapper[4902]: I1009 14:13:47.095536 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-vb5rd" Oct 09 14:13:47 crc kubenswrapper[4902]: I1009 14:13:47.095828 4902 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-vb5rd" Oct 09 14:13:47 crc kubenswrapper[4902]: I1009 14:13:47.500623 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-pgt9b"] Oct 09 14:13:48 crc kubenswrapper[4902]: I1009 14:13:48.149449 4902 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-vb5rd" podUID="e57fe1c8-a794-46a8-97a8-ed76be62f12a" containerName="registry-server" probeResult="failure" output=< Oct 09 14:13:48 crc kubenswrapper[4902]: timeout: failed to connect service ":50051" within 1s Oct 09 14:13:48 crc kubenswrapper[4902]: > Oct 09 14:13:48 crc kubenswrapper[4902]: I1009 14:13:48.779354 4902 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/redhat-marketplace-pgt9b" podUID="b4f28701-71fd-49c4-8c07-760371aea50f" containerName="registry-server" containerID="cri-o://9d1e099b5da28faa6f652899f589897eec0e4ec81d89add01ada1df8308ef6ac" gracePeriod=2 Oct 09 14:13:49 crc kubenswrapper[4902]: I1009 14:13:49.282217 4902 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pgt9b" Oct 09 14:13:49 crc kubenswrapper[4902]: I1009 14:13:49.445765 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4f28701-71fd-49c4-8c07-760371aea50f-utilities\") pod \"b4f28701-71fd-49c4-8c07-760371aea50f\" (UID: \"b4f28701-71fd-49c4-8c07-760371aea50f\") " Oct 09 14:13:49 crc kubenswrapper[4902]: I1009 14:13:49.446018 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6qgqg\" (UniqueName: \"kubernetes.io/projected/b4f28701-71fd-49c4-8c07-760371aea50f-kube-api-access-6qgqg\") pod \"b4f28701-71fd-49c4-8c07-760371aea50f\" (UID: \"b4f28701-71fd-49c4-8c07-760371aea50f\") " Oct 09 14:13:49 crc kubenswrapper[4902]: I1009 14:13:49.446090 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4f28701-71fd-49c4-8c07-760371aea50f-catalog-content\") pod \"b4f28701-71fd-49c4-8c07-760371aea50f\" (UID: \"b4f28701-71fd-49c4-8c07-760371aea50f\") " Oct 09 14:13:49 crc kubenswrapper[4902]: I1009 14:13:49.446971 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b4f28701-71fd-49c4-8c07-760371aea50f-utilities" (OuterVolumeSpecName: "utilities") pod "b4f28701-71fd-49c4-8c07-760371aea50f" (UID: "b4f28701-71fd-49c4-8c07-760371aea50f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 14:13:49 crc kubenswrapper[4902]: I1009 14:13:49.451573 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4f28701-71fd-49c4-8c07-760371aea50f-kube-api-access-6qgqg" (OuterVolumeSpecName: "kube-api-access-6qgqg") pod "b4f28701-71fd-49c4-8c07-760371aea50f" (UID: "b4f28701-71fd-49c4-8c07-760371aea50f"). InnerVolumeSpecName "kube-api-access-6qgqg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 14:13:49 crc kubenswrapper[4902]: I1009 14:13:49.460663 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b4f28701-71fd-49c4-8c07-760371aea50f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b4f28701-71fd-49c4-8c07-760371aea50f" (UID: "b4f28701-71fd-49c4-8c07-760371aea50f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 14:13:49 crc kubenswrapper[4902]: I1009 14:13:49.549227 4902 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4f28701-71fd-49c4-8c07-760371aea50f-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 09 14:13:49 crc kubenswrapper[4902]: I1009 14:13:49.549304 4902 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4f28701-71fd-49c4-8c07-760371aea50f-utilities\") on node \"crc\" DevicePath \"\"" Oct 09 14:13:49 crc kubenswrapper[4902]: I1009 14:13:49.549321 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6qgqg\" (UniqueName: \"kubernetes.io/projected/b4f28701-71fd-49c4-8c07-760371aea50f-kube-api-access-6qgqg\") on node \"crc\" DevicePath \"\"" Oct 09 14:13:49 crc kubenswrapper[4902]: I1009 14:13:49.811560 4902 generic.go:334] "Generic (PLEG): container finished" podID="b4f28701-71fd-49c4-8c07-760371aea50f" containerID="9d1e099b5da28faa6f652899f589897eec0e4ec81d89add01ada1df8308ef6ac" exitCode=0 Oct 09 14:13:49 crc kubenswrapper[4902]: I1009 14:13:49.811663 4902 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pgt9b" Oct 09 14:13:49 crc kubenswrapper[4902]: I1009 14:13:49.811654 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pgt9b" event={"ID":"b4f28701-71fd-49c4-8c07-760371aea50f","Type":"ContainerDied","Data":"9d1e099b5da28faa6f652899f589897eec0e4ec81d89add01ada1df8308ef6ac"} Oct 09 14:13:49 crc kubenswrapper[4902]: I1009 14:13:49.811817 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pgt9b" event={"ID":"b4f28701-71fd-49c4-8c07-760371aea50f","Type":"ContainerDied","Data":"15f555b1055ad7062b3059b3abca093e3b6f89aece1a25cf48ae5dbd0d06e497"} Oct 09 14:13:49 crc kubenswrapper[4902]: I1009 14:13:49.811840 4902 scope.go:117] "RemoveContainer" containerID="9d1e099b5da28faa6f652899f589897eec0e4ec81d89add01ada1df8308ef6ac" Oct 09 14:13:49 crc kubenswrapper[4902]: I1009 14:13:49.841611 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-pgt9b"] Oct 09 14:13:49 crc kubenswrapper[4902]: I1009 14:13:49.845491 4902 scope.go:117] "RemoveContainer" containerID="36a2d1947e469f58144860043420d26bcb9425ab9e4e3e2d990ea06c73eed187" Oct 09 14:13:49 crc kubenswrapper[4902]: I1009 14:13:49.849001 4902 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-pgt9b"] Oct 09 14:13:49 crc kubenswrapper[4902]: I1009 14:13:49.867901 4902 scope.go:117] "RemoveContainer" containerID="2a7c0651ee3c68003e11920397e918f1d23388005543970c5b12c1b670527974" Oct 09 14:13:49 crc kubenswrapper[4902]: I1009 14:13:49.917014 4902 scope.go:117] "RemoveContainer" containerID="9d1e099b5da28faa6f652899f589897eec0e4ec81d89add01ada1df8308ef6ac" Oct 09 14:13:49 crc kubenswrapper[4902]: E1009 14:13:49.917621 4902 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9d1e099b5da28faa6f652899f589897eec0e4ec81d89add01ada1df8308ef6ac\": container with ID starting with 9d1e099b5da28faa6f652899f589897eec0e4ec81d89add01ada1df8308ef6ac not found: ID does not exist" containerID="9d1e099b5da28faa6f652899f589897eec0e4ec81d89add01ada1df8308ef6ac" Oct 09 14:13:49 crc kubenswrapper[4902]: I1009 14:13:49.917658 4902 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d1e099b5da28faa6f652899f589897eec0e4ec81d89add01ada1df8308ef6ac"} err="failed to get container status \"9d1e099b5da28faa6f652899f589897eec0e4ec81d89add01ada1df8308ef6ac\": rpc error: code = NotFound desc = could not find container \"9d1e099b5da28faa6f652899f589897eec0e4ec81d89add01ada1df8308ef6ac\": container with ID starting with 9d1e099b5da28faa6f652899f589897eec0e4ec81d89add01ada1df8308ef6ac not found: ID does not exist" Oct 09 14:13:49 crc kubenswrapper[4902]: I1009 14:13:49.917681 4902 scope.go:117] "RemoveContainer" containerID="36a2d1947e469f58144860043420d26bcb9425ab9e4e3e2d990ea06c73eed187" Oct 09 14:13:49 crc kubenswrapper[4902]: E1009 14:13:49.918104 4902 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"36a2d1947e469f58144860043420d26bcb9425ab9e4e3e2d990ea06c73eed187\": container with ID starting with 36a2d1947e469f58144860043420d26bcb9425ab9e4e3e2d990ea06c73eed187 not found: ID does not exist" containerID="36a2d1947e469f58144860043420d26bcb9425ab9e4e3e2d990ea06c73eed187" Oct 09 14:13:49 crc kubenswrapper[4902]: I1009 14:13:49.918147 4902 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"36a2d1947e469f58144860043420d26bcb9425ab9e4e3e2d990ea06c73eed187"} err="failed to get container status \"36a2d1947e469f58144860043420d26bcb9425ab9e4e3e2d990ea06c73eed187\": rpc error: code = NotFound desc = could not find container \"36a2d1947e469f58144860043420d26bcb9425ab9e4e3e2d990ea06c73eed187\": container with ID starting with 36a2d1947e469f58144860043420d26bcb9425ab9e4e3e2d990ea06c73eed187 not found: ID does not exist" Oct 09 14:13:49 crc kubenswrapper[4902]: I1009 14:13:49.918174 4902 scope.go:117] "RemoveContainer" containerID="2a7c0651ee3c68003e11920397e918f1d23388005543970c5b12c1b670527974" Oct 09 14:13:49 crc kubenswrapper[4902]: E1009 14:13:49.918624 4902 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2a7c0651ee3c68003e11920397e918f1d23388005543970c5b12c1b670527974\": container with ID starting with 2a7c0651ee3c68003e11920397e918f1d23388005543970c5b12c1b670527974 not found: ID does not exist" containerID="2a7c0651ee3c68003e11920397e918f1d23388005543970c5b12c1b670527974" Oct 09 14:13:49 crc kubenswrapper[4902]: I1009 14:13:49.918660 4902 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a7c0651ee3c68003e11920397e918f1d23388005543970c5b12c1b670527974"} err="failed to get container status \"2a7c0651ee3c68003e11920397e918f1d23388005543970c5b12c1b670527974\": rpc error: code = NotFound desc = could not find container \"2a7c0651ee3c68003e11920397e918f1d23388005543970c5b12c1b670527974\": container with ID starting with 2a7c0651ee3c68003e11920397e918f1d23388005543970c5b12c1b670527974 not found: ID does not exist" Oct 09 14:13:51 crc kubenswrapper[4902]: I1009 14:13:51.528658 4902 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b4f28701-71fd-49c4-8c07-760371aea50f" path="/var/lib/kubelet/pods/b4f28701-71fd-49c4-8c07-760371aea50f/volumes" Oct 09 14:13:57 crc kubenswrapper[4902]: I1009 14:13:57.146232 4902 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-vb5rd" Oct 09 14:13:57 crc kubenswrapper[4902]: I1009 14:13:57.203810 4902 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-vb5rd" Oct 09 14:13:57 crc kubenswrapper[4902]: I1009 14:13:57.382323 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-vb5rd"] Oct 09 14:13:58 crc kubenswrapper[4902]: I1009 14:13:58.894784 4902 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-vb5rd" podUID="e57fe1c8-a794-46a8-97a8-ed76be62f12a" containerName="registry-server" containerID="cri-o://63842b0056c958f5377fa7f254cd1cdda39b64d5dc8282a94da3e5b4a3cca46d" gracePeriod=2 Oct 09 14:13:59 crc kubenswrapper[4902]: I1009 14:13:59.841313 4902 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vb5rd" Oct 09 14:13:59 crc kubenswrapper[4902]: I1009 14:13:59.906765 4902 generic.go:334] "Generic (PLEG): container finished" podID="e57fe1c8-a794-46a8-97a8-ed76be62f12a" containerID="63842b0056c958f5377fa7f254cd1cdda39b64d5dc8282a94da3e5b4a3cca46d" exitCode=0 Oct 09 14:13:59 crc kubenswrapper[4902]: I1009 14:13:59.906804 4902 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vb5rd" Oct 09 14:13:59 crc kubenswrapper[4902]: I1009 14:13:59.906815 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vb5rd" event={"ID":"e57fe1c8-a794-46a8-97a8-ed76be62f12a","Type":"ContainerDied","Data":"63842b0056c958f5377fa7f254cd1cdda39b64d5dc8282a94da3e5b4a3cca46d"} Oct 09 14:13:59 crc kubenswrapper[4902]: I1009 14:13:59.906842 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vb5rd" event={"ID":"e57fe1c8-a794-46a8-97a8-ed76be62f12a","Type":"ContainerDied","Data":"113cc6c5b96c22bda75e0c76a706dde17f91f92b17cf24f064be6485d55d3658"} Oct 09 14:13:59 crc kubenswrapper[4902]: I1009 14:13:59.906861 4902 scope.go:117] "RemoveContainer" containerID="63842b0056c958f5377fa7f254cd1cdda39b64d5dc8282a94da3e5b4a3cca46d" Oct 09 14:13:59 crc kubenswrapper[4902]: I1009 14:13:59.925740 4902 scope.go:117] "RemoveContainer" containerID="40e97616bfa4f3805b22a74b6461e3b79fc2baf9e9e13b79dcdb273088aa2b5d" Oct 09 14:13:59 crc kubenswrapper[4902]: I1009 14:13:59.939046 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vzgp5\" (UniqueName: \"kubernetes.io/projected/e57fe1c8-a794-46a8-97a8-ed76be62f12a-kube-api-access-vzgp5\") pod \"e57fe1c8-a794-46a8-97a8-ed76be62f12a\" (UID: \"e57fe1c8-a794-46a8-97a8-ed76be62f12a\") " Oct 09 14:13:59 crc kubenswrapper[4902]: I1009 14:13:59.939322 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e57fe1c8-a794-46a8-97a8-ed76be62f12a-catalog-content\") pod \"e57fe1c8-a794-46a8-97a8-ed76be62f12a\" (UID: \"e57fe1c8-a794-46a8-97a8-ed76be62f12a\") " Oct 09 14:13:59 crc kubenswrapper[4902]: I1009 14:13:59.939384 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e57fe1c8-a794-46a8-97a8-ed76be62f12a-utilities\") pod \"e57fe1c8-a794-46a8-97a8-ed76be62f12a\" (UID: \"e57fe1c8-a794-46a8-97a8-ed76be62f12a\") " Oct 09 14:13:59 crc kubenswrapper[4902]: I1009 14:13:59.940544 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e57fe1c8-a794-46a8-97a8-ed76be62f12a-utilities" 
(OuterVolumeSpecName: "utilities") pod "e57fe1c8-a794-46a8-97a8-ed76be62f12a" (UID: "e57fe1c8-a794-46a8-97a8-ed76be62f12a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 14:13:59 crc kubenswrapper[4902]: I1009 14:13:59.945001 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e57fe1c8-a794-46a8-97a8-ed76be62f12a-kube-api-access-vzgp5" (OuterVolumeSpecName: "kube-api-access-vzgp5") pod "e57fe1c8-a794-46a8-97a8-ed76be62f12a" (UID: "e57fe1c8-a794-46a8-97a8-ed76be62f12a"). InnerVolumeSpecName "kube-api-access-vzgp5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 14:13:59 crc kubenswrapper[4902]: I1009 14:13:59.949221 4902 scope.go:117] "RemoveContainer" containerID="ee90804f70bbf32b62197e529cf525dedbb3263f9f921bccd9f9c9ee5f498176" Oct 09 14:14:00 crc kubenswrapper[4902]: I1009 14:14:00.039330 4902 scope.go:117] "RemoveContainer" containerID="63842b0056c958f5377fa7f254cd1cdda39b64d5dc8282a94da3e5b4a3cca46d" Oct 09 14:14:00 crc kubenswrapper[4902]: E1009 14:14:00.039960 4902 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"63842b0056c958f5377fa7f254cd1cdda39b64d5dc8282a94da3e5b4a3cca46d\": container with ID starting with 63842b0056c958f5377fa7f254cd1cdda39b64d5dc8282a94da3e5b4a3cca46d not found: ID does not exist" containerID="63842b0056c958f5377fa7f254cd1cdda39b64d5dc8282a94da3e5b4a3cca46d" Oct 09 14:14:00 crc kubenswrapper[4902]: I1009 14:14:00.040005 4902 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"63842b0056c958f5377fa7f254cd1cdda39b64d5dc8282a94da3e5b4a3cca46d"} err="failed to get container status \"63842b0056c958f5377fa7f254cd1cdda39b64d5dc8282a94da3e5b4a3cca46d\": rpc error: code = NotFound desc = could not find container \"63842b0056c958f5377fa7f254cd1cdda39b64d5dc8282a94da3e5b4a3cca46d\": container with ID starting with 63842b0056c958f5377fa7f254cd1cdda39b64d5dc8282a94da3e5b4a3cca46d not found: ID does not exist" Oct 09 14:14:00 crc kubenswrapper[4902]: I1009 14:14:00.040032 4902 scope.go:117] "RemoveContainer" containerID="40e97616bfa4f3805b22a74b6461e3b79fc2baf9e9e13b79dcdb273088aa2b5d" Oct 09 14:14:00 crc kubenswrapper[4902]: I1009 14:14:00.040208 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e57fe1c8-a794-46a8-97a8-ed76be62f12a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e57fe1c8-a794-46a8-97a8-ed76be62f12a" (UID: "e57fe1c8-a794-46a8-97a8-ed76be62f12a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 14:14:00 crc kubenswrapper[4902]: E1009 14:14:00.040608 4902 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"40e97616bfa4f3805b22a74b6461e3b79fc2baf9e9e13b79dcdb273088aa2b5d\": container with ID starting with 40e97616bfa4f3805b22a74b6461e3b79fc2baf9e9e13b79dcdb273088aa2b5d not found: ID does not exist" containerID="40e97616bfa4f3805b22a74b6461e3b79fc2baf9e9e13b79dcdb273088aa2b5d" Oct 09 14:14:00 crc kubenswrapper[4902]: I1009 14:14:00.040668 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e57fe1c8-a794-46a8-97a8-ed76be62f12a-catalog-content\") pod \"e57fe1c8-a794-46a8-97a8-ed76be62f12a\" (UID: \"e57fe1c8-a794-46a8-97a8-ed76be62f12a\") " Oct 09 14:14:00 crc kubenswrapper[4902]: I1009 14:14:00.040666 4902 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"40e97616bfa4f3805b22a74b6461e3b79fc2baf9e9e13b79dcdb273088aa2b5d"} err="failed to get container status \"40e97616bfa4f3805b22a74b6461e3b79fc2baf9e9e13b79dcdb273088aa2b5d\": rpc error: code = NotFound desc = could not find container \"40e97616bfa4f3805b22a74b6461e3b79fc2baf9e9e13b79dcdb273088aa2b5d\": container with ID starting with 40e97616bfa4f3805b22a74b6461e3b79fc2baf9e9e13b79dcdb273088aa2b5d not found: ID does not exist" Oct 09 14:14:00 crc kubenswrapper[4902]: I1009 14:14:00.040704 4902 scope.go:117] "RemoveContainer" containerID="ee90804f70bbf32b62197e529cf525dedbb3263f9f921bccd9f9c9ee5f498176" Oct 09 14:14:00 crc kubenswrapper[4902]: W1009 14:14:00.040798 4902 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/e57fe1c8-a794-46a8-97a8-ed76be62f12a/volumes/kubernetes.io~empty-dir/catalog-content Oct 09 14:14:00 crc kubenswrapper[4902]: I1009 14:14:00.040820 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e57fe1c8-a794-46a8-97a8-ed76be62f12a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e57fe1c8-a794-46a8-97a8-ed76be62f12a" (UID: "e57fe1c8-a794-46a8-97a8-ed76be62f12a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 14:14:00 crc kubenswrapper[4902]: E1009 14:14:00.041280 4902 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ee90804f70bbf32b62197e529cf525dedbb3263f9f921bccd9f9c9ee5f498176\": container with ID starting with ee90804f70bbf32b62197e529cf525dedbb3263f9f921bccd9f9c9ee5f498176 not found: ID does not exist" containerID="ee90804f70bbf32b62197e529cf525dedbb3263f9f921bccd9f9c9ee5f498176" Oct 09 14:14:00 crc kubenswrapper[4902]: I1009 14:14:00.041364 4902 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee90804f70bbf32b62197e529cf525dedbb3263f9f921bccd9f9c9ee5f498176"} err="failed to get container status \"ee90804f70bbf32b62197e529cf525dedbb3263f9f921bccd9f9c9ee5f498176\": rpc error: code = NotFound desc = could not find container \"ee90804f70bbf32b62197e529cf525dedbb3263f9f921bccd9f9c9ee5f498176\": container with ID starting with ee90804f70bbf32b62197e529cf525dedbb3263f9f921bccd9f9c9ee5f498176 not found: ID does not exist" Oct 09 14:14:00 crc kubenswrapper[4902]: I1009 14:14:00.041378 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vzgp5\" (UniqueName: \"kubernetes.io/projected/e57fe1c8-a794-46a8-97a8-ed76be62f12a-kube-api-access-vzgp5\") on node \"crc\" DevicePath \"\"" Oct 09 14:14:00 crc kubenswrapper[4902]: I1009 14:14:00.041405 4902 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e57fe1c8-a794-46a8-97a8-ed76be62f12a-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 09 14:14:00 crc kubenswrapper[4902]: I1009 14:14:00.041433 4902 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e57fe1c8-a794-46a8-97a8-ed76be62f12a-utilities\") on node \"crc\" DevicePath \"\"" Oct 09 14:14:00 crc kubenswrapper[4902]: I1009 14:14:00.246681 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-vb5rd"] Oct 09 14:14:00 crc kubenswrapper[4902]: I1009 14:14:00.254194 4902 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-vb5rd"] Oct 09 14:14:01 crc kubenswrapper[4902]: I1009 14:14:01.524587 4902 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e57fe1c8-a794-46a8-97a8-ed76be62f12a" path="/var/lib/kubelet/pods/e57fe1c8-a794-46a8-97a8-ed76be62f12a/volumes" Oct 09 14:14:44 crc kubenswrapper[4902]: I1009 14:14:44.319833 4902 generic.go:334] "Generic (PLEG): container finished" podID="705cf92b-1b0d-4706-bf30-03fb1a9728cd" containerID="5c6b29ad409a3941e111c97a7abb45330c29133de24fa21b684d7e8ed2764c9e" exitCode=0 Oct 09 14:14:44 crc kubenswrapper[4902]: I1009 14:14:44.319892 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7x26n" event={"ID":"705cf92b-1b0d-4706-bf30-03fb1a9728cd","Type":"ContainerDied","Data":"5c6b29ad409a3941e111c97a7abb45330c29133de24fa21b684d7e8ed2764c9e"} Oct 09 14:14:45 crc kubenswrapper[4902]: I1009 14:14:45.741804 4902 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7x26n" Oct 09 14:14:45 crc kubenswrapper[4902]: I1009 14:14:45.799261 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/705cf92b-1b0d-4706-bf30-03fb1a9728cd-inventory\") pod \"705cf92b-1b0d-4706-bf30-03fb1a9728cd\" (UID: \"705cf92b-1b0d-4706-bf30-03fb1a9728cd\") " Oct 09 14:14:45 crc kubenswrapper[4902]: I1009 14:14:45.799661 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/705cf92b-1b0d-4706-bf30-03fb1a9728cd-ssh-key\") pod \"705cf92b-1b0d-4706-bf30-03fb1a9728cd\" (UID: \"705cf92b-1b0d-4706-bf30-03fb1a9728cd\") " Oct 09 14:14:45 crc kubenswrapper[4902]: I1009 14:14:45.799722 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8m9gz\" (UniqueName: \"kubernetes.io/projected/705cf92b-1b0d-4706-bf30-03fb1a9728cd-kube-api-access-8m9gz\") pod \"705cf92b-1b0d-4706-bf30-03fb1a9728cd\" (UID: \"705cf92b-1b0d-4706-bf30-03fb1a9728cd\") " Oct 09 14:14:45 crc kubenswrapper[4902]: I1009 14:14:45.799840 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/705cf92b-1b0d-4706-bf30-03fb1a9728cd-bootstrap-combined-ca-bundle\") pod \"705cf92b-1b0d-4706-bf30-03fb1a9728cd\" (UID: \"705cf92b-1b0d-4706-bf30-03fb1a9728cd\") " Oct 09 14:14:45 crc kubenswrapper[4902]: I1009 14:14:45.805443 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/705cf92b-1b0d-4706-bf30-03fb1a9728cd-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "705cf92b-1b0d-4706-bf30-03fb1a9728cd" (UID: "705cf92b-1b0d-4706-bf30-03fb1a9728cd"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:14:45 crc kubenswrapper[4902]: I1009 14:14:45.806039 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/705cf92b-1b0d-4706-bf30-03fb1a9728cd-kube-api-access-8m9gz" (OuterVolumeSpecName: "kube-api-access-8m9gz") pod "705cf92b-1b0d-4706-bf30-03fb1a9728cd" (UID: "705cf92b-1b0d-4706-bf30-03fb1a9728cd"). InnerVolumeSpecName "kube-api-access-8m9gz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 14:14:45 crc kubenswrapper[4902]: I1009 14:14:45.827796 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/705cf92b-1b0d-4706-bf30-03fb1a9728cd-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "705cf92b-1b0d-4706-bf30-03fb1a9728cd" (UID: "705cf92b-1b0d-4706-bf30-03fb1a9728cd"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:14:45 crc kubenswrapper[4902]: I1009 14:14:45.830461 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/705cf92b-1b0d-4706-bf30-03fb1a9728cd-inventory" (OuterVolumeSpecName: "inventory") pod "705cf92b-1b0d-4706-bf30-03fb1a9728cd" (UID: "705cf92b-1b0d-4706-bf30-03fb1a9728cd"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:14:45 crc kubenswrapper[4902]: I1009 14:14:45.902157 4902 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/705cf92b-1b0d-4706-bf30-03fb1a9728cd-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 14:14:45 crc kubenswrapper[4902]: I1009 14:14:45.902206 4902 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/705cf92b-1b0d-4706-bf30-03fb1a9728cd-inventory\") on node \"crc\" DevicePath \"\"" Oct 09 14:14:45 crc kubenswrapper[4902]: I1009 14:14:45.902221 4902 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/705cf92b-1b0d-4706-bf30-03fb1a9728cd-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 09 14:14:45 crc kubenswrapper[4902]: I1009 14:14:45.902233 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8m9gz\" (UniqueName: \"kubernetes.io/projected/705cf92b-1b0d-4706-bf30-03fb1a9728cd-kube-api-access-8m9gz\") on node \"crc\" DevicePath \"\"" Oct 09 14:14:46 crc kubenswrapper[4902]: I1009 14:14:46.342210 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7x26n" event={"ID":"705cf92b-1b0d-4706-bf30-03fb1a9728cd","Type":"ContainerDied","Data":"de72b74efcbfdfc125bb4179d6252d279537382234ef9145bb9c503b3d5c6480"} Oct 09 14:14:46 crc kubenswrapper[4902]: I1009 14:14:46.342258 4902 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="de72b74efcbfdfc125bb4179d6252d279537382234ef9145bb9c503b3d5c6480" Oct 09 14:14:46 crc kubenswrapper[4902]: I1009 14:14:46.342291 4902 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7x26n" Oct 09 14:14:46 crc kubenswrapper[4902]: I1009 14:14:46.417883 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-tbqw8"] Oct 09 14:14:46 crc kubenswrapper[4902]: E1009 14:14:46.418506 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4f28701-71fd-49c4-8c07-760371aea50f" containerName="registry-server" Oct 09 14:14:46 crc kubenswrapper[4902]: I1009 14:14:46.418532 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4f28701-71fd-49c4-8c07-760371aea50f" containerName="registry-server" Oct 09 14:14:46 crc kubenswrapper[4902]: E1009 14:14:46.418557 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cdc870f9-048c-40f8-902b-cd20b49b584c" containerName="extract-utilities" Oct 09 14:14:46 crc kubenswrapper[4902]: I1009 14:14:46.418565 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdc870f9-048c-40f8-902b-cd20b49b584c" containerName="extract-utilities" Oct 09 14:14:46 crc kubenswrapper[4902]: E1009 14:14:46.418587 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4f28701-71fd-49c4-8c07-760371aea50f" containerName="extract-utilities" Oct 09 14:14:46 crc kubenswrapper[4902]: I1009 14:14:46.418595 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4f28701-71fd-49c4-8c07-760371aea50f" containerName="extract-utilities" Oct 09 14:14:46 crc kubenswrapper[4902]: E1009 14:14:46.418606 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4f28701-71fd-49c4-8c07-760371aea50f" containerName="extract-content" Oct 09 14:14:46 crc kubenswrapper[4902]: I1009 14:14:46.418612 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4f28701-71fd-49c4-8c07-760371aea50f" containerName="extract-content" Oct 09 14:14:46 crc kubenswrapper[4902]: E1009 14:14:46.418630 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cdc870f9-048c-40f8-902b-cd20b49b584c" containerName="registry-server" Oct 09 14:14:46 crc kubenswrapper[4902]: I1009 14:14:46.418637 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdc870f9-048c-40f8-902b-cd20b49b584c" containerName="registry-server" Oct 09 14:14:46 crc kubenswrapper[4902]: E1009 14:14:46.418654 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e57fe1c8-a794-46a8-97a8-ed76be62f12a" containerName="extract-utilities" Oct 09 14:14:46 crc kubenswrapper[4902]: I1009 14:14:46.418662 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="e57fe1c8-a794-46a8-97a8-ed76be62f12a" containerName="extract-utilities" Oct 09 14:14:46 crc kubenswrapper[4902]: E1009 14:14:46.418671 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="705cf92b-1b0d-4706-bf30-03fb1a9728cd" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Oct 09 14:14:46 crc kubenswrapper[4902]: I1009 14:14:46.418680 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="705cf92b-1b0d-4706-bf30-03fb1a9728cd" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Oct 09 14:14:46 crc kubenswrapper[4902]: E1009 14:14:46.418700 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e57fe1c8-a794-46a8-97a8-ed76be62f12a" containerName="extract-content" Oct 09 14:14:46 crc kubenswrapper[4902]: I1009 14:14:46.418708 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="e57fe1c8-a794-46a8-97a8-ed76be62f12a" containerName="extract-content" Oct 09 14:14:46 crc 
kubenswrapper[4902]: E1009 14:14:46.418723 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cdc870f9-048c-40f8-902b-cd20b49b584c" containerName="extract-content" Oct 09 14:14:46 crc kubenswrapper[4902]: I1009 14:14:46.418730 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdc870f9-048c-40f8-902b-cd20b49b584c" containerName="extract-content" Oct 09 14:14:46 crc kubenswrapper[4902]: E1009 14:14:46.418742 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e57fe1c8-a794-46a8-97a8-ed76be62f12a" containerName="registry-server" Oct 09 14:14:46 crc kubenswrapper[4902]: I1009 14:14:46.418752 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="e57fe1c8-a794-46a8-97a8-ed76be62f12a" containerName="registry-server" Oct 09 14:14:46 crc kubenswrapper[4902]: I1009 14:14:46.418994 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4f28701-71fd-49c4-8c07-760371aea50f" containerName="registry-server" Oct 09 14:14:46 crc kubenswrapper[4902]: I1009 14:14:46.419014 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="e57fe1c8-a794-46a8-97a8-ed76be62f12a" containerName="registry-server" Oct 09 14:14:46 crc kubenswrapper[4902]: I1009 14:14:46.419028 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="705cf92b-1b0d-4706-bf30-03fb1a9728cd" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Oct 09 14:14:46 crc kubenswrapper[4902]: I1009 14:14:46.419043 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="cdc870f9-048c-40f8-902b-cd20b49b584c" containerName="registry-server" Oct 09 14:14:46 crc kubenswrapper[4902]: I1009 14:14:46.419845 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-tbqw8" Oct 09 14:14:46 crc kubenswrapper[4902]: I1009 14:14:46.421805 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-kjff2" Oct 09 14:14:46 crc kubenswrapper[4902]: I1009 14:14:46.422063 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 09 14:14:46 crc kubenswrapper[4902]: I1009 14:14:46.422746 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 09 14:14:46 crc kubenswrapper[4902]: I1009 14:14:46.422851 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 09 14:14:46 crc kubenswrapper[4902]: I1009 14:14:46.426338 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-tbqw8"] Oct 09 14:14:46 crc kubenswrapper[4902]: I1009 14:14:46.513608 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f53eb372-afb1-4f71-b4b3-eb4b36483e5e-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-tbqw8\" (UID: \"f53eb372-afb1-4f71-b4b3-eb4b36483e5e\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-tbqw8" Oct 09 14:14:46 crc kubenswrapper[4902]: I1009 14:14:46.513762 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-847nc\" (UniqueName: \"kubernetes.io/projected/f53eb372-afb1-4f71-b4b3-eb4b36483e5e-kube-api-access-847nc\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-tbqw8\" (UID: 
\"f53eb372-afb1-4f71-b4b3-eb4b36483e5e\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-tbqw8" Oct 09 14:14:46 crc kubenswrapper[4902]: I1009 14:14:46.513868 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f53eb372-afb1-4f71-b4b3-eb4b36483e5e-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-tbqw8\" (UID: \"f53eb372-afb1-4f71-b4b3-eb4b36483e5e\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-tbqw8" Oct 09 14:14:46 crc kubenswrapper[4902]: I1009 14:14:46.615881 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f53eb372-afb1-4f71-b4b3-eb4b36483e5e-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-tbqw8\" (UID: \"f53eb372-afb1-4f71-b4b3-eb4b36483e5e\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-tbqw8" Oct 09 14:14:46 crc kubenswrapper[4902]: I1009 14:14:46.616034 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-847nc\" (UniqueName: \"kubernetes.io/projected/f53eb372-afb1-4f71-b4b3-eb4b36483e5e-kube-api-access-847nc\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-tbqw8\" (UID: \"f53eb372-afb1-4f71-b4b3-eb4b36483e5e\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-tbqw8" Oct 09 14:14:46 crc kubenswrapper[4902]: I1009 14:14:46.616144 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f53eb372-afb1-4f71-b4b3-eb4b36483e5e-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-tbqw8\" (UID: \"f53eb372-afb1-4f71-b4b3-eb4b36483e5e\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-tbqw8" Oct 09 14:14:46 crc kubenswrapper[4902]: I1009 14:14:46.621348 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f53eb372-afb1-4f71-b4b3-eb4b36483e5e-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-tbqw8\" (UID: \"f53eb372-afb1-4f71-b4b3-eb4b36483e5e\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-tbqw8" Oct 09 14:14:46 crc kubenswrapper[4902]: I1009 14:14:46.622239 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f53eb372-afb1-4f71-b4b3-eb4b36483e5e-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-tbqw8\" (UID: \"f53eb372-afb1-4f71-b4b3-eb4b36483e5e\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-tbqw8" Oct 09 14:14:46 crc kubenswrapper[4902]: I1009 14:14:46.634901 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-847nc\" (UniqueName: \"kubernetes.io/projected/f53eb372-afb1-4f71-b4b3-eb4b36483e5e-kube-api-access-847nc\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-tbqw8\" (UID: \"f53eb372-afb1-4f71-b4b3-eb4b36483e5e\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-tbqw8" Oct 09 14:14:46 crc kubenswrapper[4902]: I1009 14:14:46.745139 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-tbqw8" Oct 09 14:14:47 crc kubenswrapper[4902]: I1009 14:14:47.252110 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-tbqw8"] Oct 09 14:14:47 crc kubenswrapper[4902]: I1009 14:14:47.352381 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-tbqw8" event={"ID":"f53eb372-afb1-4f71-b4b3-eb4b36483e5e","Type":"ContainerStarted","Data":"d4f2abec5ed006813fad0cdb0d9654f7b1274dcde768cc4254ba41a0623aebde"} Oct 09 14:14:48 crc kubenswrapper[4902]: I1009 14:14:48.363891 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-tbqw8" event={"ID":"f53eb372-afb1-4f71-b4b3-eb4b36483e5e","Type":"ContainerStarted","Data":"1b16acb66b4aa4a0d0d8b4ad8488ea55f8775307da842be9f820ee2bdd6126e1"} Oct 09 14:14:48 crc kubenswrapper[4902]: I1009 14:14:48.386205 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-tbqw8" podStartSLOduration=1.8367065660000002 podStartE2EDuration="2.386185523s" podCreationTimestamp="2025-10-09 14:14:46 +0000 UTC" firstStartedPulling="2025-10-09 14:14:47.254837191 +0000 UTC m=+1434.452696255" lastFinishedPulling="2025-10-09 14:14:47.804316148 +0000 UTC m=+1435.002175212" observedRunningTime="2025-10-09 14:14:48.377755681 +0000 UTC m=+1435.575614745" watchObservedRunningTime="2025-10-09 14:14:48.386185523 +0000 UTC m=+1435.584044587" Oct 09 14:15:00 crc kubenswrapper[4902]: I1009 14:15:00.136340 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29333655-vq8n5"] Oct 09 14:15:00 crc kubenswrapper[4902]: I1009 14:15:00.138537 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29333655-vq8n5" Oct 09 14:15:00 crc kubenswrapper[4902]: I1009 14:15:00.140908 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 09 14:15:00 crc kubenswrapper[4902]: I1009 14:15:00.141100 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 09 14:15:00 crc kubenswrapper[4902]: I1009 14:15:00.145462 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29333655-vq8n5"] Oct 09 14:15:00 crc kubenswrapper[4902]: I1009 14:15:00.199518 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/449abebe-b0c0-4f13-b153-05a82a45398b-config-volume\") pod \"collect-profiles-29333655-vq8n5\" (UID: \"449abebe-b0c0-4f13-b153-05a82a45398b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333655-vq8n5" Oct 09 14:15:00 crc kubenswrapper[4902]: I1009 14:15:00.199580 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7g5p4\" (UniqueName: \"kubernetes.io/projected/449abebe-b0c0-4f13-b153-05a82a45398b-kube-api-access-7g5p4\") pod \"collect-profiles-29333655-vq8n5\" (UID: \"449abebe-b0c0-4f13-b153-05a82a45398b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333655-vq8n5" Oct 09 14:15:00 crc kubenswrapper[4902]: I1009 14:15:00.200155 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/449abebe-b0c0-4f13-b153-05a82a45398b-secret-volume\") pod \"collect-profiles-29333655-vq8n5\" (UID: \"449abebe-b0c0-4f13-b153-05a82a45398b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333655-vq8n5" Oct 09 14:15:00 crc kubenswrapper[4902]: I1009 14:15:00.301708 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/449abebe-b0c0-4f13-b153-05a82a45398b-secret-volume\") pod \"collect-profiles-29333655-vq8n5\" (UID: \"449abebe-b0c0-4f13-b153-05a82a45398b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333655-vq8n5" Oct 09 14:15:00 crc kubenswrapper[4902]: I1009 14:15:00.301774 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/449abebe-b0c0-4f13-b153-05a82a45398b-config-volume\") pod \"collect-profiles-29333655-vq8n5\" (UID: \"449abebe-b0c0-4f13-b153-05a82a45398b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333655-vq8n5" Oct 09 14:15:00 crc kubenswrapper[4902]: I1009 14:15:00.301817 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7g5p4\" (UniqueName: \"kubernetes.io/projected/449abebe-b0c0-4f13-b153-05a82a45398b-kube-api-access-7g5p4\") pod \"collect-profiles-29333655-vq8n5\" (UID: \"449abebe-b0c0-4f13-b153-05a82a45398b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333655-vq8n5" Oct 09 14:15:00 crc kubenswrapper[4902]: I1009 14:15:00.302804 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/449abebe-b0c0-4f13-b153-05a82a45398b-config-volume\") pod 
\"collect-profiles-29333655-vq8n5\" (UID: \"449abebe-b0c0-4f13-b153-05a82a45398b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333655-vq8n5" Oct 09 14:15:00 crc kubenswrapper[4902]: I1009 14:15:00.317132 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/449abebe-b0c0-4f13-b153-05a82a45398b-secret-volume\") pod \"collect-profiles-29333655-vq8n5\" (UID: \"449abebe-b0c0-4f13-b153-05a82a45398b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333655-vq8n5" Oct 09 14:15:00 crc kubenswrapper[4902]: I1009 14:15:00.319295 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7g5p4\" (UniqueName: \"kubernetes.io/projected/449abebe-b0c0-4f13-b153-05a82a45398b-kube-api-access-7g5p4\") pod \"collect-profiles-29333655-vq8n5\" (UID: \"449abebe-b0c0-4f13-b153-05a82a45398b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333655-vq8n5" Oct 09 14:15:00 crc kubenswrapper[4902]: I1009 14:15:00.465247 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29333655-vq8n5" Oct 09 14:15:00 crc kubenswrapper[4902]: I1009 14:15:00.924569 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29333655-vq8n5"] Oct 09 14:15:01 crc kubenswrapper[4902]: I1009 14:15:01.476335 4902 generic.go:334] "Generic (PLEG): container finished" podID="449abebe-b0c0-4f13-b153-05a82a45398b" containerID="bf591366148f679e6374f08499c3a4597c8904fb029ecbc6007a72cb4f8ae723" exitCode=0 Oct 09 14:15:01 crc kubenswrapper[4902]: I1009 14:15:01.477423 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29333655-vq8n5" event={"ID":"449abebe-b0c0-4f13-b153-05a82a45398b","Type":"ContainerDied","Data":"bf591366148f679e6374f08499c3a4597c8904fb029ecbc6007a72cb4f8ae723"} Oct 09 14:15:01 crc kubenswrapper[4902]: I1009 14:15:01.477455 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29333655-vq8n5" event={"ID":"449abebe-b0c0-4f13-b153-05a82a45398b","Type":"ContainerStarted","Data":"7c87b00f9350ed56ba4817e79fb416079ee271721f47e9861d462add8cb3e620"} Oct 09 14:15:02 crc kubenswrapper[4902]: I1009 14:15:02.586733 4902 scope.go:117] "RemoveContainer" containerID="25be80322b10ec09e7df7ccfe8b41efe33db5305aea9af40873cfc9725d576a0" Oct 09 14:15:02 crc kubenswrapper[4902]: I1009 14:15:02.608891 4902 scope.go:117] "RemoveContainer" containerID="73223dc891fcb549c3e774588c1807e08b0523ac825c56d90381543ba1bfcdbe" Oct 09 14:15:02 crc kubenswrapper[4902]: I1009 14:15:02.639672 4902 scope.go:117] "RemoveContainer" containerID="a243c737804d7ac5228fbba62f60616b0aceb57e80070c119efd5e9023735c2e" Oct 09 14:15:02 crc kubenswrapper[4902]: I1009 14:15:02.812128 4902 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29333655-vq8n5" Oct 09 14:15:02 crc kubenswrapper[4902]: I1009 14:15:02.850272 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/449abebe-b0c0-4f13-b153-05a82a45398b-secret-volume\") pod \"449abebe-b0c0-4f13-b153-05a82a45398b\" (UID: \"449abebe-b0c0-4f13-b153-05a82a45398b\") " Oct 09 14:15:02 crc kubenswrapper[4902]: I1009 14:15:02.850374 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7g5p4\" (UniqueName: \"kubernetes.io/projected/449abebe-b0c0-4f13-b153-05a82a45398b-kube-api-access-7g5p4\") pod \"449abebe-b0c0-4f13-b153-05a82a45398b\" (UID: \"449abebe-b0c0-4f13-b153-05a82a45398b\") " Oct 09 14:15:02 crc kubenswrapper[4902]: I1009 14:15:02.850462 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/449abebe-b0c0-4f13-b153-05a82a45398b-config-volume\") pod \"449abebe-b0c0-4f13-b153-05a82a45398b\" (UID: \"449abebe-b0c0-4f13-b153-05a82a45398b\") " Oct 09 14:15:02 crc kubenswrapper[4902]: I1009 14:15:02.851283 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/449abebe-b0c0-4f13-b153-05a82a45398b-config-volume" (OuterVolumeSpecName: "config-volume") pod "449abebe-b0c0-4f13-b153-05a82a45398b" (UID: "449abebe-b0c0-4f13-b153-05a82a45398b"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 14:15:02 crc kubenswrapper[4902]: I1009 14:15:02.856938 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/449abebe-b0c0-4f13-b153-05a82a45398b-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "449abebe-b0c0-4f13-b153-05a82a45398b" (UID: "449abebe-b0c0-4f13-b153-05a82a45398b"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:15:02 crc kubenswrapper[4902]: I1009 14:15:02.856947 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/449abebe-b0c0-4f13-b153-05a82a45398b-kube-api-access-7g5p4" (OuterVolumeSpecName: "kube-api-access-7g5p4") pod "449abebe-b0c0-4f13-b153-05a82a45398b" (UID: "449abebe-b0c0-4f13-b153-05a82a45398b"). InnerVolumeSpecName "kube-api-access-7g5p4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 14:15:02 crc kubenswrapper[4902]: I1009 14:15:02.952182 4902 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/449abebe-b0c0-4f13-b153-05a82a45398b-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 09 14:15:02 crc kubenswrapper[4902]: I1009 14:15:02.952216 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7g5p4\" (UniqueName: \"kubernetes.io/projected/449abebe-b0c0-4f13-b153-05a82a45398b-kube-api-access-7g5p4\") on node \"crc\" DevicePath \"\"" Oct 09 14:15:02 crc kubenswrapper[4902]: I1009 14:15:02.952225 4902 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/449abebe-b0c0-4f13-b153-05a82a45398b-config-volume\") on node \"crc\" DevicePath \"\"" Oct 09 14:15:03 crc kubenswrapper[4902]: I1009 14:15:03.496573 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29333655-vq8n5" event={"ID":"449abebe-b0c0-4f13-b153-05a82a45398b","Type":"ContainerDied","Data":"7c87b00f9350ed56ba4817e79fb416079ee271721f47e9861d462add8cb3e620"} Oct 09 14:15:03 crc kubenswrapper[4902]: I1009 14:15:03.496619 4902 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7c87b00f9350ed56ba4817e79fb416079ee271721f47e9861d462add8cb3e620" Oct 09 14:15:03 crc kubenswrapper[4902]: I1009 14:15:03.496653 4902 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29333655-vq8n5" Oct 09 14:15:20 crc kubenswrapper[4902]: I1009 14:15:20.078440 4902 patch_prober.go:28] interesting pod/machine-config-daemon-gbt7s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 14:15:20 crc kubenswrapper[4902]: I1009 14:15:20.078925 4902 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" podUID="6cfbac91-e798-4e5e-9f3c-f454ea6f457e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 14:15:31 crc kubenswrapper[4902]: I1009 14:15:31.039456 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-kw64n"] Oct 09 14:15:31 crc kubenswrapper[4902]: I1009 14:15:31.048843 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-sg4gd"] Oct 09 14:15:31 crc kubenswrapper[4902]: I1009 14:15:31.057399 4902 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-sg4gd"] Oct 09 14:15:31 crc kubenswrapper[4902]: I1009 14:15:31.066018 4902 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-kw64n"] Oct 09 14:15:31 crc kubenswrapper[4902]: I1009 14:15:31.524801 4902 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="53238bce-0e1a-4c67-b5eb-ecc5387d41cf" path="/var/lib/kubelet/pods/53238bce-0e1a-4c67-b5eb-ecc5387d41cf/volumes" Oct 09 14:15:31 crc kubenswrapper[4902]: I1009 14:15:31.525826 4902 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e9ffdab1-dd2a-40c5-b12c-e5018656324a" path="/var/lib/kubelet/pods/e9ffdab1-dd2a-40c5-b12c-e5018656324a/volumes" Oct 09 14:15:36 
crc kubenswrapper[4902]: I1009 14:15:36.024544 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-n2lrz"] Oct 09 14:15:36 crc kubenswrapper[4902]: I1009 14:15:36.030568 4902 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-n2lrz"] Oct 09 14:15:37 crc kubenswrapper[4902]: I1009 14:15:37.526352 4902 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f2bd9da0-075a-4827-90bc-72c879d60820" path="/var/lib/kubelet/pods/f2bd9da0-075a-4827-90bc-72c879d60820/volumes" Oct 09 14:15:42 crc kubenswrapper[4902]: I1009 14:15:42.033603 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-07af-account-create-xjlvw"] Oct 09 14:15:42 crc kubenswrapper[4902]: I1009 14:15:42.041679 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-9532-account-create-sznbs"] Oct 09 14:15:42 crc kubenswrapper[4902]: I1009 14:15:42.049212 4902 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-9532-account-create-sznbs"] Oct 09 14:15:42 crc kubenswrapper[4902]: I1009 14:15:42.056615 4902 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-07af-account-create-xjlvw"] Oct 09 14:15:43 crc kubenswrapper[4902]: I1009 14:15:43.525933 4902 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="81276c5a-6547-4625-b10e-edb12dc107b9" path="/var/lib/kubelet/pods/81276c5a-6547-4625-b10e-edb12dc107b9/volumes" Oct 09 14:15:43 crc kubenswrapper[4902]: I1009 14:15:43.527514 4902 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed26f964-2a36-43d7-80f5-7d7e60a32e4f" path="/var/lib/kubelet/pods/ed26f964-2a36-43d7-80f5-7d7e60a32e4f/volumes" Oct 09 14:15:48 crc kubenswrapper[4902]: I1009 14:15:48.038021 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-104a-account-create-zfqcp"] Oct 09 14:15:48 crc kubenswrapper[4902]: I1009 14:15:48.045898 4902 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-104a-account-create-zfqcp"] Oct 09 14:15:49 crc kubenswrapper[4902]: I1009 14:15:49.523314 4902 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="48eb08a1-1c69-4180-a5aa-75ceb3fd6f41" path="/var/lib/kubelet/pods/48eb08a1-1c69-4180-a5aa-75ceb3fd6f41/volumes" Oct 09 14:15:50 crc kubenswrapper[4902]: I1009 14:15:50.078249 4902 patch_prober.go:28] interesting pod/machine-config-daemon-gbt7s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 14:15:50 crc kubenswrapper[4902]: I1009 14:15:50.078629 4902 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" podUID="6cfbac91-e798-4e5e-9f3c-f454ea6f457e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 14:16:02 crc kubenswrapper[4902]: I1009 14:16:02.708508 4902 scope.go:117] "RemoveContainer" containerID="d3d45dde663256231b9c482a7f7f17a7468937d327a40c16ae673923a4da10ec" Oct 09 14:16:02 crc kubenswrapper[4902]: I1009 14:16:02.738033 4902 scope.go:117] "RemoveContainer" containerID="03bbc2169c8034cc6260c16e440564a0e39c9784dc7d6f6e9a8ba63d7f106c78" Oct 09 14:16:02 crc kubenswrapper[4902]: I1009 14:16:02.778843 4902 scope.go:117] "RemoveContainer" 
containerID="06973ced0cf4365c382c831e1110cc9681973907c0defd082b9536f8146bd686" Oct 09 14:16:02 crc kubenswrapper[4902]: I1009 14:16:02.839994 4902 scope.go:117] "RemoveContainer" containerID="ee60c0a5698fea52999638b4ce5cd607a71c0c72a9b72fa6ea62ebd499ba3540" Oct 09 14:16:02 crc kubenswrapper[4902]: I1009 14:16:02.883812 4902 scope.go:117] "RemoveContainer" containerID="9a1fd6740cd26069d16c76e18dde3d6c92ef84e9731d12bd6aa3e2ddb25e3e55" Oct 09 14:16:02 crc kubenswrapper[4902]: I1009 14:16:02.925858 4902 scope.go:117] "RemoveContainer" containerID="1377593959dcd4fb8c1f19f71b7c020997bdc7f2a2df909808f0076f1a0248f4" Oct 09 14:16:06 crc kubenswrapper[4902]: I1009 14:16:06.033959 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-6rs8p"] Oct 09 14:16:06 crc kubenswrapper[4902]: I1009 14:16:06.044695 4902 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-6rs8p"] Oct 09 14:16:07 crc kubenswrapper[4902]: I1009 14:16:07.523628 4902 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="acda5664-02c5-48ec-92a8-b7f2274d34a7" path="/var/lib/kubelet/pods/acda5664-02c5-48ec-92a8-b7f2274d34a7/volumes" Oct 09 14:16:09 crc kubenswrapper[4902]: I1009 14:16:09.032000 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-qk2p7"] Oct 09 14:16:09 crc kubenswrapper[4902]: I1009 14:16:09.041365 4902 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-qk2p7"] Oct 09 14:16:09 crc kubenswrapper[4902]: I1009 14:16:09.050776 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-9q89d"] Oct 09 14:16:09 crc kubenswrapper[4902]: I1009 14:16:09.058075 4902 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-9q89d"] Oct 09 14:16:09 crc kubenswrapper[4902]: I1009 14:16:09.527876 4902 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="947813b1-569a-485d-80de-98259162031c" path="/var/lib/kubelet/pods/947813b1-569a-485d-80de-98259162031c/volumes" Oct 09 14:16:09 crc kubenswrapper[4902]: I1009 14:16:09.528593 4902 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca2b155e-e715-4cbd-8d0e-41944380c158" path="/var/lib/kubelet/pods/ca2b155e-e715-4cbd-8d0e-41944380c158/volumes" Oct 09 14:16:13 crc kubenswrapper[4902]: I1009 14:16:13.035169 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-xtkdj"] Oct 09 14:16:13 crc kubenswrapper[4902]: I1009 14:16:13.043689 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-9x8lm"] Oct 09 14:16:13 crc kubenswrapper[4902]: I1009 14:16:13.055847 4902 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-xtkdj"] Oct 09 14:16:13 crc kubenswrapper[4902]: I1009 14:16:13.063563 4902 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-9x8lm"] Oct 09 14:16:13 crc kubenswrapper[4902]: I1009 14:16:13.524372 4902 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="15cef403-03b5-432b-a7b4-c12a613e4d47" path="/var/lib/kubelet/pods/15cef403-03b5-432b-a7b4-c12a613e4d47/volumes" Oct 09 14:16:13 crc kubenswrapper[4902]: I1009 14:16:13.525392 4902 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="74606141-205f-4e63-9be4-1c7c57fecf3b" path="/var/lib/kubelet/pods/74606141-205f-4e63-9be4-1c7c57fecf3b/volumes" Oct 09 14:16:20 crc kubenswrapper[4902]: I1009 14:16:20.078202 4902 patch_prober.go:28] interesting 
pod/machine-config-daemon-gbt7s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 14:16:20 crc kubenswrapper[4902]: I1009 14:16:20.078736 4902 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" podUID="6cfbac91-e798-4e5e-9f3c-f454ea6f457e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 14:16:20 crc kubenswrapper[4902]: I1009 14:16:20.078796 4902 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" Oct 09 14:16:20 crc kubenswrapper[4902]: I1009 14:16:20.079860 4902 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"03a73372b10bf2278335edb0134be424dfca37b6e17ab0c52bfb42dc57295c45"} pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 09 14:16:20 crc kubenswrapper[4902]: I1009 14:16:20.079945 4902 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" podUID="6cfbac91-e798-4e5e-9f3c-f454ea6f457e" containerName="machine-config-daemon" containerID="cri-o://03a73372b10bf2278335edb0134be424dfca37b6e17ab0c52bfb42dc57295c45" gracePeriod=600 Oct 09 14:16:20 crc kubenswrapper[4902]: E1009 14:16:20.242438 4902 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gbt7s_openshift-machine-config-operator(6cfbac91-e798-4e5e-9f3c-f454ea6f457e)\"" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" podUID="6cfbac91-e798-4e5e-9f3c-f454ea6f457e" Oct 09 14:16:20 crc kubenswrapper[4902]: I1009 14:16:20.246806 4902 generic.go:334] "Generic (PLEG): container finished" podID="6cfbac91-e798-4e5e-9f3c-f454ea6f457e" containerID="03a73372b10bf2278335edb0134be424dfca37b6e17ab0c52bfb42dc57295c45" exitCode=0 Oct 09 14:16:20 crc kubenswrapper[4902]: I1009 14:16:20.246854 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" event={"ID":"6cfbac91-e798-4e5e-9f3c-f454ea6f457e","Type":"ContainerDied","Data":"03a73372b10bf2278335edb0134be424dfca37b6e17ab0c52bfb42dc57295c45"} Oct 09 14:16:20 crc kubenswrapper[4902]: I1009 14:16:20.246909 4902 scope.go:117] "RemoveContainer" containerID="810d12ae911af2f4b29c9130d86bd6cc100568c44cc3f31fb981a87e8efd1049" Oct 09 14:16:21 crc kubenswrapper[4902]: I1009 14:16:21.260112 4902 scope.go:117] "RemoveContainer" containerID="03a73372b10bf2278335edb0134be424dfca37b6e17ab0c52bfb42dc57295c45" Oct 09 14:16:21 crc kubenswrapper[4902]: E1009 14:16:21.260833 4902 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gbt7s_openshift-machine-config-operator(6cfbac91-e798-4e5e-9f3c-f454ea6f457e)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" podUID="6cfbac91-e798-4e5e-9f3c-f454ea6f457e" Oct 09 14:16:32 crc kubenswrapper[4902]: I1009 14:16:32.357057 4902 generic.go:334] "Generic (PLEG): container finished" podID="f53eb372-afb1-4f71-b4b3-eb4b36483e5e" containerID="1b16acb66b4aa4a0d0d8b4ad8488ea55f8775307da842be9f820ee2bdd6126e1" exitCode=0 Oct 09 14:16:32 crc kubenswrapper[4902]: I1009 14:16:32.357158 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-tbqw8" event={"ID":"f53eb372-afb1-4f71-b4b3-eb4b36483e5e","Type":"ContainerDied","Data":"1b16acb66b4aa4a0d0d8b4ad8488ea55f8775307da842be9f820ee2bdd6126e1"} Oct 09 14:16:33 crc kubenswrapper[4902]: I1009 14:16:33.031835 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-fce1-account-create-8hwdj"] Oct 09 14:16:33 crc kubenswrapper[4902]: I1009 14:16:33.042199 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-4963-account-create-z5jmr"] Oct 09 14:16:33 crc kubenswrapper[4902]: I1009 14:16:33.053708 4902 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-fce1-account-create-8hwdj"] Oct 09 14:16:33 crc kubenswrapper[4902]: I1009 14:16:33.060515 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-97aa-account-create-tzndb"] Oct 09 14:16:33 crc kubenswrapper[4902]: I1009 14:16:33.067273 4902 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-4963-account-create-z5jmr"] Oct 09 14:16:33 crc kubenswrapper[4902]: I1009 14:16:33.074078 4902 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-97aa-account-create-tzndb"] Oct 09 14:16:33 crc kubenswrapper[4902]: I1009 14:16:33.549363 4902 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="453bc6c7-4789-47fb-a83d-21062c6069dd" path="/var/lib/kubelet/pods/453bc6c7-4789-47fb-a83d-21062c6069dd/volumes" Oct 09 14:16:33 crc kubenswrapper[4902]: I1009 14:16:33.550091 4902 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b125855d-ef08-4d86-b4ca-a89b06964590" path="/var/lib/kubelet/pods/b125855d-ef08-4d86-b4ca-a89b06964590/volumes" Oct 09 14:16:33 crc kubenswrapper[4902]: I1009 14:16:33.550639 4902 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf729a68-4f96-4839-8c65-8be1543d04da" path="/var/lib/kubelet/pods/cf729a68-4f96-4839-8c65-8be1543d04da/volumes" Oct 09 14:16:33 crc kubenswrapper[4902]: I1009 14:16:33.793694 4902 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-tbqw8" Oct 09 14:16:33 crc kubenswrapper[4902]: I1009 14:16:33.880152 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f53eb372-afb1-4f71-b4b3-eb4b36483e5e-ssh-key\") pod \"f53eb372-afb1-4f71-b4b3-eb4b36483e5e\" (UID: \"f53eb372-afb1-4f71-b4b3-eb4b36483e5e\") " Oct 09 14:16:33 crc kubenswrapper[4902]: I1009 14:16:33.880467 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f53eb372-afb1-4f71-b4b3-eb4b36483e5e-inventory\") pod \"f53eb372-afb1-4f71-b4b3-eb4b36483e5e\" (UID: \"f53eb372-afb1-4f71-b4b3-eb4b36483e5e\") " Oct 09 14:16:33 crc kubenswrapper[4902]: I1009 14:16:33.880505 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-847nc\" (UniqueName: \"kubernetes.io/projected/f53eb372-afb1-4f71-b4b3-eb4b36483e5e-kube-api-access-847nc\") pod \"f53eb372-afb1-4f71-b4b3-eb4b36483e5e\" (UID: \"f53eb372-afb1-4f71-b4b3-eb4b36483e5e\") " Oct 09 14:16:33 crc kubenswrapper[4902]: I1009 14:16:33.887116 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f53eb372-afb1-4f71-b4b3-eb4b36483e5e-kube-api-access-847nc" (OuterVolumeSpecName: "kube-api-access-847nc") pod "f53eb372-afb1-4f71-b4b3-eb4b36483e5e" (UID: "f53eb372-afb1-4f71-b4b3-eb4b36483e5e"). InnerVolumeSpecName "kube-api-access-847nc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 14:16:33 crc kubenswrapper[4902]: I1009 14:16:33.914123 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f53eb372-afb1-4f71-b4b3-eb4b36483e5e-inventory" (OuterVolumeSpecName: "inventory") pod "f53eb372-afb1-4f71-b4b3-eb4b36483e5e" (UID: "f53eb372-afb1-4f71-b4b3-eb4b36483e5e"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:16:33 crc kubenswrapper[4902]: I1009 14:16:33.919925 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f53eb372-afb1-4f71-b4b3-eb4b36483e5e-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "f53eb372-afb1-4f71-b4b3-eb4b36483e5e" (UID: "f53eb372-afb1-4f71-b4b3-eb4b36483e5e"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:16:33 crc kubenswrapper[4902]: I1009 14:16:33.982448 4902 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f53eb372-afb1-4f71-b4b3-eb4b36483e5e-inventory\") on node \"crc\" DevicePath \"\"" Oct 09 14:16:33 crc kubenswrapper[4902]: I1009 14:16:33.982502 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-847nc\" (UniqueName: \"kubernetes.io/projected/f53eb372-afb1-4f71-b4b3-eb4b36483e5e-kube-api-access-847nc\") on node \"crc\" DevicePath \"\"" Oct 09 14:16:33 crc kubenswrapper[4902]: I1009 14:16:33.982516 4902 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f53eb372-afb1-4f71-b4b3-eb4b36483e5e-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 09 14:16:34 crc kubenswrapper[4902]: I1009 14:16:34.374163 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-tbqw8" event={"ID":"f53eb372-afb1-4f71-b4b3-eb4b36483e5e","Type":"ContainerDied","Data":"d4f2abec5ed006813fad0cdb0d9654f7b1274dcde768cc4254ba41a0623aebde"} Oct 09 14:16:34 crc kubenswrapper[4902]: I1009 14:16:34.374240 4902 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d4f2abec5ed006813fad0cdb0d9654f7b1274dcde768cc4254ba41a0623aebde" Oct 09 14:16:34 crc kubenswrapper[4902]: I1009 14:16:34.374210 4902 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-tbqw8" Oct 09 14:16:34 crc kubenswrapper[4902]: I1009 14:16:34.458312 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-w7kbg"] Oct 09 14:16:34 crc kubenswrapper[4902]: E1009 14:16:34.459125 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="449abebe-b0c0-4f13-b153-05a82a45398b" containerName="collect-profiles" Oct 09 14:16:34 crc kubenswrapper[4902]: I1009 14:16:34.459157 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="449abebe-b0c0-4f13-b153-05a82a45398b" containerName="collect-profiles" Oct 09 14:16:34 crc kubenswrapper[4902]: E1009 14:16:34.459230 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f53eb372-afb1-4f71-b4b3-eb4b36483e5e" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Oct 09 14:16:34 crc kubenswrapper[4902]: I1009 14:16:34.459243 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="f53eb372-afb1-4f71-b4b3-eb4b36483e5e" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Oct 09 14:16:34 crc kubenswrapper[4902]: I1009 14:16:34.459508 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="f53eb372-afb1-4f71-b4b3-eb4b36483e5e" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Oct 09 14:16:34 crc kubenswrapper[4902]: I1009 14:16:34.459564 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="449abebe-b0c0-4f13-b153-05a82a45398b" containerName="collect-profiles" Oct 09 14:16:34 crc kubenswrapper[4902]: I1009 14:16:34.460747 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-w7kbg" Oct 09 14:16:34 crc kubenswrapper[4902]: I1009 14:16:34.463359 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 09 14:16:34 crc kubenswrapper[4902]: I1009 14:16:34.463611 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 09 14:16:34 crc kubenswrapper[4902]: I1009 14:16:34.464008 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-kjff2" Oct 09 14:16:34 crc kubenswrapper[4902]: I1009 14:16:34.464653 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 09 14:16:34 crc kubenswrapper[4902]: I1009 14:16:34.480149 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-w7kbg"] Oct 09 14:16:34 crc kubenswrapper[4902]: I1009 14:16:34.595878 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lb962\" (UniqueName: \"kubernetes.io/projected/18dcb9ba-f068-421a-a11c-f25f2b7c940a-kube-api-access-lb962\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-w7kbg\" (UID: \"18dcb9ba-f068-421a-a11c-f25f2b7c940a\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-w7kbg" Oct 09 14:16:34 crc kubenswrapper[4902]: I1009 14:16:34.595927 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/18dcb9ba-f068-421a-a11c-f25f2b7c940a-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-w7kbg\" (UID: \"18dcb9ba-f068-421a-a11c-f25f2b7c940a\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-w7kbg" Oct 09 14:16:34 crc kubenswrapper[4902]: I1009 14:16:34.595964 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/18dcb9ba-f068-421a-a11c-f25f2b7c940a-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-w7kbg\" (UID: \"18dcb9ba-f068-421a-a11c-f25f2b7c940a\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-w7kbg" Oct 09 14:16:34 crc kubenswrapper[4902]: I1009 14:16:34.698227 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lb962\" (UniqueName: \"kubernetes.io/projected/18dcb9ba-f068-421a-a11c-f25f2b7c940a-kube-api-access-lb962\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-w7kbg\" (UID: \"18dcb9ba-f068-421a-a11c-f25f2b7c940a\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-w7kbg" Oct 09 14:16:34 crc kubenswrapper[4902]: I1009 14:16:34.698301 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/18dcb9ba-f068-421a-a11c-f25f2b7c940a-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-w7kbg\" (UID: \"18dcb9ba-f068-421a-a11c-f25f2b7c940a\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-w7kbg" Oct 09 14:16:34 crc kubenswrapper[4902]: I1009 14:16:34.698376 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/18dcb9ba-f068-421a-a11c-f25f2b7c940a-inventory\") 
pod \"configure-network-edpm-deployment-openstack-edpm-ipam-w7kbg\" (UID: \"18dcb9ba-f068-421a-a11c-f25f2b7c940a\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-w7kbg" Oct 09 14:16:34 crc kubenswrapper[4902]: I1009 14:16:34.705228 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/18dcb9ba-f068-421a-a11c-f25f2b7c940a-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-w7kbg\" (UID: \"18dcb9ba-f068-421a-a11c-f25f2b7c940a\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-w7kbg" Oct 09 14:16:34 crc kubenswrapper[4902]: I1009 14:16:34.708886 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/18dcb9ba-f068-421a-a11c-f25f2b7c940a-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-w7kbg\" (UID: \"18dcb9ba-f068-421a-a11c-f25f2b7c940a\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-w7kbg" Oct 09 14:16:34 crc kubenswrapper[4902]: I1009 14:16:34.716393 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lb962\" (UniqueName: \"kubernetes.io/projected/18dcb9ba-f068-421a-a11c-f25f2b7c940a-kube-api-access-lb962\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-w7kbg\" (UID: \"18dcb9ba-f068-421a-a11c-f25f2b7c940a\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-w7kbg" Oct 09 14:16:34 crc kubenswrapper[4902]: I1009 14:16:34.791906 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-w7kbg" Oct 09 14:16:35 crc kubenswrapper[4902]: I1009 14:16:35.318994 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-w7kbg"] Oct 09 14:16:35 crc kubenswrapper[4902]: I1009 14:16:35.325350 4902 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 09 14:16:35 crc kubenswrapper[4902]: I1009 14:16:35.384688 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-w7kbg" event={"ID":"18dcb9ba-f068-421a-a11c-f25f2b7c940a","Type":"ContainerStarted","Data":"0401511d390a1da131d8562174885454590bb2901380143209cee596765e89a4"} Oct 09 14:16:36 crc kubenswrapper[4902]: I1009 14:16:36.513502 4902 scope.go:117] "RemoveContainer" containerID="03a73372b10bf2278335edb0134be424dfca37b6e17ab0c52bfb42dc57295c45" Oct 09 14:16:36 crc kubenswrapper[4902]: E1009 14:16:36.514138 4902 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gbt7s_openshift-machine-config-operator(6cfbac91-e798-4e5e-9f3c-f454ea6f457e)\"" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" podUID="6cfbac91-e798-4e5e-9f3c-f454ea6f457e" Oct 09 14:16:37 crc kubenswrapper[4902]: I1009 14:16:37.407662 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-w7kbg" event={"ID":"18dcb9ba-f068-421a-a11c-f25f2b7c940a","Type":"ContainerStarted","Data":"32a917ae5b4732d4987449b7170d0613b6b36a70d18d0764dd991e5a66cd77cf"} Oct 09 14:16:37 crc kubenswrapper[4902]: I1009 14:16:37.432747 4902 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-w7kbg" podStartSLOduration=2.13381238 podStartE2EDuration="3.432727656s" podCreationTimestamp="2025-10-09 14:16:34 +0000 UTC" firstStartedPulling="2025-10-09 14:16:35.323862879 +0000 UTC m=+1542.521721943" lastFinishedPulling="2025-10-09 14:16:36.622778155 +0000 UTC m=+1543.820637219" observedRunningTime="2025-10-09 14:16:37.429839063 +0000 UTC m=+1544.627698137" watchObservedRunningTime="2025-10-09 14:16:37.432727656 +0000 UTC m=+1544.630586720" Oct 09 14:16:41 crc kubenswrapper[4902]: I1009 14:16:41.031236 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-74gvb"] Oct 09 14:16:41 crc kubenswrapper[4902]: I1009 14:16:41.040872 4902 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-74gvb"] Oct 09 14:16:41 crc kubenswrapper[4902]: I1009 14:16:41.532021 4902 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9fd56c3-4837-4fad-b343-b2edc61b0605" path="/var/lib/kubelet/pods/f9fd56c3-4837-4fad-b343-b2edc61b0605/volumes" Oct 09 14:16:50 crc kubenswrapper[4902]: I1009 14:16:50.513268 4902 scope.go:117] "RemoveContainer" containerID="03a73372b10bf2278335edb0134be424dfca37b6e17ab0c52bfb42dc57295c45" Oct 09 14:16:50 crc kubenswrapper[4902]: E1009 14:16:50.514030 4902 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gbt7s_openshift-machine-config-operator(6cfbac91-e798-4e5e-9f3c-f454ea6f457e)\"" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" podUID="6cfbac91-e798-4e5e-9f3c-f454ea6f457e" Oct 09 14:17:03 crc kubenswrapper[4902]: I1009 14:17:03.073150 4902 scope.go:117] "RemoveContainer" containerID="2d4ad1f7db7bac1656d7fa585602c537d6b102be11fcf14fc440f3379faa01c9" Oct 09 14:17:03 crc kubenswrapper[4902]: I1009 14:17:03.115253 4902 scope.go:117] "RemoveContainer" containerID="54b7945ee3849ebd9c01e5165f770199cb6deab03768d690792ab8b6342d1f91" Oct 09 14:17:03 crc kubenswrapper[4902]: I1009 14:17:03.147982 4902 scope.go:117] "RemoveContainer" containerID="7b0306fdcaa0ccafb2093670cdd941dc2050fd83511ceab9217072e1024c84e5" Oct 09 14:17:03 crc kubenswrapper[4902]: I1009 14:17:03.193230 4902 scope.go:117] "RemoveContainer" containerID="0d4a3a589386a9cd13bfb09da356a1cfd0ca577ef1346af683beed1ceea21658" Oct 09 14:17:03 crc kubenswrapper[4902]: I1009 14:17:03.239431 4902 scope.go:117] "RemoveContainer" containerID="967817fe37cdb809a8b18c6629dfe923d474e3695ea7664a196320318d0a9207" Oct 09 14:17:03 crc kubenswrapper[4902]: I1009 14:17:03.300638 4902 scope.go:117] "RemoveContainer" containerID="7c6f35af69c72387851257c9f1dcd774f87345a47f1009133d736adfd6ae360d" Oct 09 14:17:03 crc kubenswrapper[4902]: I1009 14:17:03.330023 4902 scope.go:117] "RemoveContainer" containerID="6e4af23529a1f854346f6beaf5c6da3edbcffa61f984196643c2956871072757" Oct 09 14:17:03 crc kubenswrapper[4902]: I1009 14:17:03.352217 4902 scope.go:117] "RemoveContainer" containerID="d65f098b3327a5bc6b4c231a31a7ae200c84332a143b46c70378caac2afd7525" Oct 09 14:17:03 crc kubenswrapper[4902]: I1009 14:17:03.379154 4902 scope.go:117] "RemoveContainer" containerID="a42c93937cbc00fea898e805942c7cf81cb448aac4ebe11387f3d94204535416" Oct 09 14:17:04 crc kubenswrapper[4902]: I1009 14:17:04.513071 4902 scope.go:117] "RemoveContainer" 
containerID="03a73372b10bf2278335edb0134be424dfca37b6e17ab0c52bfb42dc57295c45" Oct 09 14:17:04 crc kubenswrapper[4902]: E1009 14:17:04.513336 4902 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gbt7s_openshift-machine-config-operator(6cfbac91-e798-4e5e-9f3c-f454ea6f457e)\"" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" podUID="6cfbac91-e798-4e5e-9f3c-f454ea6f457e" Oct 09 14:17:07 crc kubenswrapper[4902]: I1009 14:17:07.045229 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-jqrks"] Oct 09 14:17:07 crc kubenswrapper[4902]: I1009 14:17:07.053968 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-r8v96"] Oct 09 14:17:07 crc kubenswrapper[4902]: I1009 14:17:07.063078 4902 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-r8v96"] Oct 09 14:17:07 crc kubenswrapper[4902]: I1009 14:17:07.070797 4902 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-jqrks"] Oct 09 14:17:07 crc kubenswrapper[4902]: I1009 14:17:07.531863 4902 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b9c3435-39c9-4af3-bbf9-70faafc22a3e" path="/var/lib/kubelet/pods/0b9c3435-39c9-4af3-bbf9-70faafc22a3e/volumes" Oct 09 14:17:07 crc kubenswrapper[4902]: I1009 14:17:07.532533 4902 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d3545e73-7c0f-4e1a-b012-da9e7f35a0b5" path="/var/lib/kubelet/pods/d3545e73-7c0f-4e1a-b012-da9e7f35a0b5/volumes" Oct 09 14:17:16 crc kubenswrapper[4902]: I1009 14:17:16.513468 4902 scope.go:117] "RemoveContainer" containerID="03a73372b10bf2278335edb0134be424dfca37b6e17ab0c52bfb42dc57295c45" Oct 09 14:17:16 crc kubenswrapper[4902]: E1009 14:17:16.514294 4902 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gbt7s_openshift-machine-config-operator(6cfbac91-e798-4e5e-9f3c-f454ea6f457e)\"" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" podUID="6cfbac91-e798-4e5e-9f3c-f454ea6f457e" Oct 09 14:17:27 crc kubenswrapper[4902]: I1009 14:17:27.038684 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-92hj5"] Oct 09 14:17:27 crc kubenswrapper[4902]: I1009 14:17:27.048022 4902 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-92hj5"] Oct 09 14:17:27 crc kubenswrapper[4902]: I1009 14:17:27.524654 4902 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4015da27-1c19-4eb1-af33-74e182b53aa3" path="/var/lib/kubelet/pods/4015da27-1c19-4eb1-af33-74e182b53aa3/volumes" Oct 09 14:17:29 crc kubenswrapper[4902]: I1009 14:17:29.045904 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-4fhrn"] Oct 09 14:17:29 crc kubenswrapper[4902]: I1009 14:17:29.068690 4902 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-4fhrn"] Oct 09 14:17:29 crc kubenswrapper[4902]: I1009 14:17:29.513076 4902 scope.go:117] "RemoveContainer" containerID="03a73372b10bf2278335edb0134be424dfca37b6e17ab0c52bfb42dc57295c45" Oct 09 14:17:29 crc kubenswrapper[4902]: E1009 14:17:29.513325 4902 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gbt7s_openshift-machine-config-operator(6cfbac91-e798-4e5e-9f3c-f454ea6f457e)\"" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" podUID="6cfbac91-e798-4e5e-9f3c-f454ea6f457e" Oct 09 14:17:29 crc kubenswrapper[4902]: I1009 14:17:29.524070 4902 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e0ea72ba-e66d-4400-a1d9-8d4cb8a94e87" path="/var/lib/kubelet/pods/e0ea72ba-e66d-4400-a1d9-8d4cb8a94e87/volumes" Oct 09 14:17:34 crc kubenswrapper[4902]: I1009 14:17:34.031513 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-jn679"] Oct 09 14:17:34 crc kubenswrapper[4902]: I1009 14:17:34.039733 4902 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-jn679"] Oct 09 14:17:35 crc kubenswrapper[4902]: I1009 14:17:35.029263 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-nr94d"] Oct 09 14:17:35 crc kubenswrapper[4902]: I1009 14:17:35.038889 4902 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-nr94d"] Oct 09 14:17:35 crc kubenswrapper[4902]: I1009 14:17:35.524078 4902 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="80011324-6959-416f-a2e0-1ea80e7b32fd" path="/var/lib/kubelet/pods/80011324-6959-416f-a2e0-1ea80e7b32fd/volumes" Oct 09 14:17:35 crc kubenswrapper[4902]: I1009 14:17:35.524642 4902 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ace7c5ce-f1fc-4600-9764-0e357b80f849" path="/var/lib/kubelet/pods/ace7c5ce-f1fc-4600-9764-0e357b80f849/volumes" Oct 09 14:17:37 crc kubenswrapper[4902]: I1009 14:17:37.028308 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-zjfzf"] Oct 09 14:17:37 crc kubenswrapper[4902]: I1009 14:17:37.037571 4902 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-zjfzf"] Oct 09 14:17:37 crc kubenswrapper[4902]: I1009 14:17:37.540547 4902 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9bdf36f9-3d35-4ccd-9d32-86c5ab08b65a" path="/var/lib/kubelet/pods/9bdf36f9-3d35-4ccd-9d32-86c5ab08b65a/volumes" Oct 09 14:17:43 crc kubenswrapper[4902]: I1009 14:17:43.519214 4902 scope.go:117] "RemoveContainer" containerID="03a73372b10bf2278335edb0134be424dfca37b6e17ab0c52bfb42dc57295c45" Oct 09 14:17:43 crc kubenswrapper[4902]: E1009 14:17:43.521500 4902 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gbt7s_openshift-machine-config-operator(6cfbac91-e798-4e5e-9f3c-f454ea6f457e)\"" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" podUID="6cfbac91-e798-4e5e-9f3c-f454ea6f457e" Oct 09 14:17:45 crc kubenswrapper[4902]: I1009 14:17:45.036667 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-b8a5-account-create-6m6lx"] Oct 09 14:17:45 crc kubenswrapper[4902]: I1009 14:17:45.045120 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-b741-account-create-22xhv"] Oct 09 14:17:45 crc kubenswrapper[4902]: I1009 14:17:45.054194 4902 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-b8a5-account-create-6m6lx"] Oct 09 14:17:45 crc 
kubenswrapper[4902]: I1009 14:17:45.063152 4902 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-b741-account-create-22xhv"] Oct 09 14:17:45 crc kubenswrapper[4902]: I1009 14:17:45.525638 4902 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="21db4847-19e9-4e2e-b745-319609bc39e7" path="/var/lib/kubelet/pods/21db4847-19e9-4e2e-b745-319609bc39e7/volumes" Oct 09 14:17:45 crc kubenswrapper[4902]: I1009 14:17:45.526251 4902 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d22a652a-afff-4bc3-ad24-30d9abfa577f" path="/var/lib/kubelet/pods/d22a652a-afff-4bc3-ad24-30d9abfa577f/volumes" Oct 09 14:17:55 crc kubenswrapper[4902]: I1009 14:17:55.031455 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-b001-account-create-lmts2"] Oct 09 14:17:55 crc kubenswrapper[4902]: I1009 14:17:55.042675 4902 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-b001-account-create-lmts2"] Oct 09 14:17:55 crc kubenswrapper[4902]: I1009 14:17:55.525577 4902 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2e924c89-5324-4bc9-9e4d-9005d6d1257d" path="/var/lib/kubelet/pods/2e924c89-5324-4bc9-9e4d-9005d6d1257d/volumes" Oct 09 14:17:56 crc kubenswrapper[4902]: I1009 14:17:56.109426 4902 generic.go:334] "Generic (PLEG): container finished" podID="18dcb9ba-f068-421a-a11c-f25f2b7c940a" containerID="32a917ae5b4732d4987449b7170d0613b6b36a70d18d0764dd991e5a66cd77cf" exitCode=0 Oct 09 14:17:56 crc kubenswrapper[4902]: I1009 14:17:56.109449 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-w7kbg" event={"ID":"18dcb9ba-f068-421a-a11c-f25f2b7c940a","Type":"ContainerDied","Data":"32a917ae5b4732d4987449b7170d0613b6b36a70d18d0764dd991e5a66cd77cf"} Oct 09 14:17:57 crc kubenswrapper[4902]: I1009 14:17:57.577976 4902 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-w7kbg" Oct 09 14:17:57 crc kubenswrapper[4902]: I1009 14:17:57.624101 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/18dcb9ba-f068-421a-a11c-f25f2b7c940a-inventory\") pod \"18dcb9ba-f068-421a-a11c-f25f2b7c940a\" (UID: \"18dcb9ba-f068-421a-a11c-f25f2b7c940a\") " Oct 09 14:17:57 crc kubenswrapper[4902]: I1009 14:17:57.624190 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/18dcb9ba-f068-421a-a11c-f25f2b7c940a-ssh-key\") pod \"18dcb9ba-f068-421a-a11c-f25f2b7c940a\" (UID: \"18dcb9ba-f068-421a-a11c-f25f2b7c940a\") " Oct 09 14:17:57 crc kubenswrapper[4902]: I1009 14:17:57.624251 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lb962\" (UniqueName: \"kubernetes.io/projected/18dcb9ba-f068-421a-a11c-f25f2b7c940a-kube-api-access-lb962\") pod \"18dcb9ba-f068-421a-a11c-f25f2b7c940a\" (UID: \"18dcb9ba-f068-421a-a11c-f25f2b7c940a\") " Oct 09 14:17:57 crc kubenswrapper[4902]: I1009 14:17:57.629875 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18dcb9ba-f068-421a-a11c-f25f2b7c940a-kube-api-access-lb962" (OuterVolumeSpecName: "kube-api-access-lb962") pod "18dcb9ba-f068-421a-a11c-f25f2b7c940a" (UID: "18dcb9ba-f068-421a-a11c-f25f2b7c940a"). InnerVolumeSpecName "kube-api-access-lb962". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 14:17:57 crc kubenswrapper[4902]: I1009 14:17:57.654219 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18dcb9ba-f068-421a-a11c-f25f2b7c940a-inventory" (OuterVolumeSpecName: "inventory") pod "18dcb9ba-f068-421a-a11c-f25f2b7c940a" (UID: "18dcb9ba-f068-421a-a11c-f25f2b7c940a"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:17:57 crc kubenswrapper[4902]: I1009 14:17:57.656453 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18dcb9ba-f068-421a-a11c-f25f2b7c940a-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "18dcb9ba-f068-421a-a11c-f25f2b7c940a" (UID: "18dcb9ba-f068-421a-a11c-f25f2b7c940a"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:17:57 crc kubenswrapper[4902]: I1009 14:17:57.726223 4902 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/18dcb9ba-f068-421a-a11c-f25f2b7c940a-inventory\") on node \"crc\" DevicePath \"\"" Oct 09 14:17:57 crc kubenswrapper[4902]: I1009 14:17:57.726269 4902 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/18dcb9ba-f068-421a-a11c-f25f2b7c940a-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 09 14:17:57 crc kubenswrapper[4902]: I1009 14:17:57.726286 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lb962\" (UniqueName: \"kubernetes.io/projected/18dcb9ba-f068-421a-a11c-f25f2b7c940a-kube-api-access-lb962\") on node \"crc\" DevicePath \"\"" Oct 09 14:17:58 crc kubenswrapper[4902]: I1009 14:17:58.134471 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-w7kbg" event={"ID":"18dcb9ba-f068-421a-a11c-f25f2b7c940a","Type":"ContainerDied","Data":"0401511d390a1da131d8562174885454590bb2901380143209cee596765e89a4"} Oct 09 14:17:58 crc kubenswrapper[4902]: I1009 14:17:58.134517 4902 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0401511d390a1da131d8562174885454590bb2901380143209cee596765e89a4" Oct 09 14:17:58 crc kubenswrapper[4902]: I1009 14:17:58.134569 4902 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-w7kbg" Oct 09 14:17:58 crc kubenswrapper[4902]: I1009 14:17:58.239343 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-q62lf"] Oct 09 14:17:58 crc kubenswrapper[4902]: E1009 14:17:58.239807 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18dcb9ba-f068-421a-a11c-f25f2b7c940a" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Oct 09 14:17:58 crc kubenswrapper[4902]: I1009 14:17:58.239828 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="18dcb9ba-f068-421a-a11c-f25f2b7c940a" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Oct 09 14:17:58 crc kubenswrapper[4902]: I1009 14:17:58.240074 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="18dcb9ba-f068-421a-a11c-f25f2b7c940a" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Oct 09 14:17:58 crc kubenswrapper[4902]: I1009 14:17:58.240833 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-q62lf" Oct 09 14:17:58 crc kubenswrapper[4902]: I1009 14:17:58.244083 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-kjff2" Oct 09 14:17:58 crc kubenswrapper[4902]: I1009 14:17:58.244262 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 09 14:17:58 crc kubenswrapper[4902]: I1009 14:17:58.247783 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 09 14:17:58 crc kubenswrapper[4902]: I1009 14:17:58.248506 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 09 14:17:58 crc kubenswrapper[4902]: I1009 14:17:58.263558 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-q62lf"] Oct 09 14:17:58 crc kubenswrapper[4902]: I1009 14:17:58.356916 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/17bda176-986b-468d-b839-a58df9d3cf58-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-q62lf\" (UID: \"17bda176-986b-468d-b839-a58df9d3cf58\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-q62lf" Oct 09 14:17:58 crc kubenswrapper[4902]: I1009 14:17:58.357464 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4zgc\" (UniqueName: \"kubernetes.io/projected/17bda176-986b-468d-b839-a58df9d3cf58-kube-api-access-k4zgc\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-q62lf\" (UID: \"17bda176-986b-468d-b839-a58df9d3cf58\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-q62lf" Oct 09 14:17:58 crc kubenswrapper[4902]: I1009 14:17:58.357852 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/17bda176-986b-468d-b839-a58df9d3cf58-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-q62lf\" (UID: \"17bda176-986b-468d-b839-a58df9d3cf58\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-q62lf" Oct 09 14:17:58 crc kubenswrapper[4902]: I1009 14:17:58.459396 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/17bda176-986b-468d-b839-a58df9d3cf58-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-q62lf\" (UID: \"17bda176-986b-468d-b839-a58df9d3cf58\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-q62lf" Oct 09 14:17:58 crc kubenswrapper[4902]: I1009 14:17:58.459519 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k4zgc\" (UniqueName: \"kubernetes.io/projected/17bda176-986b-468d-b839-a58df9d3cf58-kube-api-access-k4zgc\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-q62lf\" (UID: \"17bda176-986b-468d-b839-a58df9d3cf58\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-q62lf" Oct 09 14:17:58 crc kubenswrapper[4902]: I1009 14:17:58.459608 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/17bda176-986b-468d-b839-a58df9d3cf58-inventory\") pod 
\"validate-network-edpm-deployment-openstack-edpm-ipam-q62lf\" (UID: \"17bda176-986b-468d-b839-a58df9d3cf58\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-q62lf" Oct 09 14:17:58 crc kubenswrapper[4902]: I1009 14:17:58.464255 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/17bda176-986b-468d-b839-a58df9d3cf58-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-q62lf\" (UID: \"17bda176-986b-468d-b839-a58df9d3cf58\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-q62lf" Oct 09 14:17:58 crc kubenswrapper[4902]: I1009 14:17:58.475384 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/17bda176-986b-468d-b839-a58df9d3cf58-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-q62lf\" (UID: \"17bda176-986b-468d-b839-a58df9d3cf58\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-q62lf" Oct 09 14:17:58 crc kubenswrapper[4902]: I1009 14:17:58.478181 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k4zgc\" (UniqueName: \"kubernetes.io/projected/17bda176-986b-468d-b839-a58df9d3cf58-kube-api-access-k4zgc\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-q62lf\" (UID: \"17bda176-986b-468d-b839-a58df9d3cf58\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-q62lf" Oct 09 14:17:58 crc kubenswrapper[4902]: I1009 14:17:58.513727 4902 scope.go:117] "RemoveContainer" containerID="03a73372b10bf2278335edb0134be424dfca37b6e17ab0c52bfb42dc57295c45" Oct 09 14:17:58 crc kubenswrapper[4902]: E1009 14:17:58.514167 4902 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gbt7s_openshift-machine-config-operator(6cfbac91-e798-4e5e-9f3c-f454ea6f457e)\"" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" podUID="6cfbac91-e798-4e5e-9f3c-f454ea6f457e" Oct 09 14:17:58 crc kubenswrapper[4902]: I1009 14:17:58.561359 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-q62lf" Oct 09 14:17:59 crc kubenswrapper[4902]: I1009 14:17:59.262522 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-q62lf"] Oct 09 14:17:59 crc kubenswrapper[4902]: W1009 14:17:59.269572 4902 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod17bda176_986b_468d_b839_a58df9d3cf58.slice/crio-c352d8050cf4f90e1284cdc4baa38095a63e8d8ec5a18666d09c3b0b5f0721a4 WatchSource:0}: Error finding container c352d8050cf4f90e1284cdc4baa38095a63e8d8ec5a18666d09c3b0b5f0721a4: Status 404 returned error can't find the container with id c352d8050cf4f90e1284cdc4baa38095a63e8d8ec5a18666d09c3b0b5f0721a4 Oct 09 14:18:00 crc kubenswrapper[4902]: I1009 14:18:00.156244 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-q62lf" event={"ID":"17bda176-986b-468d-b839-a58df9d3cf58","Type":"ContainerStarted","Data":"c352d8050cf4f90e1284cdc4baa38095a63e8d8ec5a18666d09c3b0b5f0721a4"} Oct 09 14:18:01 crc kubenswrapper[4902]: I1009 14:18:01.167558 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-q62lf" event={"ID":"17bda176-986b-468d-b839-a58df9d3cf58","Type":"ContainerStarted","Data":"9477ea552652036da80c6c575c89f61685e348d40351a8d94a142bd8996a9599"} Oct 09 14:18:01 crc kubenswrapper[4902]: I1009 14:18:01.192291 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-q62lf" podStartSLOduration=1.683853542 podStartE2EDuration="3.192272201s" podCreationTimestamp="2025-10-09 14:17:58 +0000 UTC" firstStartedPulling="2025-10-09 14:17:59.274658907 +0000 UTC m=+1626.472517971" lastFinishedPulling="2025-10-09 14:18:00.783077566 +0000 UTC m=+1627.980936630" observedRunningTime="2025-10-09 14:18:01.187592933 +0000 UTC m=+1628.385452007" watchObservedRunningTime="2025-10-09 14:18:01.192272201 +0000 UTC m=+1628.390131265" Oct 09 14:18:03 crc kubenswrapper[4902]: I1009 14:18:03.587637 4902 scope.go:117] "RemoveContainer" containerID="dcf43d2ded8473b0968d7a945e64b57ce1fd50b9d7f3fe4750e499d4f452f884" Oct 09 14:18:03 crc kubenswrapper[4902]: I1009 14:18:03.614311 4902 scope.go:117] "RemoveContainer" containerID="3699212da17a1fe693ce60b6829ec80bec60e69618797d80f1b8a3a680dab960" Oct 09 14:18:03 crc kubenswrapper[4902]: I1009 14:18:03.682898 4902 scope.go:117] "RemoveContainer" containerID="589e134088889030f30963da599dce0475033fd7998c03b92d4dd6135c809d8a" Oct 09 14:18:03 crc kubenswrapper[4902]: I1009 14:18:03.780601 4902 scope.go:117] "RemoveContainer" containerID="18cb5b774c2f10daf84263ccd477516848d97f22989e76a725ce94bca73ec75c" Oct 09 14:18:03 crc kubenswrapper[4902]: I1009 14:18:03.805474 4902 scope.go:117] "RemoveContainer" containerID="fed6f04acf067026499dd56f180834119dc95b524b9117a96995cbf1bba9ddb4" Oct 09 14:18:03 crc kubenswrapper[4902]: I1009 14:18:03.855163 4902 scope.go:117] "RemoveContainer" containerID="3c7bb102af997ffe7fa43482ff49863637e171d1dfcf85b433bb21af2e67ed44" Oct 09 14:18:03 crc kubenswrapper[4902]: I1009 14:18:03.905076 4902 scope.go:117] "RemoveContainer" containerID="2514a005b3622e668714ceb7bbaddf3c2ce91f45b6051763bcf877e9258ca8de" Oct 09 14:18:03 crc kubenswrapper[4902]: I1009 14:18:03.924316 4902 scope.go:117] "RemoveContainer" 
containerID="88e6cbbc2443b0e5c8d486696e9f4c9d25b6b1ac059f60994a39dd1719be60d8" Oct 09 14:18:03 crc kubenswrapper[4902]: I1009 14:18:03.944746 4902 scope.go:117] "RemoveContainer" containerID="4e4e54afcc6992a1cdedb8011a0ce92f11d88459d6c67a20faf151cff2071790" Oct 09 14:18:03 crc kubenswrapper[4902]: I1009 14:18:03.975437 4902 scope.go:117] "RemoveContainer" containerID="3912a83ba98f7248edcd6a5b32482a4371d739c557735a8b3836099aaec0d311" Oct 09 14:18:06 crc kubenswrapper[4902]: I1009 14:18:06.224777 4902 generic.go:334] "Generic (PLEG): container finished" podID="17bda176-986b-468d-b839-a58df9d3cf58" containerID="9477ea552652036da80c6c575c89f61685e348d40351a8d94a142bd8996a9599" exitCode=0 Oct 09 14:18:06 crc kubenswrapper[4902]: I1009 14:18:06.225023 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-q62lf" event={"ID":"17bda176-986b-468d-b839-a58df9d3cf58","Type":"ContainerDied","Data":"9477ea552652036da80c6c575c89f61685e348d40351a8d94a142bd8996a9599"} Oct 09 14:18:07 crc kubenswrapper[4902]: I1009 14:18:07.695577 4902 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-q62lf" Oct 09 14:18:07 crc kubenswrapper[4902]: I1009 14:18:07.750631 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k4zgc\" (UniqueName: \"kubernetes.io/projected/17bda176-986b-468d-b839-a58df9d3cf58-kube-api-access-k4zgc\") pod \"17bda176-986b-468d-b839-a58df9d3cf58\" (UID: \"17bda176-986b-468d-b839-a58df9d3cf58\") " Oct 09 14:18:07 crc kubenswrapper[4902]: I1009 14:18:07.751121 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/17bda176-986b-468d-b839-a58df9d3cf58-ssh-key\") pod \"17bda176-986b-468d-b839-a58df9d3cf58\" (UID: \"17bda176-986b-468d-b839-a58df9d3cf58\") " Oct 09 14:18:07 crc kubenswrapper[4902]: I1009 14:18:07.751198 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/17bda176-986b-468d-b839-a58df9d3cf58-inventory\") pod \"17bda176-986b-468d-b839-a58df9d3cf58\" (UID: \"17bda176-986b-468d-b839-a58df9d3cf58\") " Oct 09 14:18:07 crc kubenswrapper[4902]: I1009 14:18:07.757574 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17bda176-986b-468d-b839-a58df9d3cf58-kube-api-access-k4zgc" (OuterVolumeSpecName: "kube-api-access-k4zgc") pod "17bda176-986b-468d-b839-a58df9d3cf58" (UID: "17bda176-986b-468d-b839-a58df9d3cf58"). InnerVolumeSpecName "kube-api-access-k4zgc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 14:18:07 crc kubenswrapper[4902]: I1009 14:18:07.781284 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17bda176-986b-468d-b839-a58df9d3cf58-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "17bda176-986b-468d-b839-a58df9d3cf58" (UID: "17bda176-986b-468d-b839-a58df9d3cf58"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:18:07 crc kubenswrapper[4902]: I1009 14:18:07.782975 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17bda176-986b-468d-b839-a58df9d3cf58-inventory" (OuterVolumeSpecName: "inventory") pod "17bda176-986b-468d-b839-a58df9d3cf58" (UID: "17bda176-986b-468d-b839-a58df9d3cf58"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:18:07 crc kubenswrapper[4902]: I1009 14:18:07.853544 4902 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/17bda176-986b-468d-b839-a58df9d3cf58-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 09 14:18:07 crc kubenswrapper[4902]: I1009 14:18:07.853716 4902 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/17bda176-986b-468d-b839-a58df9d3cf58-inventory\") on node \"crc\" DevicePath \"\"" Oct 09 14:18:07 crc kubenswrapper[4902]: I1009 14:18:07.853796 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k4zgc\" (UniqueName: \"kubernetes.io/projected/17bda176-986b-468d-b839-a58df9d3cf58-kube-api-access-k4zgc\") on node \"crc\" DevicePath \"\"" Oct 09 14:18:08 crc kubenswrapper[4902]: I1009 14:18:08.244000 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-q62lf" event={"ID":"17bda176-986b-468d-b839-a58df9d3cf58","Type":"ContainerDied","Data":"c352d8050cf4f90e1284cdc4baa38095a63e8d8ec5a18666d09c3b0b5f0721a4"} Oct 09 14:18:08 crc kubenswrapper[4902]: I1009 14:18:08.244044 4902 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-q62lf" Oct 09 14:18:08 crc kubenswrapper[4902]: I1009 14:18:08.244050 4902 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c352d8050cf4f90e1284cdc4baa38095a63e8d8ec5a18666d09c3b0b5f0721a4" Oct 09 14:18:08 crc kubenswrapper[4902]: I1009 14:18:08.351658 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-lsz7p"] Oct 09 14:18:08 crc kubenswrapper[4902]: E1009 14:18:08.359871 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17bda176-986b-468d-b839-a58df9d3cf58" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Oct 09 14:18:08 crc kubenswrapper[4902]: I1009 14:18:08.359917 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="17bda176-986b-468d-b839-a58df9d3cf58" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Oct 09 14:18:08 crc kubenswrapper[4902]: I1009 14:18:08.360163 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="17bda176-986b-468d-b839-a58df9d3cf58" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Oct 09 14:18:08 crc kubenswrapper[4902]: I1009 14:18:08.360984 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-lsz7p" Oct 09 14:18:08 crc kubenswrapper[4902]: I1009 14:18:08.366059 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 09 14:18:08 crc kubenswrapper[4902]: I1009 14:18:08.366299 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 09 14:18:08 crc kubenswrapper[4902]: I1009 14:18:08.366439 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 09 14:18:08 crc kubenswrapper[4902]: I1009 14:18:08.367284 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-kjff2" Oct 09 14:18:08 crc kubenswrapper[4902]: I1009 14:18:08.377148 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-lsz7p"] Oct 09 14:18:08 crc kubenswrapper[4902]: I1009 14:18:08.467067 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9dd56ad2-b36d-4850-81d2-b06db395ecd6-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-lsz7p\" (UID: \"9dd56ad2-b36d-4850-81d2-b06db395ecd6\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-lsz7p" Oct 09 14:18:08 crc kubenswrapper[4902]: I1009 14:18:08.467380 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qpzt2\" (UniqueName: \"kubernetes.io/projected/9dd56ad2-b36d-4850-81d2-b06db395ecd6-kube-api-access-qpzt2\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-lsz7p\" (UID: \"9dd56ad2-b36d-4850-81d2-b06db395ecd6\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-lsz7p" Oct 09 14:18:08 crc kubenswrapper[4902]: I1009 14:18:08.467592 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9dd56ad2-b36d-4850-81d2-b06db395ecd6-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-lsz7p\" (UID: \"9dd56ad2-b36d-4850-81d2-b06db395ecd6\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-lsz7p" Oct 09 14:18:08 crc kubenswrapper[4902]: I1009 14:18:08.569881 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9dd56ad2-b36d-4850-81d2-b06db395ecd6-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-lsz7p\" (UID: \"9dd56ad2-b36d-4850-81d2-b06db395ecd6\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-lsz7p" Oct 09 14:18:08 crc kubenswrapper[4902]: I1009 14:18:08.570059 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qpzt2\" (UniqueName: \"kubernetes.io/projected/9dd56ad2-b36d-4850-81d2-b06db395ecd6-kube-api-access-qpzt2\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-lsz7p\" (UID: \"9dd56ad2-b36d-4850-81d2-b06db395ecd6\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-lsz7p" Oct 09 14:18:08 crc kubenswrapper[4902]: I1009 14:18:08.570095 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9dd56ad2-b36d-4850-81d2-b06db395ecd6-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-lsz7p\" (UID: 
\"9dd56ad2-b36d-4850-81d2-b06db395ecd6\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-lsz7p" Oct 09 14:18:08 crc kubenswrapper[4902]: I1009 14:18:08.573735 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9dd56ad2-b36d-4850-81d2-b06db395ecd6-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-lsz7p\" (UID: \"9dd56ad2-b36d-4850-81d2-b06db395ecd6\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-lsz7p" Oct 09 14:18:08 crc kubenswrapper[4902]: I1009 14:18:08.580052 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9dd56ad2-b36d-4850-81d2-b06db395ecd6-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-lsz7p\" (UID: \"9dd56ad2-b36d-4850-81d2-b06db395ecd6\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-lsz7p" Oct 09 14:18:08 crc kubenswrapper[4902]: I1009 14:18:08.588081 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qpzt2\" (UniqueName: \"kubernetes.io/projected/9dd56ad2-b36d-4850-81d2-b06db395ecd6-kube-api-access-qpzt2\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-lsz7p\" (UID: \"9dd56ad2-b36d-4850-81d2-b06db395ecd6\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-lsz7p" Oct 09 14:18:08 crc kubenswrapper[4902]: I1009 14:18:08.686195 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-lsz7p" Oct 09 14:18:09 crc kubenswrapper[4902]: I1009 14:18:09.234800 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-lsz7p"] Oct 09 14:18:09 crc kubenswrapper[4902]: I1009 14:18:09.252490 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-lsz7p" event={"ID":"9dd56ad2-b36d-4850-81d2-b06db395ecd6","Type":"ContainerStarted","Data":"31186678f433f3d8de71f0753e54b2ff3f290de311f1631a7ce0cc6957ebd46e"} Oct 09 14:18:12 crc kubenswrapper[4902]: I1009 14:18:12.279205 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-lsz7p" event={"ID":"9dd56ad2-b36d-4850-81d2-b06db395ecd6","Type":"ContainerStarted","Data":"da50097d842556663067655c537e9cbfd0f7f09adfa61c7b9303c66d995f7793"} Oct 09 14:18:12 crc kubenswrapper[4902]: I1009 14:18:12.301188 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-lsz7p" podStartSLOduration=2.626961927 podStartE2EDuration="4.301164093s" podCreationTimestamp="2025-10-09 14:18:08 +0000 UTC" firstStartedPulling="2025-10-09 14:18:09.240141847 +0000 UTC m=+1636.438000911" lastFinishedPulling="2025-10-09 14:18:10.914344013 +0000 UTC m=+1638.112203077" observedRunningTime="2025-10-09 14:18:12.295436954 +0000 UTC m=+1639.493296038" watchObservedRunningTime="2025-10-09 14:18:12.301164093 +0000 UTC m=+1639.499023167" Oct 09 14:18:12 crc kubenswrapper[4902]: I1009 14:18:12.513505 4902 scope.go:117] "RemoveContainer" containerID="03a73372b10bf2278335edb0134be424dfca37b6e17ab0c52bfb42dc57295c45" Oct 09 14:18:12 crc kubenswrapper[4902]: E1009 14:18:12.514000 4902 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-gbt7s_openshift-machine-config-operator(6cfbac91-e798-4e5e-9f3c-f454ea6f457e)\"" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" podUID="6cfbac91-e798-4e5e-9f3c-f454ea6f457e" Oct 09 14:18:15 crc kubenswrapper[4902]: I1009 14:18:15.051475 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-bs8sb"] Oct 09 14:18:15 crc kubenswrapper[4902]: I1009 14:18:15.065390 4902 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-bs8sb"] Oct 09 14:18:15 crc kubenswrapper[4902]: I1009 14:18:15.525302 4902 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cbeddafe-9239-4c02-8f7d-f4a341ee97b1" path="/var/lib/kubelet/pods/cbeddafe-9239-4c02-8f7d-f4a341ee97b1/volumes" Oct 09 14:18:25 crc kubenswrapper[4902]: I1009 14:18:25.514108 4902 scope.go:117] "RemoveContainer" containerID="03a73372b10bf2278335edb0134be424dfca37b6e17ab0c52bfb42dc57295c45" Oct 09 14:18:25 crc kubenswrapper[4902]: E1009 14:18:25.514919 4902 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gbt7s_openshift-machine-config-operator(6cfbac91-e798-4e5e-9f3c-f454ea6f457e)\"" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" podUID="6cfbac91-e798-4e5e-9f3c-f454ea6f457e" Oct 09 14:18:36 crc kubenswrapper[4902]: I1009 14:18:36.513536 4902 scope.go:117] "RemoveContainer" containerID="03a73372b10bf2278335edb0134be424dfca37b6e17ab0c52bfb42dc57295c45" Oct 09 14:18:36 crc kubenswrapper[4902]: E1009 14:18:36.514347 4902 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gbt7s_openshift-machine-config-operator(6cfbac91-e798-4e5e-9f3c-f454ea6f457e)\"" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" podUID="6cfbac91-e798-4e5e-9f3c-f454ea6f457e" Oct 09 14:18:38 crc kubenswrapper[4902]: I1009 14:18:38.028581 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-qc4zz"] Oct 09 14:18:38 crc kubenswrapper[4902]: I1009 14:18:38.036501 4902 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-qc4zz"] Oct 09 14:18:39 crc kubenswrapper[4902]: I1009 14:18:39.523332 4902 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1e9b526e-bd2b-4710-9225-669035576d7c" path="/var/lib/kubelet/pods/1e9b526e-bd2b-4710-9225-669035576d7c/volumes" Oct 09 14:18:40 crc kubenswrapper[4902]: I1009 14:18:40.028823 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-ssjrs"] Oct 09 14:18:40 crc kubenswrapper[4902]: I1009 14:18:40.036154 4902 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-ssjrs"] Oct 09 14:18:41 crc kubenswrapper[4902]: I1009 14:18:41.525202 4902 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cc21b931-be3c-442f-b886-3f245f88e079" path="/var/lib/kubelet/pods/cc21b931-be3c-442f-b886-3f245f88e079/volumes" Oct 09 14:18:47 crc kubenswrapper[4902]: I1009 14:18:47.513647 4902 scope.go:117] "RemoveContainer" containerID="03a73372b10bf2278335edb0134be424dfca37b6e17ab0c52bfb42dc57295c45" Oct 09 14:18:47 
crc kubenswrapper[4902]: E1009 14:18:47.514540 4902 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gbt7s_openshift-machine-config-operator(6cfbac91-e798-4e5e-9f3c-f454ea6f457e)\"" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" podUID="6cfbac91-e798-4e5e-9f3c-f454ea6f457e" Oct 09 14:18:48 crc kubenswrapper[4902]: I1009 14:18:48.585140 4902 generic.go:334] "Generic (PLEG): container finished" podID="9dd56ad2-b36d-4850-81d2-b06db395ecd6" containerID="da50097d842556663067655c537e9cbfd0f7f09adfa61c7b9303c66d995f7793" exitCode=0 Oct 09 14:18:48 crc kubenswrapper[4902]: I1009 14:18:48.585219 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-lsz7p" event={"ID":"9dd56ad2-b36d-4850-81d2-b06db395ecd6","Type":"ContainerDied","Data":"da50097d842556663067655c537e9cbfd0f7f09adfa61c7b9303c66d995f7793"} Oct 09 14:18:50 crc kubenswrapper[4902]: I1009 14:18:50.041291 4902 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-lsz7p" Oct 09 14:18:50 crc kubenswrapper[4902]: I1009 14:18:50.084822 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qpzt2\" (UniqueName: \"kubernetes.io/projected/9dd56ad2-b36d-4850-81d2-b06db395ecd6-kube-api-access-qpzt2\") pod \"9dd56ad2-b36d-4850-81d2-b06db395ecd6\" (UID: \"9dd56ad2-b36d-4850-81d2-b06db395ecd6\") " Oct 09 14:18:50 crc kubenswrapper[4902]: I1009 14:18:50.084896 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9dd56ad2-b36d-4850-81d2-b06db395ecd6-ssh-key\") pod \"9dd56ad2-b36d-4850-81d2-b06db395ecd6\" (UID: \"9dd56ad2-b36d-4850-81d2-b06db395ecd6\") " Oct 09 14:18:50 crc kubenswrapper[4902]: I1009 14:18:50.085140 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9dd56ad2-b36d-4850-81d2-b06db395ecd6-inventory\") pod \"9dd56ad2-b36d-4850-81d2-b06db395ecd6\" (UID: \"9dd56ad2-b36d-4850-81d2-b06db395ecd6\") " Oct 09 14:18:50 crc kubenswrapper[4902]: I1009 14:18:50.093784 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9dd56ad2-b36d-4850-81d2-b06db395ecd6-kube-api-access-qpzt2" (OuterVolumeSpecName: "kube-api-access-qpzt2") pod "9dd56ad2-b36d-4850-81d2-b06db395ecd6" (UID: "9dd56ad2-b36d-4850-81d2-b06db395ecd6"). InnerVolumeSpecName "kube-api-access-qpzt2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 14:18:50 crc kubenswrapper[4902]: I1009 14:18:50.114794 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9dd56ad2-b36d-4850-81d2-b06db395ecd6-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "9dd56ad2-b36d-4850-81d2-b06db395ecd6" (UID: "9dd56ad2-b36d-4850-81d2-b06db395ecd6"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:18:50 crc kubenswrapper[4902]: I1009 14:18:50.115284 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9dd56ad2-b36d-4850-81d2-b06db395ecd6-inventory" (OuterVolumeSpecName: "inventory") pod "9dd56ad2-b36d-4850-81d2-b06db395ecd6" (UID: "9dd56ad2-b36d-4850-81d2-b06db395ecd6"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:18:50 crc kubenswrapper[4902]: I1009 14:18:50.187565 4902 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9dd56ad2-b36d-4850-81d2-b06db395ecd6-inventory\") on node \"crc\" DevicePath \"\"" Oct 09 14:18:50 crc kubenswrapper[4902]: I1009 14:18:50.187611 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qpzt2\" (UniqueName: \"kubernetes.io/projected/9dd56ad2-b36d-4850-81d2-b06db395ecd6-kube-api-access-qpzt2\") on node \"crc\" DevicePath \"\"" Oct 09 14:18:50 crc kubenswrapper[4902]: I1009 14:18:50.187621 4902 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9dd56ad2-b36d-4850-81d2-b06db395ecd6-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 09 14:18:50 crc kubenswrapper[4902]: I1009 14:18:50.605800 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-lsz7p" event={"ID":"9dd56ad2-b36d-4850-81d2-b06db395ecd6","Type":"ContainerDied","Data":"31186678f433f3d8de71f0753e54b2ff3f290de311f1631a7ce0cc6957ebd46e"} Oct 09 14:18:50 crc kubenswrapper[4902]: I1009 14:18:50.605852 4902 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="31186678f433f3d8de71f0753e54b2ff3f290de311f1631a7ce0cc6957ebd46e" Oct 09 14:18:50 crc kubenswrapper[4902]: I1009 14:18:50.605851 4902 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-lsz7p" Oct 09 14:18:50 crc kubenswrapper[4902]: I1009 14:18:50.686558 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-9bdh4"] Oct 09 14:18:50 crc kubenswrapper[4902]: E1009 14:18:50.687060 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9dd56ad2-b36d-4850-81d2-b06db395ecd6" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Oct 09 14:18:50 crc kubenswrapper[4902]: I1009 14:18:50.687085 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="9dd56ad2-b36d-4850-81d2-b06db395ecd6" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Oct 09 14:18:50 crc kubenswrapper[4902]: I1009 14:18:50.687372 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="9dd56ad2-b36d-4850-81d2-b06db395ecd6" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Oct 09 14:18:50 crc kubenswrapper[4902]: I1009 14:18:50.688175 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-9bdh4" Oct 09 14:18:50 crc kubenswrapper[4902]: I1009 14:18:50.690276 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-kjff2" Oct 09 14:18:50 crc kubenswrapper[4902]: I1009 14:18:50.693197 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 09 14:18:50 crc kubenswrapper[4902]: I1009 14:18:50.693221 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 09 14:18:50 crc kubenswrapper[4902]: I1009 14:18:50.693731 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 09 14:18:50 crc kubenswrapper[4902]: I1009 14:18:50.702055 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-9bdh4"] Oct 09 14:18:50 crc kubenswrapper[4902]: I1009 14:18:50.799989 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5j98q\" (UniqueName: \"kubernetes.io/projected/7b94bcc5-6ba7-44a6-aca8-ffaa0b52e89e-kube-api-access-5j98q\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-9bdh4\" (UID: \"7b94bcc5-6ba7-44a6-aca8-ffaa0b52e89e\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-9bdh4" Oct 09 14:18:50 crc kubenswrapper[4902]: I1009 14:18:50.800064 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7b94bcc5-6ba7-44a6-aca8-ffaa0b52e89e-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-9bdh4\" (UID: \"7b94bcc5-6ba7-44a6-aca8-ffaa0b52e89e\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-9bdh4" Oct 09 14:18:50 crc kubenswrapper[4902]: I1009 14:18:50.800107 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7b94bcc5-6ba7-44a6-aca8-ffaa0b52e89e-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-9bdh4\" (UID: \"7b94bcc5-6ba7-44a6-aca8-ffaa0b52e89e\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-9bdh4" Oct 09 14:18:50 crc kubenswrapper[4902]: I1009 14:18:50.902454 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5j98q\" (UniqueName: \"kubernetes.io/projected/7b94bcc5-6ba7-44a6-aca8-ffaa0b52e89e-kube-api-access-5j98q\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-9bdh4\" (UID: \"7b94bcc5-6ba7-44a6-aca8-ffaa0b52e89e\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-9bdh4" Oct 09 14:18:50 crc kubenswrapper[4902]: I1009 14:18:50.902542 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7b94bcc5-6ba7-44a6-aca8-ffaa0b52e89e-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-9bdh4\" (UID: \"7b94bcc5-6ba7-44a6-aca8-ffaa0b52e89e\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-9bdh4" Oct 09 14:18:50 crc kubenswrapper[4902]: I1009 14:18:50.902594 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7b94bcc5-6ba7-44a6-aca8-ffaa0b52e89e-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-9bdh4\" 
(UID: \"7b94bcc5-6ba7-44a6-aca8-ffaa0b52e89e\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-9bdh4" Oct 09 14:18:50 crc kubenswrapper[4902]: I1009 14:18:50.906386 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7b94bcc5-6ba7-44a6-aca8-ffaa0b52e89e-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-9bdh4\" (UID: \"7b94bcc5-6ba7-44a6-aca8-ffaa0b52e89e\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-9bdh4" Oct 09 14:18:50 crc kubenswrapper[4902]: I1009 14:18:50.906445 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7b94bcc5-6ba7-44a6-aca8-ffaa0b52e89e-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-9bdh4\" (UID: \"7b94bcc5-6ba7-44a6-aca8-ffaa0b52e89e\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-9bdh4" Oct 09 14:18:50 crc kubenswrapper[4902]: I1009 14:18:50.921219 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5j98q\" (UniqueName: \"kubernetes.io/projected/7b94bcc5-6ba7-44a6-aca8-ffaa0b52e89e-kube-api-access-5j98q\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-9bdh4\" (UID: \"7b94bcc5-6ba7-44a6-aca8-ffaa0b52e89e\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-9bdh4" Oct 09 14:18:51 crc kubenswrapper[4902]: I1009 14:18:51.018338 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-9bdh4" Oct 09 14:18:51 crc kubenswrapper[4902]: I1009 14:18:51.526138 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-9bdh4"] Oct 09 14:18:51 crc kubenswrapper[4902]: I1009 14:18:51.619225 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-9bdh4" event={"ID":"7b94bcc5-6ba7-44a6-aca8-ffaa0b52e89e","Type":"ContainerStarted","Data":"d47ff5a69bdaf7d641d98852ce3a434fb97a7a74c0be8a2dcaffab1275dc513b"} Oct 09 14:18:52 crc kubenswrapper[4902]: I1009 14:18:52.628524 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-9bdh4" event={"ID":"7b94bcc5-6ba7-44a6-aca8-ffaa0b52e89e","Type":"ContainerStarted","Data":"984a299f496c455d27862106b4fd40b55785e18be551e666b96f9a1bdedbf092"} Oct 09 14:19:02 crc kubenswrapper[4902]: I1009 14:19:02.513521 4902 scope.go:117] "RemoveContainer" containerID="03a73372b10bf2278335edb0134be424dfca37b6e17ab0c52bfb42dc57295c45" Oct 09 14:19:02 crc kubenswrapper[4902]: E1009 14:19:02.515402 4902 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gbt7s_openshift-machine-config-operator(6cfbac91-e798-4e5e-9f3c-f454ea6f457e)\"" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" podUID="6cfbac91-e798-4e5e-9f3c-f454ea6f457e" Oct 09 14:19:04 crc kubenswrapper[4902]: I1009 14:19:04.167340 4902 scope.go:117] "RemoveContainer" containerID="aac8618970479222f2878823905a1d2d396933c89907b7f32ea6296a3c8232a4" Oct 09 14:19:04 crc kubenswrapper[4902]: I1009 14:19:04.217200 4902 scope.go:117] "RemoveContainer" containerID="4611e4a7d63368eb1993e398373d1774eaf3281d47466a275e578c08d76a0f05" Oct 09 14:19:04 crc 
kubenswrapper[4902]: I1009 14:19:04.265715 4902 scope.go:117] "RemoveContainer" containerID="ab0c57d28e288cef2fb78a7e8287cd91048ac50617bc07e6add91f401c5e73cb" Oct 09 14:19:14 crc kubenswrapper[4902]: I1009 14:19:14.512650 4902 scope.go:117] "RemoveContainer" containerID="03a73372b10bf2278335edb0134be424dfca37b6e17ab0c52bfb42dc57295c45" Oct 09 14:19:14 crc kubenswrapper[4902]: E1009 14:19:14.513437 4902 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gbt7s_openshift-machine-config-operator(6cfbac91-e798-4e5e-9f3c-f454ea6f457e)\"" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" podUID="6cfbac91-e798-4e5e-9f3c-f454ea6f457e" Oct 09 14:19:23 crc kubenswrapper[4902]: I1009 14:19:23.047460 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-9bdh4" podStartSLOduration=32.447503327 podStartE2EDuration="33.047398411s" podCreationTimestamp="2025-10-09 14:18:50 +0000 UTC" firstStartedPulling="2025-10-09 14:18:51.538127057 +0000 UTC m=+1678.735986121" lastFinishedPulling="2025-10-09 14:18:52.138022131 +0000 UTC m=+1679.335881205" observedRunningTime="2025-10-09 14:18:52.647757598 +0000 UTC m=+1679.845616662" watchObservedRunningTime="2025-10-09 14:19:23.047398411 +0000 UTC m=+1710.245257495" Oct 09 14:19:23 crc kubenswrapper[4902]: I1009 14:19:23.049374 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-gn99q"] Oct 09 14:19:23 crc kubenswrapper[4902]: I1009 14:19:23.060849 4902 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-gn99q"] Oct 09 14:19:23 crc kubenswrapper[4902]: I1009 14:19:23.526260 4902 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ec866e85-92a2-4934-89a1-c44f1dcf2f52" path="/var/lib/kubelet/pods/ec866e85-92a2-4934-89a1-c44f1dcf2f52/volumes" Oct 09 14:19:29 crc kubenswrapper[4902]: I1009 14:19:29.513081 4902 scope.go:117] "RemoveContainer" containerID="03a73372b10bf2278335edb0134be424dfca37b6e17ab0c52bfb42dc57295c45" Oct 09 14:19:29 crc kubenswrapper[4902]: E1009 14:19:29.513780 4902 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gbt7s_openshift-machine-config-operator(6cfbac91-e798-4e5e-9f3c-f454ea6f457e)\"" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" podUID="6cfbac91-e798-4e5e-9f3c-f454ea6f457e" Oct 09 14:19:43 crc kubenswrapper[4902]: I1009 14:19:43.525163 4902 scope.go:117] "RemoveContainer" containerID="03a73372b10bf2278335edb0134be424dfca37b6e17ab0c52bfb42dc57295c45" Oct 09 14:19:43 crc kubenswrapper[4902]: E1009 14:19:43.525797 4902 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gbt7s_openshift-machine-config-operator(6cfbac91-e798-4e5e-9f3c-f454ea6f457e)\"" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" podUID="6cfbac91-e798-4e5e-9f3c-f454ea6f457e" Oct 09 14:19:48 crc kubenswrapper[4902]: I1009 14:19:48.104929 4902 generic.go:334] "Generic (PLEG): container finished" 
podID="7b94bcc5-6ba7-44a6-aca8-ffaa0b52e89e" containerID="984a299f496c455d27862106b4fd40b55785e18be551e666b96f9a1bdedbf092" exitCode=2 Oct 09 14:19:48 crc kubenswrapper[4902]: I1009 14:19:48.105023 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-9bdh4" event={"ID":"7b94bcc5-6ba7-44a6-aca8-ffaa0b52e89e","Type":"ContainerDied","Data":"984a299f496c455d27862106b4fd40b55785e18be551e666b96f9a1bdedbf092"} Oct 09 14:19:49 crc kubenswrapper[4902]: I1009 14:19:49.559573 4902 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-9bdh4" Oct 09 14:19:49 crc kubenswrapper[4902]: I1009 14:19:49.677378 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7b94bcc5-6ba7-44a6-aca8-ffaa0b52e89e-ssh-key\") pod \"7b94bcc5-6ba7-44a6-aca8-ffaa0b52e89e\" (UID: \"7b94bcc5-6ba7-44a6-aca8-ffaa0b52e89e\") " Oct 09 14:19:49 crc kubenswrapper[4902]: I1009 14:19:49.677756 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7b94bcc5-6ba7-44a6-aca8-ffaa0b52e89e-inventory\") pod \"7b94bcc5-6ba7-44a6-aca8-ffaa0b52e89e\" (UID: \"7b94bcc5-6ba7-44a6-aca8-ffaa0b52e89e\") " Oct 09 14:19:49 crc kubenswrapper[4902]: I1009 14:19:49.677806 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5j98q\" (UniqueName: \"kubernetes.io/projected/7b94bcc5-6ba7-44a6-aca8-ffaa0b52e89e-kube-api-access-5j98q\") pod \"7b94bcc5-6ba7-44a6-aca8-ffaa0b52e89e\" (UID: \"7b94bcc5-6ba7-44a6-aca8-ffaa0b52e89e\") " Oct 09 14:19:49 crc kubenswrapper[4902]: I1009 14:19:49.704046 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b94bcc5-6ba7-44a6-aca8-ffaa0b52e89e-kube-api-access-5j98q" (OuterVolumeSpecName: "kube-api-access-5j98q") pod "7b94bcc5-6ba7-44a6-aca8-ffaa0b52e89e" (UID: "7b94bcc5-6ba7-44a6-aca8-ffaa0b52e89e"). InnerVolumeSpecName "kube-api-access-5j98q". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 14:19:49 crc kubenswrapper[4902]: I1009 14:19:49.715769 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b94bcc5-6ba7-44a6-aca8-ffaa0b52e89e-inventory" (OuterVolumeSpecName: "inventory") pod "7b94bcc5-6ba7-44a6-aca8-ffaa0b52e89e" (UID: "7b94bcc5-6ba7-44a6-aca8-ffaa0b52e89e"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:19:49 crc kubenswrapper[4902]: I1009 14:19:49.722331 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b94bcc5-6ba7-44a6-aca8-ffaa0b52e89e-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "7b94bcc5-6ba7-44a6-aca8-ffaa0b52e89e" (UID: "7b94bcc5-6ba7-44a6-aca8-ffaa0b52e89e"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:19:49 crc kubenswrapper[4902]: I1009 14:19:49.780454 4902 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7b94bcc5-6ba7-44a6-aca8-ffaa0b52e89e-inventory\") on node \"crc\" DevicePath \"\"" Oct 09 14:19:49 crc kubenswrapper[4902]: I1009 14:19:49.780491 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5j98q\" (UniqueName: \"kubernetes.io/projected/7b94bcc5-6ba7-44a6-aca8-ffaa0b52e89e-kube-api-access-5j98q\") on node \"crc\" DevicePath \"\"" Oct 09 14:19:49 crc kubenswrapper[4902]: I1009 14:19:49.780502 4902 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7b94bcc5-6ba7-44a6-aca8-ffaa0b52e89e-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 09 14:19:50 crc kubenswrapper[4902]: I1009 14:19:50.123474 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-9bdh4" event={"ID":"7b94bcc5-6ba7-44a6-aca8-ffaa0b52e89e","Type":"ContainerDied","Data":"d47ff5a69bdaf7d641d98852ce3a434fb97a7a74c0be8a2dcaffab1275dc513b"} Oct 09 14:19:50 crc kubenswrapper[4902]: I1009 14:19:50.123522 4902 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d47ff5a69bdaf7d641d98852ce3a434fb97a7a74c0be8a2dcaffab1275dc513b" Oct 09 14:19:50 crc kubenswrapper[4902]: I1009 14:19:50.123584 4902 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-9bdh4" Oct 09 14:19:57 crc kubenswrapper[4902]: I1009 14:19:57.027689 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-f9f9h"] Oct 09 14:19:57 crc kubenswrapper[4902]: E1009 14:19:57.030141 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b94bcc5-6ba7-44a6-aca8-ffaa0b52e89e" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 09 14:19:57 crc kubenswrapper[4902]: I1009 14:19:57.030306 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b94bcc5-6ba7-44a6-aca8-ffaa0b52e89e" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 09 14:19:57 crc kubenswrapper[4902]: I1009 14:19:57.031024 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b94bcc5-6ba7-44a6-aca8-ffaa0b52e89e" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 09 14:19:57 crc kubenswrapper[4902]: I1009 14:19:57.031816 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-f9f9h" Oct 09 14:19:57 crc kubenswrapper[4902]: I1009 14:19:57.034127 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 09 14:19:57 crc kubenswrapper[4902]: I1009 14:19:57.034253 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 09 14:19:57 crc kubenswrapper[4902]: I1009 14:19:57.035008 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 09 14:19:57 crc kubenswrapper[4902]: I1009 14:19:57.036173 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-kjff2" Oct 09 14:19:57 crc kubenswrapper[4902]: I1009 14:19:57.037906 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-f9f9h"] Oct 09 14:19:57 crc kubenswrapper[4902]: I1009 14:19:57.116641 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rtc5w\" (UniqueName: \"kubernetes.io/projected/77b43a36-9858-4efc-aa2b-a56278710389-kube-api-access-rtc5w\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-f9f9h\" (UID: \"77b43a36-9858-4efc-aa2b-a56278710389\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-f9f9h" Oct 09 14:19:57 crc kubenswrapper[4902]: I1009 14:19:57.116691 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/77b43a36-9858-4efc-aa2b-a56278710389-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-f9f9h\" (UID: \"77b43a36-9858-4efc-aa2b-a56278710389\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-f9f9h" Oct 09 14:19:57 crc kubenswrapper[4902]: I1009 14:19:57.117139 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/77b43a36-9858-4efc-aa2b-a56278710389-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-f9f9h\" (UID: \"77b43a36-9858-4efc-aa2b-a56278710389\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-f9f9h" Oct 09 14:19:57 crc kubenswrapper[4902]: I1009 14:19:57.218964 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/77b43a36-9858-4efc-aa2b-a56278710389-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-f9f9h\" (UID: \"77b43a36-9858-4efc-aa2b-a56278710389\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-f9f9h" Oct 09 14:19:57 crc kubenswrapper[4902]: I1009 14:19:57.219457 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rtc5w\" (UniqueName: \"kubernetes.io/projected/77b43a36-9858-4efc-aa2b-a56278710389-kube-api-access-rtc5w\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-f9f9h\" (UID: \"77b43a36-9858-4efc-aa2b-a56278710389\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-f9f9h" Oct 09 14:19:57 crc kubenswrapper[4902]: I1009 14:19:57.219493 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/77b43a36-9858-4efc-aa2b-a56278710389-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-f9f9h\" 
(UID: \"77b43a36-9858-4efc-aa2b-a56278710389\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-f9f9h" Oct 09 14:19:57 crc kubenswrapper[4902]: I1009 14:19:57.225577 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/77b43a36-9858-4efc-aa2b-a56278710389-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-f9f9h\" (UID: \"77b43a36-9858-4efc-aa2b-a56278710389\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-f9f9h" Oct 09 14:19:57 crc kubenswrapper[4902]: I1009 14:19:57.225787 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/77b43a36-9858-4efc-aa2b-a56278710389-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-f9f9h\" (UID: \"77b43a36-9858-4efc-aa2b-a56278710389\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-f9f9h" Oct 09 14:19:57 crc kubenswrapper[4902]: I1009 14:19:57.236174 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rtc5w\" (UniqueName: \"kubernetes.io/projected/77b43a36-9858-4efc-aa2b-a56278710389-kube-api-access-rtc5w\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-f9f9h\" (UID: \"77b43a36-9858-4efc-aa2b-a56278710389\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-f9f9h" Oct 09 14:19:57 crc kubenswrapper[4902]: I1009 14:19:57.354151 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-f9f9h" Oct 09 14:19:57 crc kubenswrapper[4902]: I1009 14:19:57.513048 4902 scope.go:117] "RemoveContainer" containerID="03a73372b10bf2278335edb0134be424dfca37b6e17ab0c52bfb42dc57295c45" Oct 09 14:19:57 crc kubenswrapper[4902]: E1009 14:19:57.513526 4902 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gbt7s_openshift-machine-config-operator(6cfbac91-e798-4e5e-9f3c-f454ea6f457e)\"" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" podUID="6cfbac91-e798-4e5e-9f3c-f454ea6f457e" Oct 09 14:19:57 crc kubenswrapper[4902]: I1009 14:19:57.878131 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-f9f9h"] Oct 09 14:19:58 crc kubenswrapper[4902]: I1009 14:19:58.185979 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-f9f9h" event={"ID":"77b43a36-9858-4efc-aa2b-a56278710389","Type":"ContainerStarted","Data":"00681937f538aed215043ea9ea3f0ddd5cc237a067ca949600a9ae6079e0cb29"} Oct 09 14:19:59 crc kubenswrapper[4902]: I1009 14:19:59.198332 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-f9f9h" event={"ID":"77b43a36-9858-4efc-aa2b-a56278710389","Type":"ContainerStarted","Data":"20488670451c000659d03dbfdc959f75d9a3bbda2f432f06b601e7fca2b3f458"} Oct 09 14:19:59 crc kubenswrapper[4902]: I1009 14:19:59.228096 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-f9f9h" podStartSLOduration=1.76767284 podStartE2EDuration="2.228074188s" podCreationTimestamp="2025-10-09 14:19:57 +0000 UTC" firstStartedPulling="2025-10-09 14:19:57.884405095 +0000 UTC 
m=+1745.082264159" lastFinishedPulling="2025-10-09 14:19:58.344806443 +0000 UTC m=+1745.542665507" observedRunningTime="2025-10-09 14:19:59.21862196 +0000 UTC m=+1746.416481044" watchObservedRunningTime="2025-10-09 14:19:59.228074188 +0000 UTC m=+1746.425933252" Oct 09 14:20:04 crc kubenswrapper[4902]: I1009 14:20:04.376633 4902 scope.go:117] "RemoveContainer" containerID="c282e06076c15d994c5781dcec81200243f7a4c9430b5342b27458e69a167e60" Oct 09 14:20:11 crc kubenswrapper[4902]: I1009 14:20:11.512675 4902 scope.go:117] "RemoveContainer" containerID="03a73372b10bf2278335edb0134be424dfca37b6e17ab0c52bfb42dc57295c45" Oct 09 14:20:11 crc kubenswrapper[4902]: E1009 14:20:11.513363 4902 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gbt7s_openshift-machine-config-operator(6cfbac91-e798-4e5e-9f3c-f454ea6f457e)\"" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" podUID="6cfbac91-e798-4e5e-9f3c-f454ea6f457e" Oct 09 14:20:26 crc kubenswrapper[4902]: I1009 14:20:26.512734 4902 scope.go:117] "RemoveContainer" containerID="03a73372b10bf2278335edb0134be424dfca37b6e17ab0c52bfb42dc57295c45" Oct 09 14:20:26 crc kubenswrapper[4902]: E1009 14:20:26.514503 4902 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gbt7s_openshift-machine-config-operator(6cfbac91-e798-4e5e-9f3c-f454ea6f457e)\"" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" podUID="6cfbac91-e798-4e5e-9f3c-f454ea6f457e" Oct 09 14:20:39 crc kubenswrapper[4902]: I1009 14:20:39.514051 4902 scope.go:117] "RemoveContainer" containerID="03a73372b10bf2278335edb0134be424dfca37b6e17ab0c52bfb42dc57295c45" Oct 09 14:20:39 crc kubenswrapper[4902]: E1009 14:20:39.514897 4902 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gbt7s_openshift-machine-config-operator(6cfbac91-e798-4e5e-9f3c-f454ea6f457e)\"" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" podUID="6cfbac91-e798-4e5e-9f3c-f454ea6f457e" Oct 09 14:20:43 crc kubenswrapper[4902]: I1009 14:20:43.598776 4902 generic.go:334] "Generic (PLEG): container finished" podID="77b43a36-9858-4efc-aa2b-a56278710389" containerID="20488670451c000659d03dbfdc959f75d9a3bbda2f432f06b601e7fca2b3f458" exitCode=0 Oct 09 14:20:43 crc kubenswrapper[4902]: I1009 14:20:43.598895 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-f9f9h" event={"ID":"77b43a36-9858-4efc-aa2b-a56278710389","Type":"ContainerDied","Data":"20488670451c000659d03dbfdc959f75d9a3bbda2f432f06b601e7fca2b3f458"} Oct 09 14:20:45 crc kubenswrapper[4902]: I1009 14:20:45.048293 4902 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-f9f9h" Oct 09 14:20:45 crc kubenswrapper[4902]: I1009 14:20:45.159674 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/77b43a36-9858-4efc-aa2b-a56278710389-ssh-key\") pod \"77b43a36-9858-4efc-aa2b-a56278710389\" (UID: \"77b43a36-9858-4efc-aa2b-a56278710389\") " Oct 09 14:20:45 crc kubenswrapper[4902]: I1009 14:20:45.159893 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/77b43a36-9858-4efc-aa2b-a56278710389-inventory\") pod \"77b43a36-9858-4efc-aa2b-a56278710389\" (UID: \"77b43a36-9858-4efc-aa2b-a56278710389\") " Oct 09 14:20:45 crc kubenswrapper[4902]: I1009 14:20:45.159927 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rtc5w\" (UniqueName: \"kubernetes.io/projected/77b43a36-9858-4efc-aa2b-a56278710389-kube-api-access-rtc5w\") pod \"77b43a36-9858-4efc-aa2b-a56278710389\" (UID: \"77b43a36-9858-4efc-aa2b-a56278710389\") " Oct 09 14:20:45 crc kubenswrapper[4902]: I1009 14:20:45.167374 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77b43a36-9858-4efc-aa2b-a56278710389-kube-api-access-rtc5w" (OuterVolumeSpecName: "kube-api-access-rtc5w") pod "77b43a36-9858-4efc-aa2b-a56278710389" (UID: "77b43a36-9858-4efc-aa2b-a56278710389"). InnerVolumeSpecName "kube-api-access-rtc5w". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 14:20:45 crc kubenswrapper[4902]: I1009 14:20:45.190682 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77b43a36-9858-4efc-aa2b-a56278710389-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "77b43a36-9858-4efc-aa2b-a56278710389" (UID: "77b43a36-9858-4efc-aa2b-a56278710389"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:20:45 crc kubenswrapper[4902]: I1009 14:20:45.191793 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77b43a36-9858-4efc-aa2b-a56278710389-inventory" (OuterVolumeSpecName: "inventory") pod "77b43a36-9858-4efc-aa2b-a56278710389" (UID: "77b43a36-9858-4efc-aa2b-a56278710389"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:20:45 crc kubenswrapper[4902]: I1009 14:20:45.262071 4902 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/77b43a36-9858-4efc-aa2b-a56278710389-inventory\") on node \"crc\" DevicePath \"\"" Oct 09 14:20:45 crc kubenswrapper[4902]: I1009 14:20:45.262122 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rtc5w\" (UniqueName: \"kubernetes.io/projected/77b43a36-9858-4efc-aa2b-a56278710389-kube-api-access-rtc5w\") on node \"crc\" DevicePath \"\"" Oct 09 14:20:45 crc kubenswrapper[4902]: I1009 14:20:45.262138 4902 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/77b43a36-9858-4efc-aa2b-a56278710389-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 09 14:20:45 crc kubenswrapper[4902]: I1009 14:20:45.616451 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-f9f9h" event={"ID":"77b43a36-9858-4efc-aa2b-a56278710389","Type":"ContainerDied","Data":"00681937f538aed215043ea9ea3f0ddd5cc237a067ca949600a9ae6079e0cb29"} Oct 09 14:20:45 crc kubenswrapper[4902]: I1009 14:20:45.616503 4902 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="00681937f538aed215043ea9ea3f0ddd5cc237a067ca949600a9ae6079e0cb29" Oct 09 14:20:45 crc kubenswrapper[4902]: I1009 14:20:45.616533 4902 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-f9f9h" Oct 09 14:20:45 crc kubenswrapper[4902]: I1009 14:20:45.718005 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-q5kpf"] Oct 09 14:20:45 crc kubenswrapper[4902]: E1009 14:20:45.718799 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77b43a36-9858-4efc-aa2b-a56278710389" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 09 14:20:45 crc kubenswrapper[4902]: I1009 14:20:45.718822 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="77b43a36-9858-4efc-aa2b-a56278710389" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 09 14:20:45 crc kubenswrapper[4902]: I1009 14:20:45.719106 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="77b43a36-9858-4efc-aa2b-a56278710389" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 09 14:20:45 crc kubenswrapper[4902]: I1009 14:20:45.719785 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-q5kpf" Oct 09 14:20:45 crc kubenswrapper[4902]: I1009 14:20:45.721626 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 09 14:20:45 crc kubenswrapper[4902]: I1009 14:20:45.722007 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 09 14:20:45 crc kubenswrapper[4902]: I1009 14:20:45.723850 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 09 14:20:45 crc kubenswrapper[4902]: I1009 14:20:45.735239 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-kjff2" Oct 09 14:20:45 crc kubenswrapper[4902]: I1009 14:20:45.776355 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-q5kpf"] Oct 09 14:20:45 crc kubenswrapper[4902]: I1009 14:20:45.875421 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/582550a8-6e34-4a07-97af-70e3770fedcd-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-q5kpf\" (UID: \"582550a8-6e34-4a07-97af-70e3770fedcd\") " pod="openstack/ssh-known-hosts-edpm-deployment-q5kpf" Oct 09 14:20:45 crc kubenswrapper[4902]: I1009 14:20:45.875500 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/582550a8-6e34-4a07-97af-70e3770fedcd-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-q5kpf\" (UID: \"582550a8-6e34-4a07-97af-70e3770fedcd\") " pod="openstack/ssh-known-hosts-edpm-deployment-q5kpf" Oct 09 14:20:45 crc kubenswrapper[4902]: I1009 14:20:45.875781 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s4scd\" (UniqueName: \"kubernetes.io/projected/582550a8-6e34-4a07-97af-70e3770fedcd-kube-api-access-s4scd\") pod \"ssh-known-hosts-edpm-deployment-q5kpf\" (UID: \"582550a8-6e34-4a07-97af-70e3770fedcd\") " pod="openstack/ssh-known-hosts-edpm-deployment-q5kpf" Oct 09 14:20:45 crc kubenswrapper[4902]: I1009 14:20:45.978482 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s4scd\" (UniqueName: \"kubernetes.io/projected/582550a8-6e34-4a07-97af-70e3770fedcd-kube-api-access-s4scd\") pod \"ssh-known-hosts-edpm-deployment-q5kpf\" (UID: \"582550a8-6e34-4a07-97af-70e3770fedcd\") " pod="openstack/ssh-known-hosts-edpm-deployment-q5kpf" Oct 09 14:20:45 crc kubenswrapper[4902]: I1009 14:20:45.978667 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/582550a8-6e34-4a07-97af-70e3770fedcd-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-q5kpf\" (UID: \"582550a8-6e34-4a07-97af-70e3770fedcd\") " pod="openstack/ssh-known-hosts-edpm-deployment-q5kpf" Oct 09 14:20:45 crc kubenswrapper[4902]: I1009 14:20:45.978716 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/582550a8-6e34-4a07-97af-70e3770fedcd-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-q5kpf\" (UID: \"582550a8-6e34-4a07-97af-70e3770fedcd\") " pod="openstack/ssh-known-hosts-edpm-deployment-q5kpf" Oct 09 14:20:45 crc 
kubenswrapper[4902]: I1009 14:20:45.982656 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/582550a8-6e34-4a07-97af-70e3770fedcd-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-q5kpf\" (UID: \"582550a8-6e34-4a07-97af-70e3770fedcd\") " pod="openstack/ssh-known-hosts-edpm-deployment-q5kpf" Oct 09 14:20:45 crc kubenswrapper[4902]: I1009 14:20:45.982853 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/582550a8-6e34-4a07-97af-70e3770fedcd-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-q5kpf\" (UID: \"582550a8-6e34-4a07-97af-70e3770fedcd\") " pod="openstack/ssh-known-hosts-edpm-deployment-q5kpf" Oct 09 14:20:45 crc kubenswrapper[4902]: I1009 14:20:45.994349 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s4scd\" (UniqueName: \"kubernetes.io/projected/582550a8-6e34-4a07-97af-70e3770fedcd-kube-api-access-s4scd\") pod \"ssh-known-hosts-edpm-deployment-q5kpf\" (UID: \"582550a8-6e34-4a07-97af-70e3770fedcd\") " pod="openstack/ssh-known-hosts-edpm-deployment-q5kpf" Oct 09 14:20:46 crc kubenswrapper[4902]: I1009 14:20:46.035764 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-q5kpf" Oct 09 14:20:46 crc kubenswrapper[4902]: I1009 14:20:46.527172 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-q5kpf"] Oct 09 14:20:46 crc kubenswrapper[4902]: I1009 14:20:46.630284 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-q5kpf" event={"ID":"582550a8-6e34-4a07-97af-70e3770fedcd","Type":"ContainerStarted","Data":"c90af9ee265508b48d0d033578ccfd12561f52d01df6694d58f3d193d3cf56ab"} Oct 09 14:20:47 crc kubenswrapper[4902]: I1009 14:20:47.641569 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-q5kpf" event={"ID":"582550a8-6e34-4a07-97af-70e3770fedcd","Type":"ContainerStarted","Data":"cae5731ae24764f35bdf73df37006abb35068e919eb371af9f756b37e04511c3"} Oct 09 14:20:47 crc kubenswrapper[4902]: I1009 14:20:47.662082 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-q5kpf" podStartSLOduration=2.20400003 podStartE2EDuration="2.662064825s" podCreationTimestamp="2025-10-09 14:20:45 +0000 UTC" firstStartedPulling="2025-10-09 14:20:46.534719486 +0000 UTC m=+1793.732578550" lastFinishedPulling="2025-10-09 14:20:46.992784281 +0000 UTC m=+1794.190643345" observedRunningTime="2025-10-09 14:20:47.654243579 +0000 UTC m=+1794.852102643" watchObservedRunningTime="2025-10-09 14:20:47.662064825 +0000 UTC m=+1794.859923889" Oct 09 14:20:51 crc kubenswrapper[4902]: I1009 14:20:51.513543 4902 scope.go:117] "RemoveContainer" containerID="03a73372b10bf2278335edb0134be424dfca37b6e17ab0c52bfb42dc57295c45" Oct 09 14:20:51 crc kubenswrapper[4902]: E1009 14:20:51.514672 4902 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gbt7s_openshift-machine-config-operator(6cfbac91-e798-4e5e-9f3c-f454ea6f457e)\"" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" podUID="6cfbac91-e798-4e5e-9f3c-f454ea6f457e" Oct 09 14:20:54 crc 
kubenswrapper[4902]: I1009 14:20:54.697172 4902 generic.go:334] "Generic (PLEG): container finished" podID="582550a8-6e34-4a07-97af-70e3770fedcd" containerID="cae5731ae24764f35bdf73df37006abb35068e919eb371af9f756b37e04511c3" exitCode=0 Oct 09 14:20:54 crc kubenswrapper[4902]: I1009 14:20:54.697256 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-q5kpf" event={"ID":"582550a8-6e34-4a07-97af-70e3770fedcd","Type":"ContainerDied","Data":"cae5731ae24764f35bdf73df37006abb35068e919eb371af9f756b37e04511c3"} Oct 09 14:20:56 crc kubenswrapper[4902]: I1009 14:20:56.098939 4902 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-q5kpf" Oct 09 14:20:56 crc kubenswrapper[4902]: I1009 14:20:56.187857 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4scd\" (UniqueName: \"kubernetes.io/projected/582550a8-6e34-4a07-97af-70e3770fedcd-kube-api-access-s4scd\") pod \"582550a8-6e34-4a07-97af-70e3770fedcd\" (UID: \"582550a8-6e34-4a07-97af-70e3770fedcd\") " Oct 09 14:20:56 crc kubenswrapper[4902]: I1009 14:20:56.187944 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/582550a8-6e34-4a07-97af-70e3770fedcd-ssh-key-openstack-edpm-ipam\") pod \"582550a8-6e34-4a07-97af-70e3770fedcd\" (UID: \"582550a8-6e34-4a07-97af-70e3770fedcd\") " Oct 09 14:20:56 crc kubenswrapper[4902]: I1009 14:20:56.188054 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/582550a8-6e34-4a07-97af-70e3770fedcd-inventory-0\") pod \"582550a8-6e34-4a07-97af-70e3770fedcd\" (UID: \"582550a8-6e34-4a07-97af-70e3770fedcd\") " Oct 09 14:20:56 crc kubenswrapper[4902]: I1009 14:20:56.194962 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/582550a8-6e34-4a07-97af-70e3770fedcd-kube-api-access-s4scd" (OuterVolumeSpecName: "kube-api-access-s4scd") pod "582550a8-6e34-4a07-97af-70e3770fedcd" (UID: "582550a8-6e34-4a07-97af-70e3770fedcd"). InnerVolumeSpecName "kube-api-access-s4scd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 14:20:56 crc kubenswrapper[4902]: I1009 14:20:56.220203 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/582550a8-6e34-4a07-97af-70e3770fedcd-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "582550a8-6e34-4a07-97af-70e3770fedcd" (UID: "582550a8-6e34-4a07-97af-70e3770fedcd"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:20:56 crc kubenswrapper[4902]: I1009 14:20:56.220804 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/582550a8-6e34-4a07-97af-70e3770fedcd-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "582550a8-6e34-4a07-97af-70e3770fedcd" (UID: "582550a8-6e34-4a07-97af-70e3770fedcd"). InnerVolumeSpecName "inventory-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:20:56 crc kubenswrapper[4902]: I1009 14:20:56.290532 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4scd\" (UniqueName: \"kubernetes.io/projected/582550a8-6e34-4a07-97af-70e3770fedcd-kube-api-access-s4scd\") on node \"crc\" DevicePath \"\"" Oct 09 14:20:56 crc kubenswrapper[4902]: I1009 14:20:56.290569 4902 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/582550a8-6e34-4a07-97af-70e3770fedcd-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Oct 09 14:20:56 crc kubenswrapper[4902]: I1009 14:20:56.290581 4902 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/582550a8-6e34-4a07-97af-70e3770fedcd-inventory-0\") on node \"crc\" DevicePath \"\"" Oct 09 14:20:56 crc kubenswrapper[4902]: I1009 14:20:56.714317 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-q5kpf" event={"ID":"582550a8-6e34-4a07-97af-70e3770fedcd","Type":"ContainerDied","Data":"c90af9ee265508b48d0d033578ccfd12561f52d01df6694d58f3d193d3cf56ab"} Oct 09 14:20:56 crc kubenswrapper[4902]: I1009 14:20:56.714360 4902 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c90af9ee265508b48d0d033578ccfd12561f52d01df6694d58f3d193d3cf56ab" Oct 09 14:20:56 crc kubenswrapper[4902]: I1009 14:20:56.714427 4902 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-q5kpf" Oct 09 14:20:56 crc kubenswrapper[4902]: I1009 14:20:56.786063 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-dss8m"] Oct 09 14:20:56 crc kubenswrapper[4902]: E1009 14:20:56.786806 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="582550a8-6e34-4a07-97af-70e3770fedcd" containerName="ssh-known-hosts-edpm-deployment" Oct 09 14:20:56 crc kubenswrapper[4902]: I1009 14:20:56.786832 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="582550a8-6e34-4a07-97af-70e3770fedcd" containerName="ssh-known-hosts-edpm-deployment" Oct 09 14:20:56 crc kubenswrapper[4902]: I1009 14:20:56.787069 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="582550a8-6e34-4a07-97af-70e3770fedcd" containerName="ssh-known-hosts-edpm-deployment" Oct 09 14:20:56 crc kubenswrapper[4902]: I1009 14:20:56.787893 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-dss8m" Oct 09 14:20:56 crc kubenswrapper[4902]: I1009 14:20:56.790612 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-kjff2" Oct 09 14:20:56 crc kubenswrapper[4902]: I1009 14:20:56.790713 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 09 14:20:56 crc kubenswrapper[4902]: I1009 14:20:56.791114 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 09 14:20:56 crc kubenswrapper[4902]: I1009 14:20:56.791380 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 09 14:20:56 crc kubenswrapper[4902]: I1009 14:20:56.796893 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-dss8m"] Oct 09 14:20:56 crc kubenswrapper[4902]: I1009 14:20:56.902964 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/21ce0f21-ea6d-4bed-b68b-2573a40c5443-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-dss8m\" (UID: \"21ce0f21-ea6d-4bed-b68b-2573a40c5443\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-dss8m" Oct 09 14:20:56 crc kubenswrapper[4902]: I1009 14:20:56.903071 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/21ce0f21-ea6d-4bed-b68b-2573a40c5443-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-dss8m\" (UID: \"21ce0f21-ea6d-4bed-b68b-2573a40c5443\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-dss8m" Oct 09 14:20:56 crc kubenswrapper[4902]: I1009 14:20:56.903133 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h4mcz\" (UniqueName: \"kubernetes.io/projected/21ce0f21-ea6d-4bed-b68b-2573a40c5443-kube-api-access-h4mcz\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-dss8m\" (UID: \"21ce0f21-ea6d-4bed-b68b-2573a40c5443\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-dss8m" Oct 09 14:20:57 crc kubenswrapper[4902]: I1009 14:20:57.004905 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/21ce0f21-ea6d-4bed-b68b-2573a40c5443-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-dss8m\" (UID: \"21ce0f21-ea6d-4bed-b68b-2573a40c5443\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-dss8m" Oct 09 14:20:57 crc kubenswrapper[4902]: I1009 14:20:57.004985 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/21ce0f21-ea6d-4bed-b68b-2573a40c5443-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-dss8m\" (UID: \"21ce0f21-ea6d-4bed-b68b-2573a40c5443\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-dss8m" Oct 09 14:20:57 crc kubenswrapper[4902]: I1009 14:20:57.005037 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h4mcz\" (UniqueName: \"kubernetes.io/projected/21ce0f21-ea6d-4bed-b68b-2573a40c5443-kube-api-access-h4mcz\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-dss8m\" (UID: \"21ce0f21-ea6d-4bed-b68b-2573a40c5443\") " 
pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-dss8m" Oct 09 14:20:57 crc kubenswrapper[4902]: I1009 14:20:57.013162 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/21ce0f21-ea6d-4bed-b68b-2573a40c5443-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-dss8m\" (UID: \"21ce0f21-ea6d-4bed-b68b-2573a40c5443\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-dss8m" Oct 09 14:20:57 crc kubenswrapper[4902]: I1009 14:20:57.015445 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/21ce0f21-ea6d-4bed-b68b-2573a40c5443-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-dss8m\" (UID: \"21ce0f21-ea6d-4bed-b68b-2573a40c5443\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-dss8m" Oct 09 14:20:57 crc kubenswrapper[4902]: I1009 14:20:57.024401 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h4mcz\" (UniqueName: \"kubernetes.io/projected/21ce0f21-ea6d-4bed-b68b-2573a40c5443-kube-api-access-h4mcz\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-dss8m\" (UID: \"21ce0f21-ea6d-4bed-b68b-2573a40c5443\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-dss8m" Oct 09 14:20:57 crc kubenswrapper[4902]: I1009 14:20:57.105927 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-dss8m" Oct 09 14:20:57 crc kubenswrapper[4902]: I1009 14:20:57.610773 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-dss8m"] Oct 09 14:20:57 crc kubenswrapper[4902]: I1009 14:20:57.723992 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-dss8m" event={"ID":"21ce0f21-ea6d-4bed-b68b-2573a40c5443","Type":"ContainerStarted","Data":"c371b4a765219c46c6c87ac177500d154b11be06dc857d3380f3a9377726a75e"} Oct 09 14:20:58 crc kubenswrapper[4902]: I1009 14:20:58.733034 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-dss8m" event={"ID":"21ce0f21-ea6d-4bed-b68b-2573a40c5443","Type":"ContainerStarted","Data":"b4c41454a95b8bf44b609bc6682f0a953d79b89d39331719f27fff48b6d9dacc"} Oct 09 14:20:58 crc kubenswrapper[4902]: I1009 14:20:58.759054 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-dss8m" podStartSLOduration=2.140922343 podStartE2EDuration="2.759036306s" podCreationTimestamp="2025-10-09 14:20:56 +0000 UTC" firstStartedPulling="2025-10-09 14:20:57.625859085 +0000 UTC m=+1804.823718149" lastFinishedPulling="2025-10-09 14:20:58.243973048 +0000 UTC m=+1805.441832112" observedRunningTime="2025-10-09 14:20:58.750384707 +0000 UTC m=+1805.948243781" watchObservedRunningTime="2025-10-09 14:20:58.759036306 +0000 UTC m=+1805.956895370" Oct 09 14:21:04 crc kubenswrapper[4902]: I1009 14:21:04.512652 4902 scope.go:117] "RemoveContainer" containerID="03a73372b10bf2278335edb0134be424dfca37b6e17ab0c52bfb42dc57295c45" Oct 09 14:21:04 crc kubenswrapper[4902]: E1009 14:21:04.513612 4902 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-gbt7s_openshift-machine-config-operator(6cfbac91-e798-4e5e-9f3c-f454ea6f457e)\"" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" podUID="6cfbac91-e798-4e5e-9f3c-f454ea6f457e" Oct 09 14:21:06 crc kubenswrapper[4902]: I1009 14:21:06.801382 4902 generic.go:334] "Generic (PLEG): container finished" podID="21ce0f21-ea6d-4bed-b68b-2573a40c5443" containerID="b4c41454a95b8bf44b609bc6682f0a953d79b89d39331719f27fff48b6d9dacc" exitCode=0 Oct 09 14:21:06 crc kubenswrapper[4902]: I1009 14:21:06.801477 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-dss8m" event={"ID":"21ce0f21-ea6d-4bed-b68b-2573a40c5443","Type":"ContainerDied","Data":"b4c41454a95b8bf44b609bc6682f0a953d79b89d39331719f27fff48b6d9dacc"} Oct 09 14:21:08 crc kubenswrapper[4902]: I1009 14:21:08.286345 4902 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-dss8m" Oct 09 14:21:08 crc kubenswrapper[4902]: I1009 14:21:08.317278 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/21ce0f21-ea6d-4bed-b68b-2573a40c5443-ssh-key\") pod \"21ce0f21-ea6d-4bed-b68b-2573a40c5443\" (UID: \"21ce0f21-ea6d-4bed-b68b-2573a40c5443\") " Oct 09 14:21:08 crc kubenswrapper[4902]: I1009 14:21:08.317357 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/21ce0f21-ea6d-4bed-b68b-2573a40c5443-inventory\") pod \"21ce0f21-ea6d-4bed-b68b-2573a40c5443\" (UID: \"21ce0f21-ea6d-4bed-b68b-2573a40c5443\") " Oct 09 14:21:08 crc kubenswrapper[4902]: I1009 14:21:08.317407 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h4mcz\" (UniqueName: \"kubernetes.io/projected/21ce0f21-ea6d-4bed-b68b-2573a40c5443-kube-api-access-h4mcz\") pod \"21ce0f21-ea6d-4bed-b68b-2573a40c5443\" (UID: \"21ce0f21-ea6d-4bed-b68b-2573a40c5443\") " Oct 09 14:21:08 crc kubenswrapper[4902]: I1009 14:21:08.323754 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21ce0f21-ea6d-4bed-b68b-2573a40c5443-kube-api-access-h4mcz" (OuterVolumeSpecName: "kube-api-access-h4mcz") pod "21ce0f21-ea6d-4bed-b68b-2573a40c5443" (UID: "21ce0f21-ea6d-4bed-b68b-2573a40c5443"). InnerVolumeSpecName "kube-api-access-h4mcz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 14:21:08 crc kubenswrapper[4902]: I1009 14:21:08.356550 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21ce0f21-ea6d-4bed-b68b-2573a40c5443-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "21ce0f21-ea6d-4bed-b68b-2573a40c5443" (UID: "21ce0f21-ea6d-4bed-b68b-2573a40c5443"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:21:08 crc kubenswrapper[4902]: I1009 14:21:08.360032 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21ce0f21-ea6d-4bed-b68b-2573a40c5443-inventory" (OuterVolumeSpecName: "inventory") pod "21ce0f21-ea6d-4bed-b68b-2573a40c5443" (UID: "21ce0f21-ea6d-4bed-b68b-2573a40c5443"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:21:08 crc kubenswrapper[4902]: I1009 14:21:08.419674 4902 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/21ce0f21-ea6d-4bed-b68b-2573a40c5443-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 09 14:21:08 crc kubenswrapper[4902]: I1009 14:21:08.419708 4902 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/21ce0f21-ea6d-4bed-b68b-2573a40c5443-inventory\") on node \"crc\" DevicePath \"\"" Oct 09 14:21:08 crc kubenswrapper[4902]: I1009 14:21:08.419721 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h4mcz\" (UniqueName: \"kubernetes.io/projected/21ce0f21-ea6d-4bed-b68b-2573a40c5443-kube-api-access-h4mcz\") on node \"crc\" DevicePath \"\"" Oct 09 14:21:08 crc kubenswrapper[4902]: I1009 14:21:08.823062 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-dss8m" event={"ID":"21ce0f21-ea6d-4bed-b68b-2573a40c5443","Type":"ContainerDied","Data":"c371b4a765219c46c6c87ac177500d154b11be06dc857d3380f3a9377726a75e"} Oct 09 14:21:08 crc kubenswrapper[4902]: I1009 14:21:08.823114 4902 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c371b4a765219c46c6c87ac177500d154b11be06dc857d3380f3a9377726a75e" Oct 09 14:21:08 crc kubenswrapper[4902]: I1009 14:21:08.823205 4902 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-dss8m" Oct 09 14:21:08 crc kubenswrapper[4902]: I1009 14:21:08.913268 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-rdp4z"] Oct 09 14:21:08 crc kubenswrapper[4902]: E1009 14:21:08.914133 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21ce0f21-ea6d-4bed-b68b-2573a40c5443" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Oct 09 14:21:08 crc kubenswrapper[4902]: I1009 14:21:08.914161 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="21ce0f21-ea6d-4bed-b68b-2573a40c5443" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Oct 09 14:21:08 crc kubenswrapper[4902]: I1009 14:21:08.914440 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="21ce0f21-ea6d-4bed-b68b-2573a40c5443" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Oct 09 14:21:08 crc kubenswrapper[4902]: I1009 14:21:08.915237 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-rdp4z" Oct 09 14:21:08 crc kubenswrapper[4902]: I1009 14:21:08.919763 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 09 14:21:08 crc kubenswrapper[4902]: I1009 14:21:08.919991 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 09 14:21:08 crc kubenswrapper[4902]: I1009 14:21:08.920069 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-kjff2" Oct 09 14:21:08 crc kubenswrapper[4902]: I1009 14:21:08.920185 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 09 14:21:08 crc kubenswrapper[4902]: I1009 14:21:08.927243 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f3d3271d-29ab-4339-8614-a297a2b8791f-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-rdp4z\" (UID: \"f3d3271d-29ab-4339-8614-a297a2b8791f\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-rdp4z" Oct 09 14:21:08 crc kubenswrapper[4902]: I1009 14:21:08.927367 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f3d3271d-29ab-4339-8614-a297a2b8791f-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-rdp4z\" (UID: \"f3d3271d-29ab-4339-8614-a297a2b8791f\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-rdp4z" Oct 09 14:21:08 crc kubenswrapper[4902]: I1009 14:21:08.927455 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-77x5c\" (UniqueName: \"kubernetes.io/projected/f3d3271d-29ab-4339-8614-a297a2b8791f-kube-api-access-77x5c\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-rdp4z\" (UID: \"f3d3271d-29ab-4339-8614-a297a2b8791f\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-rdp4z" Oct 09 14:21:08 crc kubenswrapper[4902]: I1009 14:21:08.931065 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-rdp4z"] Oct 09 14:21:09 crc kubenswrapper[4902]: I1009 14:21:09.030073 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f3d3271d-29ab-4339-8614-a297a2b8791f-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-rdp4z\" (UID: \"f3d3271d-29ab-4339-8614-a297a2b8791f\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-rdp4z" Oct 09 14:21:09 crc kubenswrapper[4902]: I1009 14:21:09.030183 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f3d3271d-29ab-4339-8614-a297a2b8791f-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-rdp4z\" (UID: \"f3d3271d-29ab-4339-8614-a297a2b8791f\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-rdp4z" Oct 09 14:21:09 crc kubenswrapper[4902]: I1009 14:21:09.030221 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-77x5c\" (UniqueName: \"kubernetes.io/projected/f3d3271d-29ab-4339-8614-a297a2b8791f-kube-api-access-77x5c\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-rdp4z\" (UID: 
\"f3d3271d-29ab-4339-8614-a297a2b8791f\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-rdp4z" Oct 09 14:21:09 crc kubenswrapper[4902]: I1009 14:21:09.038469 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f3d3271d-29ab-4339-8614-a297a2b8791f-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-rdp4z\" (UID: \"f3d3271d-29ab-4339-8614-a297a2b8791f\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-rdp4z" Oct 09 14:21:09 crc kubenswrapper[4902]: I1009 14:21:09.041571 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f3d3271d-29ab-4339-8614-a297a2b8791f-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-rdp4z\" (UID: \"f3d3271d-29ab-4339-8614-a297a2b8791f\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-rdp4z" Oct 09 14:21:09 crc kubenswrapper[4902]: I1009 14:21:09.055242 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-77x5c\" (UniqueName: \"kubernetes.io/projected/f3d3271d-29ab-4339-8614-a297a2b8791f-kube-api-access-77x5c\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-rdp4z\" (UID: \"f3d3271d-29ab-4339-8614-a297a2b8791f\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-rdp4z" Oct 09 14:21:09 crc kubenswrapper[4902]: I1009 14:21:09.250676 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-rdp4z" Oct 09 14:21:09 crc kubenswrapper[4902]: I1009 14:21:09.809948 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-rdp4z"] Oct 09 14:21:09 crc kubenswrapper[4902]: I1009 14:21:09.834628 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-rdp4z" event={"ID":"f3d3271d-29ab-4339-8614-a297a2b8791f","Type":"ContainerStarted","Data":"244be3191b1a5f0bbda063357b2feed2f3148ddfefff955f2ad90be85f63d800"} Oct 09 14:21:10 crc kubenswrapper[4902]: I1009 14:21:10.845556 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-rdp4z" event={"ID":"f3d3271d-29ab-4339-8614-a297a2b8791f","Type":"ContainerStarted","Data":"702ce6eb5c9966257a3eafa5dd8a1afdf84d8fc6c0b3eef0bc0416124edeef71"} Oct 09 14:21:10 crc kubenswrapper[4902]: I1009 14:21:10.865872 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-rdp4z" podStartSLOduration=2.1003816730000002 podStartE2EDuration="2.865854373s" podCreationTimestamp="2025-10-09 14:21:08 +0000 UTC" firstStartedPulling="2025-10-09 14:21:09.816128496 +0000 UTC m=+1817.013987560" lastFinishedPulling="2025-10-09 14:21:10.581601206 +0000 UTC m=+1817.779460260" observedRunningTime="2025-10-09 14:21:10.862814609 +0000 UTC m=+1818.060673683" watchObservedRunningTime="2025-10-09 14:21:10.865854373 +0000 UTC m=+1818.063713437" Oct 09 14:21:15 crc kubenswrapper[4902]: I1009 14:21:15.513629 4902 scope.go:117] "RemoveContainer" containerID="03a73372b10bf2278335edb0134be424dfca37b6e17ab0c52bfb42dc57295c45" Oct 09 14:21:15 crc kubenswrapper[4902]: E1009 14:21:15.514638 4902 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-gbt7s_openshift-machine-config-operator(6cfbac91-e798-4e5e-9f3c-f454ea6f457e)\"" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" podUID="6cfbac91-e798-4e5e-9f3c-f454ea6f457e" Oct 09 14:21:28 crc kubenswrapper[4902]: I1009 14:21:28.513614 4902 scope.go:117] "RemoveContainer" containerID="03a73372b10bf2278335edb0134be424dfca37b6e17ab0c52bfb42dc57295c45" Oct 09 14:21:29 crc kubenswrapper[4902]: I1009 14:21:29.002080 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" event={"ID":"6cfbac91-e798-4e5e-9f3c-f454ea6f457e","Type":"ContainerStarted","Data":"52d1d3ae5d5e216daf1363f27c9b7492293258b48764c1d6ab264dab27f82e7e"} Oct 09 14:22:26 crc kubenswrapper[4902]: I1009 14:22:26.490609 4902 generic.go:334] "Generic (PLEG): container finished" podID="f3d3271d-29ab-4339-8614-a297a2b8791f" containerID="702ce6eb5c9966257a3eafa5dd8a1afdf84d8fc6c0b3eef0bc0416124edeef71" exitCode=0 Oct 09 14:22:26 crc kubenswrapper[4902]: I1009 14:22:26.490675 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-rdp4z" event={"ID":"f3d3271d-29ab-4339-8614-a297a2b8791f","Type":"ContainerDied","Data":"702ce6eb5c9966257a3eafa5dd8a1afdf84d8fc6c0b3eef0bc0416124edeef71"} Oct 09 14:22:27 crc kubenswrapper[4902]: I1009 14:22:27.878876 4902 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-rdp4z" Oct 09 14:22:27 crc kubenswrapper[4902]: I1009 14:22:27.965169 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-77x5c\" (UniqueName: \"kubernetes.io/projected/f3d3271d-29ab-4339-8614-a297a2b8791f-kube-api-access-77x5c\") pod \"f3d3271d-29ab-4339-8614-a297a2b8791f\" (UID: \"f3d3271d-29ab-4339-8614-a297a2b8791f\") " Oct 09 14:22:27 crc kubenswrapper[4902]: I1009 14:22:27.965647 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f3d3271d-29ab-4339-8614-a297a2b8791f-ssh-key\") pod \"f3d3271d-29ab-4339-8614-a297a2b8791f\" (UID: \"f3d3271d-29ab-4339-8614-a297a2b8791f\") " Oct 09 14:22:27 crc kubenswrapper[4902]: I1009 14:22:27.965714 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f3d3271d-29ab-4339-8614-a297a2b8791f-inventory\") pod \"f3d3271d-29ab-4339-8614-a297a2b8791f\" (UID: \"f3d3271d-29ab-4339-8614-a297a2b8791f\") " Oct 09 14:22:27 crc kubenswrapper[4902]: I1009 14:22:27.973672 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f3d3271d-29ab-4339-8614-a297a2b8791f-kube-api-access-77x5c" (OuterVolumeSpecName: "kube-api-access-77x5c") pod "f3d3271d-29ab-4339-8614-a297a2b8791f" (UID: "f3d3271d-29ab-4339-8614-a297a2b8791f"). InnerVolumeSpecName "kube-api-access-77x5c". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 14:22:27 crc kubenswrapper[4902]: I1009 14:22:27.996596 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3d3271d-29ab-4339-8614-a297a2b8791f-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "f3d3271d-29ab-4339-8614-a297a2b8791f" (UID: "f3d3271d-29ab-4339-8614-a297a2b8791f"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:22:28 crc kubenswrapper[4902]: I1009 14:22:28.014386 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3d3271d-29ab-4339-8614-a297a2b8791f-inventory" (OuterVolumeSpecName: "inventory") pod "f3d3271d-29ab-4339-8614-a297a2b8791f" (UID: "f3d3271d-29ab-4339-8614-a297a2b8791f"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:22:28 crc kubenswrapper[4902]: I1009 14:22:28.068608 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-77x5c\" (UniqueName: \"kubernetes.io/projected/f3d3271d-29ab-4339-8614-a297a2b8791f-kube-api-access-77x5c\") on node \"crc\" DevicePath \"\"" Oct 09 14:22:28 crc kubenswrapper[4902]: I1009 14:22:28.068656 4902 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f3d3271d-29ab-4339-8614-a297a2b8791f-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 09 14:22:28 crc kubenswrapper[4902]: I1009 14:22:28.068668 4902 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f3d3271d-29ab-4339-8614-a297a2b8791f-inventory\") on node \"crc\" DevicePath \"\"" Oct 09 14:22:28 crc kubenswrapper[4902]: I1009 14:22:28.509570 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-rdp4z" event={"ID":"f3d3271d-29ab-4339-8614-a297a2b8791f","Type":"ContainerDied","Data":"244be3191b1a5f0bbda063357b2feed2f3148ddfefff955f2ad90be85f63d800"} Oct 09 14:22:28 crc kubenswrapper[4902]: I1009 14:22:28.509611 4902 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="244be3191b1a5f0bbda063357b2feed2f3148ddfefff955f2ad90be85f63d800" Oct 09 14:22:28 crc kubenswrapper[4902]: I1009 14:22:28.509611 4902 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-rdp4z" Oct 09 14:22:28 crc kubenswrapper[4902]: I1009 14:22:28.608758 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gp9mr"] Oct 09 14:22:28 crc kubenswrapper[4902]: E1009 14:22:28.609631 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3d3271d-29ab-4339-8614-a297a2b8791f" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Oct 09 14:22:28 crc kubenswrapper[4902]: I1009 14:22:28.609655 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3d3271d-29ab-4339-8614-a297a2b8791f" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Oct 09 14:22:28 crc kubenswrapper[4902]: I1009 14:22:28.609880 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3d3271d-29ab-4339-8614-a297a2b8791f" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Oct 09 14:22:28 crc kubenswrapper[4902]: I1009 14:22:28.610763 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gp9mr" Oct 09 14:22:28 crc kubenswrapper[4902]: I1009 14:22:28.618504 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gp9mr"] Oct 09 14:22:28 crc kubenswrapper[4902]: I1009 14:22:28.639841 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0" Oct 09 14:22:28 crc kubenswrapper[4902]: I1009 14:22:28.640233 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Oct 09 14:22:28 crc kubenswrapper[4902]: I1009 14:22:28.640462 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 09 14:22:28 crc kubenswrapper[4902]: I1009 14:22:28.640636 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 09 14:22:28 crc kubenswrapper[4902]: I1009 14:22:28.640845 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Oct 09 14:22:28 crc kubenswrapper[4902]: I1009 14:22:28.640991 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-kjff2" Oct 09 14:22:28 crc kubenswrapper[4902]: I1009 14:22:28.641176 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Oct 09 14:22:28 crc kubenswrapper[4902]: I1009 14:22:28.644570 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 09 14:22:28 crc kubenswrapper[4902]: I1009 14:22:28.679586 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/813f74d2-a7a6-4e97-983b-544c38995262-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gp9mr\" (UID: \"813f74d2-a7a6-4e97-983b-544c38995262\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gp9mr" Oct 09 14:22:28 crc kubenswrapper[4902]: I1009 14:22:28.679638 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/813f74d2-a7a6-4e97-983b-544c38995262-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gp9mr\" (UID: \"813f74d2-a7a6-4e97-983b-544c38995262\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gp9mr" Oct 09 14:22:28 crc kubenswrapper[4902]: I1009 14:22:28.679665 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/813f74d2-a7a6-4e97-983b-544c38995262-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gp9mr\" (UID: \"813f74d2-a7a6-4e97-983b-544c38995262\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gp9mr" Oct 09 14:22:28 crc kubenswrapper[4902]: I1009 14:22:28.679686 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/813f74d2-a7a6-4e97-983b-544c38995262-ovn-combined-ca-bundle\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-gp9mr\" (UID: \"813f74d2-a7a6-4e97-983b-544c38995262\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gp9mr" Oct 09 14:22:28 crc kubenswrapper[4902]: I1009 14:22:28.679866 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/813f74d2-a7a6-4e97-983b-544c38995262-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gp9mr\" (UID: \"813f74d2-a7a6-4e97-983b-544c38995262\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gp9mr" Oct 09 14:22:28 crc kubenswrapper[4902]: I1009 14:22:28.679899 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/813f74d2-a7a6-4e97-983b-544c38995262-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gp9mr\" (UID: \"813f74d2-a7a6-4e97-983b-544c38995262\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gp9mr" Oct 09 14:22:28 crc kubenswrapper[4902]: I1009 14:22:28.680062 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/813f74d2-a7a6-4e97-983b-544c38995262-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gp9mr\" (UID: \"813f74d2-a7a6-4e97-983b-544c38995262\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gp9mr" Oct 09 14:22:28 crc kubenswrapper[4902]: I1009 14:22:28.680170 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/813f74d2-a7a6-4e97-983b-544c38995262-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gp9mr\" (UID: \"813f74d2-a7a6-4e97-983b-544c38995262\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gp9mr" Oct 09 14:22:28 crc kubenswrapper[4902]: I1009 14:22:28.680226 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/813f74d2-a7a6-4e97-983b-544c38995262-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gp9mr\" (UID: \"813f74d2-a7a6-4e97-983b-544c38995262\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gp9mr" Oct 09 14:22:28 crc kubenswrapper[4902]: I1009 14:22:28.680270 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-67km6\" (UniqueName: \"kubernetes.io/projected/813f74d2-a7a6-4e97-983b-544c38995262-kube-api-access-67km6\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gp9mr\" (UID: \"813f74d2-a7a6-4e97-983b-544c38995262\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gp9mr" Oct 09 14:22:28 crc kubenswrapper[4902]: I1009 14:22:28.680478 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/813f74d2-a7a6-4e97-983b-544c38995262-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gp9mr\" (UID: 
\"813f74d2-a7a6-4e97-983b-544c38995262\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gp9mr" Oct 09 14:22:28 crc kubenswrapper[4902]: I1009 14:22:28.680646 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/813f74d2-a7a6-4e97-983b-544c38995262-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gp9mr\" (UID: \"813f74d2-a7a6-4e97-983b-544c38995262\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gp9mr" Oct 09 14:22:28 crc kubenswrapper[4902]: I1009 14:22:28.680701 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/813f74d2-a7a6-4e97-983b-544c38995262-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gp9mr\" (UID: \"813f74d2-a7a6-4e97-983b-544c38995262\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gp9mr" Oct 09 14:22:28 crc kubenswrapper[4902]: I1009 14:22:28.681050 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/813f74d2-a7a6-4e97-983b-544c38995262-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gp9mr\" (UID: \"813f74d2-a7a6-4e97-983b-544c38995262\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gp9mr" Oct 09 14:22:28 crc kubenswrapper[4902]: I1009 14:22:28.783338 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/813f74d2-a7a6-4e97-983b-544c38995262-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gp9mr\" (UID: \"813f74d2-a7a6-4e97-983b-544c38995262\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gp9mr" Oct 09 14:22:28 crc kubenswrapper[4902]: I1009 14:22:28.783459 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/813f74d2-a7a6-4e97-983b-544c38995262-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gp9mr\" (UID: \"813f74d2-a7a6-4e97-983b-544c38995262\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gp9mr" Oct 09 14:22:28 crc kubenswrapper[4902]: I1009 14:22:28.783487 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/813f74d2-a7a6-4e97-983b-544c38995262-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gp9mr\" (UID: \"813f74d2-a7a6-4e97-983b-544c38995262\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gp9mr" Oct 09 14:22:28 crc kubenswrapper[4902]: I1009 14:22:28.783518 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/813f74d2-a7a6-4e97-983b-544c38995262-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gp9mr\" (UID: \"813f74d2-a7a6-4e97-983b-544c38995262\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gp9mr" Oct 09 14:22:28 crc kubenswrapper[4902]: I1009 14:22:28.783553 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/813f74d2-a7a6-4e97-983b-544c38995262-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gp9mr\" (UID: \"813f74d2-a7a6-4e97-983b-544c38995262\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gp9mr" Oct 09 14:22:28 crc kubenswrapper[4902]: I1009 14:22:28.783585 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/813f74d2-a7a6-4e97-983b-544c38995262-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gp9mr\" (UID: \"813f74d2-a7a6-4e97-983b-544c38995262\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gp9mr" Oct 09 14:22:28 crc kubenswrapper[4902]: I1009 14:22:28.783611 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/813f74d2-a7a6-4e97-983b-544c38995262-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gp9mr\" (UID: \"813f74d2-a7a6-4e97-983b-544c38995262\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gp9mr" Oct 09 14:22:28 crc kubenswrapper[4902]: I1009 14:22:28.784170 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/813f74d2-a7a6-4e97-983b-544c38995262-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gp9mr\" (UID: \"813f74d2-a7a6-4e97-983b-544c38995262\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gp9mr" Oct 09 14:22:28 crc kubenswrapper[4902]: I1009 14:22:28.784217 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/813f74d2-a7a6-4e97-983b-544c38995262-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gp9mr\" (UID: \"813f74d2-a7a6-4e97-983b-544c38995262\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gp9mr" Oct 09 14:22:28 crc kubenswrapper[4902]: I1009 14:22:28.784256 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/813f74d2-a7a6-4e97-983b-544c38995262-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gp9mr\" (UID: \"813f74d2-a7a6-4e97-983b-544c38995262\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gp9mr" Oct 09 14:22:28 crc kubenswrapper[4902]: I1009 14:22:28.784290 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/813f74d2-a7a6-4e97-983b-544c38995262-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gp9mr\" (UID: \"813f74d2-a7a6-4e97-983b-544c38995262\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gp9mr" Oct 09 14:22:28 crc kubenswrapper[4902]: I1009 14:22:28.784474 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/813f74d2-a7a6-4e97-983b-544c38995262-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gp9mr\" (UID: 
\"813f74d2-a7a6-4e97-983b-544c38995262\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gp9mr" Oct 09 14:22:28 crc kubenswrapper[4902]: I1009 14:22:28.784515 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-67km6\" (UniqueName: \"kubernetes.io/projected/813f74d2-a7a6-4e97-983b-544c38995262-kube-api-access-67km6\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gp9mr\" (UID: \"813f74d2-a7a6-4e97-983b-544c38995262\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gp9mr" Oct 09 14:22:28 crc kubenswrapper[4902]: I1009 14:22:28.784577 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/813f74d2-a7a6-4e97-983b-544c38995262-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gp9mr\" (UID: \"813f74d2-a7a6-4e97-983b-544c38995262\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gp9mr" Oct 09 14:22:28 crc kubenswrapper[4902]: I1009 14:22:28.788502 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/813f74d2-a7a6-4e97-983b-544c38995262-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gp9mr\" (UID: \"813f74d2-a7a6-4e97-983b-544c38995262\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gp9mr" Oct 09 14:22:28 crc kubenswrapper[4902]: I1009 14:22:28.788684 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/813f74d2-a7a6-4e97-983b-544c38995262-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gp9mr\" (UID: \"813f74d2-a7a6-4e97-983b-544c38995262\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gp9mr" Oct 09 14:22:28 crc kubenswrapper[4902]: I1009 14:22:28.789154 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/813f74d2-a7a6-4e97-983b-544c38995262-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gp9mr\" (UID: \"813f74d2-a7a6-4e97-983b-544c38995262\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gp9mr" Oct 09 14:22:28 crc kubenswrapper[4902]: I1009 14:22:28.789734 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/813f74d2-a7a6-4e97-983b-544c38995262-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gp9mr\" (UID: \"813f74d2-a7a6-4e97-983b-544c38995262\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gp9mr" Oct 09 14:22:28 crc kubenswrapper[4902]: I1009 14:22:28.790023 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/813f74d2-a7a6-4e97-983b-544c38995262-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gp9mr\" (UID: \"813f74d2-a7a6-4e97-983b-544c38995262\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gp9mr" Oct 09 14:22:28 crc kubenswrapper[4902]: I1009 14:22:28.790337 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/813f74d2-a7a6-4e97-983b-544c38995262-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gp9mr\" (UID: \"813f74d2-a7a6-4e97-983b-544c38995262\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gp9mr" Oct 09 14:22:28 crc kubenswrapper[4902]: I1009 14:22:28.790616 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/813f74d2-a7a6-4e97-983b-544c38995262-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gp9mr\" (UID: \"813f74d2-a7a6-4e97-983b-544c38995262\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gp9mr" Oct 09 14:22:28 crc kubenswrapper[4902]: I1009 14:22:28.791918 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/813f74d2-a7a6-4e97-983b-544c38995262-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gp9mr\" (UID: \"813f74d2-a7a6-4e97-983b-544c38995262\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gp9mr" Oct 09 14:22:28 crc kubenswrapper[4902]: I1009 14:22:28.791974 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/813f74d2-a7a6-4e97-983b-544c38995262-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gp9mr\" (UID: \"813f74d2-a7a6-4e97-983b-544c38995262\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gp9mr" Oct 09 14:22:28 crc kubenswrapper[4902]: I1009 14:22:28.792634 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/813f74d2-a7a6-4e97-983b-544c38995262-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gp9mr\" (UID: \"813f74d2-a7a6-4e97-983b-544c38995262\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gp9mr" Oct 09 14:22:28 crc kubenswrapper[4902]: I1009 14:22:28.792688 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/813f74d2-a7a6-4e97-983b-544c38995262-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gp9mr\" (UID: \"813f74d2-a7a6-4e97-983b-544c38995262\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gp9mr" Oct 09 14:22:28 crc kubenswrapper[4902]: I1009 14:22:28.795020 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/813f74d2-a7a6-4e97-983b-544c38995262-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gp9mr\" (UID: \"813f74d2-a7a6-4e97-983b-544c38995262\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gp9mr" Oct 09 14:22:28 crc kubenswrapper[4902]: I1009 14:22:28.799978 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/813f74d2-a7a6-4e97-983b-544c38995262-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gp9mr\" (UID: \"813f74d2-a7a6-4e97-983b-544c38995262\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gp9mr" Oct 09 14:22:28 crc kubenswrapper[4902]: I1009 14:22:28.803928 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-67km6\" (UniqueName: \"kubernetes.io/projected/813f74d2-a7a6-4e97-983b-544c38995262-kube-api-access-67km6\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gp9mr\" (UID: \"813f74d2-a7a6-4e97-983b-544c38995262\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gp9mr" Oct 09 14:22:28 crc kubenswrapper[4902]: I1009 14:22:28.968865 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gp9mr" Oct 09 14:22:29 crc kubenswrapper[4902]: I1009 14:22:29.472051 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gp9mr"] Oct 09 14:22:29 crc kubenswrapper[4902]: I1009 14:22:29.483157 4902 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 09 14:22:29 crc kubenswrapper[4902]: I1009 14:22:29.522692 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gp9mr" event={"ID":"813f74d2-a7a6-4e97-983b-544c38995262","Type":"ContainerStarted","Data":"3cecf6d154498c6dd32f605be711484cd9675d8e3e121c41160274c9496e34b9"} Oct 09 14:22:30 crc kubenswrapper[4902]: I1009 14:22:30.530123 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gp9mr" event={"ID":"813f74d2-a7a6-4e97-983b-544c38995262","Type":"ContainerStarted","Data":"c56d956b13324354255894278b25e9cbb282b045cb4e50509aca831d1b280e2b"} Oct 09 14:22:30 crc kubenswrapper[4902]: I1009 14:22:30.557855 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gp9mr" podStartSLOduration=2.126839716 podStartE2EDuration="2.557841125s" podCreationTimestamp="2025-10-09 14:22:28 +0000 UTC" firstStartedPulling="2025-10-09 14:22:29.482903263 +0000 UTC m=+1896.680762327" lastFinishedPulling="2025-10-09 14:22:29.913904662 +0000 UTC m=+1897.111763736" observedRunningTime="2025-10-09 14:22:30.553993462 +0000 UTC m=+1897.751852516" watchObservedRunningTime="2025-10-09 14:22:30.557841125 +0000 UTC m=+1897.755700189" Oct 09 14:22:59 crc kubenswrapper[4902]: I1009 14:22:59.453958 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-4gqnp"] Oct 09 14:22:59 crc kubenswrapper[4902]: I1009 14:22:59.457002 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-4gqnp" Oct 09 14:22:59 crc kubenswrapper[4902]: I1009 14:22:59.463078 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4gqnp"] Oct 09 14:22:59 crc kubenswrapper[4902]: I1009 14:22:59.564629 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/50013476-0da3-4298-9d39-e309166c9914-utilities\") pod \"community-operators-4gqnp\" (UID: \"50013476-0da3-4298-9d39-e309166c9914\") " pod="openshift-marketplace/community-operators-4gqnp" Oct 09 14:22:59 crc kubenswrapper[4902]: I1009 14:22:59.564686 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fct74\" (UniqueName: \"kubernetes.io/projected/50013476-0da3-4298-9d39-e309166c9914-kube-api-access-fct74\") pod \"community-operators-4gqnp\" (UID: \"50013476-0da3-4298-9d39-e309166c9914\") " pod="openshift-marketplace/community-operators-4gqnp" Oct 09 14:22:59 crc kubenswrapper[4902]: I1009 14:22:59.565090 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/50013476-0da3-4298-9d39-e309166c9914-catalog-content\") pod \"community-operators-4gqnp\" (UID: \"50013476-0da3-4298-9d39-e309166c9914\") " pod="openshift-marketplace/community-operators-4gqnp" Oct 09 14:22:59 crc kubenswrapper[4902]: I1009 14:22:59.666919 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/50013476-0da3-4298-9d39-e309166c9914-catalog-content\") pod \"community-operators-4gqnp\" (UID: \"50013476-0da3-4298-9d39-e309166c9914\") " pod="openshift-marketplace/community-operators-4gqnp" Oct 09 14:22:59 crc kubenswrapper[4902]: I1009 14:22:59.667009 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/50013476-0da3-4298-9d39-e309166c9914-utilities\") pod \"community-operators-4gqnp\" (UID: \"50013476-0da3-4298-9d39-e309166c9914\") " pod="openshift-marketplace/community-operators-4gqnp" Oct 09 14:22:59 crc kubenswrapper[4902]: I1009 14:22:59.667051 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fct74\" (UniqueName: \"kubernetes.io/projected/50013476-0da3-4298-9d39-e309166c9914-kube-api-access-fct74\") pod \"community-operators-4gqnp\" (UID: \"50013476-0da3-4298-9d39-e309166c9914\") " pod="openshift-marketplace/community-operators-4gqnp" Oct 09 14:22:59 crc kubenswrapper[4902]: I1009 14:22:59.667588 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/50013476-0da3-4298-9d39-e309166c9914-catalog-content\") pod \"community-operators-4gqnp\" (UID: \"50013476-0da3-4298-9d39-e309166c9914\") " pod="openshift-marketplace/community-operators-4gqnp" Oct 09 14:22:59 crc kubenswrapper[4902]: I1009 14:22:59.667626 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/50013476-0da3-4298-9d39-e309166c9914-utilities\") pod \"community-operators-4gqnp\" (UID: \"50013476-0da3-4298-9d39-e309166c9914\") " pod="openshift-marketplace/community-operators-4gqnp" Oct 09 14:22:59 crc kubenswrapper[4902]: I1009 14:22:59.685950 4902 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-fct74\" (UniqueName: \"kubernetes.io/projected/50013476-0da3-4298-9d39-e309166c9914-kube-api-access-fct74\") pod \"community-operators-4gqnp\" (UID: \"50013476-0da3-4298-9d39-e309166c9914\") " pod="openshift-marketplace/community-operators-4gqnp" Oct 09 14:22:59 crc kubenswrapper[4902]: I1009 14:22:59.784081 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4gqnp" Oct 09 14:23:00 crc kubenswrapper[4902]: I1009 14:23:00.279471 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4gqnp"] Oct 09 14:23:00 crc kubenswrapper[4902]: I1009 14:23:00.793193 4902 generic.go:334] "Generic (PLEG): container finished" podID="50013476-0da3-4298-9d39-e309166c9914" containerID="b58cd46afefb043a90c598d63d1ca50c54cffa02f6e2cfe759457726dc9bb9c6" exitCode=0 Oct 09 14:23:00 crc kubenswrapper[4902]: I1009 14:23:00.793250 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4gqnp" event={"ID":"50013476-0da3-4298-9d39-e309166c9914","Type":"ContainerDied","Data":"b58cd46afefb043a90c598d63d1ca50c54cffa02f6e2cfe759457726dc9bb9c6"} Oct 09 14:23:00 crc kubenswrapper[4902]: I1009 14:23:00.793279 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4gqnp" event={"ID":"50013476-0da3-4298-9d39-e309166c9914","Type":"ContainerStarted","Data":"3f1e897e6fa6b943983e4b57d2ff5ca3ba729918ffbaa5404145b636b4c2e8e9"} Oct 09 14:23:01 crc kubenswrapper[4902]: I1009 14:23:01.803601 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4gqnp" event={"ID":"50013476-0da3-4298-9d39-e309166c9914","Type":"ContainerStarted","Data":"0c0b1b5f7aca44a6d071828a6e2fb9e5d727326ee7c098b2aafb7ad59c071109"} Oct 09 14:23:02 crc kubenswrapper[4902]: I1009 14:23:02.818220 4902 generic.go:334] "Generic (PLEG): container finished" podID="50013476-0da3-4298-9d39-e309166c9914" containerID="0c0b1b5f7aca44a6d071828a6e2fb9e5d727326ee7c098b2aafb7ad59c071109" exitCode=0 Oct 09 14:23:02 crc kubenswrapper[4902]: I1009 14:23:02.818314 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4gqnp" event={"ID":"50013476-0da3-4298-9d39-e309166c9914","Type":"ContainerDied","Data":"0c0b1b5f7aca44a6d071828a6e2fb9e5d727326ee7c098b2aafb7ad59c071109"} Oct 09 14:23:03 crc kubenswrapper[4902]: I1009 14:23:03.832598 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4gqnp" event={"ID":"50013476-0da3-4298-9d39-e309166c9914","Type":"ContainerStarted","Data":"ddd925288b51b1d34885140d0f32c23d0fa17461ba3bf9edb881d724a454cfdf"} Oct 09 14:23:03 crc kubenswrapper[4902]: I1009 14:23:03.858260 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-4gqnp" podStartSLOduration=2.229412415 podStartE2EDuration="4.858238214s" podCreationTimestamp="2025-10-09 14:22:59 +0000 UTC" firstStartedPulling="2025-10-09 14:23:00.795868102 +0000 UTC m=+1927.993727166" lastFinishedPulling="2025-10-09 14:23:03.424693851 +0000 UTC m=+1930.622552965" observedRunningTime="2025-10-09 14:23:03.850819346 +0000 UTC m=+1931.048678410" watchObservedRunningTime="2025-10-09 14:23:03.858238214 +0000 UTC m=+1931.056097278" Oct 09 14:23:08 crc kubenswrapper[4902]: E1009 14:23:08.235305 4902 cadvisor_stats_provider.go:516] "Partial failure 
issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod813f74d2_a7a6_4e97_983b_544c38995262.slice/crio-conmon-c56d956b13324354255894278b25e9cbb282b045cb4e50509aca831d1b280e2b.scope\": RecentStats: unable to find data in memory cache]" Oct 09 14:23:08 crc kubenswrapper[4902]: I1009 14:23:08.883265 4902 generic.go:334] "Generic (PLEG): container finished" podID="813f74d2-a7a6-4e97-983b-544c38995262" containerID="c56d956b13324354255894278b25e9cbb282b045cb4e50509aca831d1b280e2b" exitCode=0 Oct 09 14:23:08 crc kubenswrapper[4902]: I1009 14:23:08.883385 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gp9mr" event={"ID":"813f74d2-a7a6-4e97-983b-544c38995262","Type":"ContainerDied","Data":"c56d956b13324354255894278b25e9cbb282b045cb4e50509aca831d1b280e2b"} Oct 09 14:23:09 crc kubenswrapper[4902]: I1009 14:23:09.785758 4902 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-4gqnp" Oct 09 14:23:09 crc kubenswrapper[4902]: I1009 14:23:09.785811 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-4gqnp" Oct 09 14:23:09 crc kubenswrapper[4902]: I1009 14:23:09.832838 4902 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-4gqnp" Oct 09 14:23:09 crc kubenswrapper[4902]: I1009 14:23:09.943286 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-4gqnp" Oct 09 14:23:10 crc kubenswrapper[4902]: I1009 14:23:10.067049 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4gqnp"] Oct 09 14:23:10 crc kubenswrapper[4902]: I1009 14:23:10.279804 4902 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gp9mr" Oct 09 14:23:10 crc kubenswrapper[4902]: I1009 14:23:10.382676 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/813f74d2-a7a6-4e97-983b-544c38995262-neutron-metadata-combined-ca-bundle\") pod \"813f74d2-a7a6-4e97-983b-544c38995262\" (UID: \"813f74d2-a7a6-4e97-983b-544c38995262\") " Oct 09 14:23:10 crc kubenswrapper[4902]: I1009 14:23:10.382723 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/813f74d2-a7a6-4e97-983b-544c38995262-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"813f74d2-a7a6-4e97-983b-544c38995262\" (UID: \"813f74d2-a7a6-4e97-983b-544c38995262\") " Oct 09 14:23:10 crc kubenswrapper[4902]: I1009 14:23:10.382762 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/813f74d2-a7a6-4e97-983b-544c38995262-telemetry-combined-ca-bundle\") pod \"813f74d2-a7a6-4e97-983b-544c38995262\" (UID: \"813f74d2-a7a6-4e97-983b-544c38995262\") " Oct 09 14:23:10 crc kubenswrapper[4902]: I1009 14:23:10.382780 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/813f74d2-a7a6-4e97-983b-544c38995262-repo-setup-combined-ca-bundle\") pod \"813f74d2-a7a6-4e97-983b-544c38995262\" (UID: \"813f74d2-a7a6-4e97-983b-544c38995262\") " Oct 09 14:23:10 crc kubenswrapper[4902]: I1009 14:23:10.382827 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-67km6\" (UniqueName: \"kubernetes.io/projected/813f74d2-a7a6-4e97-983b-544c38995262-kube-api-access-67km6\") pod \"813f74d2-a7a6-4e97-983b-544c38995262\" (UID: \"813f74d2-a7a6-4e97-983b-544c38995262\") " Oct 09 14:23:10 crc kubenswrapper[4902]: I1009 14:23:10.382861 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/813f74d2-a7a6-4e97-983b-544c38995262-ssh-key\") pod \"813f74d2-a7a6-4e97-983b-544c38995262\" (UID: \"813f74d2-a7a6-4e97-983b-544c38995262\") " Oct 09 14:23:10 crc kubenswrapper[4902]: I1009 14:23:10.382913 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/813f74d2-a7a6-4e97-983b-544c38995262-inventory\") pod \"813f74d2-a7a6-4e97-983b-544c38995262\" (UID: \"813f74d2-a7a6-4e97-983b-544c38995262\") " Oct 09 14:23:10 crc kubenswrapper[4902]: I1009 14:23:10.382931 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/813f74d2-a7a6-4e97-983b-544c38995262-openstack-edpm-ipam-ovn-default-certs-0\") pod \"813f74d2-a7a6-4e97-983b-544c38995262\" (UID: \"813f74d2-a7a6-4e97-983b-544c38995262\") " Oct 09 14:23:10 crc kubenswrapper[4902]: I1009 14:23:10.382981 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/813f74d2-a7a6-4e97-983b-544c38995262-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"813f74d2-a7a6-4e97-983b-544c38995262\" (UID: \"813f74d2-a7a6-4e97-983b-544c38995262\") " Oct 09 14:23:10 crc 
kubenswrapper[4902]: I1009 14:23:10.383003 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/813f74d2-a7a6-4e97-983b-544c38995262-ovn-combined-ca-bundle\") pod \"813f74d2-a7a6-4e97-983b-544c38995262\" (UID: \"813f74d2-a7a6-4e97-983b-544c38995262\") " Oct 09 14:23:10 crc kubenswrapper[4902]: I1009 14:23:10.383028 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/813f74d2-a7a6-4e97-983b-544c38995262-bootstrap-combined-ca-bundle\") pod \"813f74d2-a7a6-4e97-983b-544c38995262\" (UID: \"813f74d2-a7a6-4e97-983b-544c38995262\") " Oct 09 14:23:10 crc kubenswrapper[4902]: I1009 14:23:10.383065 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/813f74d2-a7a6-4e97-983b-544c38995262-libvirt-combined-ca-bundle\") pod \"813f74d2-a7a6-4e97-983b-544c38995262\" (UID: \"813f74d2-a7a6-4e97-983b-544c38995262\") " Oct 09 14:23:10 crc kubenswrapper[4902]: I1009 14:23:10.383107 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/813f74d2-a7a6-4e97-983b-544c38995262-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"813f74d2-a7a6-4e97-983b-544c38995262\" (UID: \"813f74d2-a7a6-4e97-983b-544c38995262\") " Oct 09 14:23:10 crc kubenswrapper[4902]: I1009 14:23:10.383124 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/813f74d2-a7a6-4e97-983b-544c38995262-nova-combined-ca-bundle\") pod \"813f74d2-a7a6-4e97-983b-544c38995262\" (UID: \"813f74d2-a7a6-4e97-983b-544c38995262\") " Oct 09 14:23:10 crc kubenswrapper[4902]: I1009 14:23:10.389699 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/813f74d2-a7a6-4e97-983b-544c38995262-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "813f74d2-a7a6-4e97-983b-544c38995262" (UID: "813f74d2-a7a6-4e97-983b-544c38995262"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:23:10 crc kubenswrapper[4902]: I1009 14:23:10.391364 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/813f74d2-a7a6-4e97-983b-544c38995262-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "813f74d2-a7a6-4e97-983b-544c38995262" (UID: "813f74d2-a7a6-4e97-983b-544c38995262"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:23:10 crc kubenswrapper[4902]: I1009 14:23:10.391557 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/813f74d2-a7a6-4e97-983b-544c38995262-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "813f74d2-a7a6-4e97-983b-544c38995262" (UID: "813f74d2-a7a6-4e97-983b-544c38995262"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 14:23:10 crc kubenswrapper[4902]: I1009 14:23:10.392153 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/813f74d2-a7a6-4e97-983b-544c38995262-kube-api-access-67km6" (OuterVolumeSpecName: "kube-api-access-67km6") pod "813f74d2-a7a6-4e97-983b-544c38995262" (UID: "813f74d2-a7a6-4e97-983b-544c38995262"). InnerVolumeSpecName "kube-api-access-67km6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 14:23:10 crc kubenswrapper[4902]: I1009 14:23:10.392247 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/813f74d2-a7a6-4e97-983b-544c38995262-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "813f74d2-a7a6-4e97-983b-544c38995262" (UID: "813f74d2-a7a6-4e97-983b-544c38995262"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 14:23:10 crc kubenswrapper[4902]: I1009 14:23:10.394342 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/813f74d2-a7a6-4e97-983b-544c38995262-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "813f74d2-a7a6-4e97-983b-544c38995262" (UID: "813f74d2-a7a6-4e97-983b-544c38995262"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 14:23:10 crc kubenswrapper[4902]: I1009 14:23:10.394674 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/813f74d2-a7a6-4e97-983b-544c38995262-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "813f74d2-a7a6-4e97-983b-544c38995262" (UID: "813f74d2-a7a6-4e97-983b-544c38995262"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:23:10 crc kubenswrapper[4902]: I1009 14:23:10.395212 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/813f74d2-a7a6-4e97-983b-544c38995262-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "813f74d2-a7a6-4e97-983b-544c38995262" (UID: "813f74d2-a7a6-4e97-983b-544c38995262"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:23:10 crc kubenswrapper[4902]: I1009 14:23:10.395610 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/813f74d2-a7a6-4e97-983b-544c38995262-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "813f74d2-a7a6-4e97-983b-544c38995262" (UID: "813f74d2-a7a6-4e97-983b-544c38995262"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:23:10 crc kubenswrapper[4902]: I1009 14:23:10.395664 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/813f74d2-a7a6-4e97-983b-544c38995262-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "813f74d2-a7a6-4e97-983b-544c38995262" (UID: "813f74d2-a7a6-4e97-983b-544c38995262"). InnerVolumeSpecName "telemetry-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:23:10 crc kubenswrapper[4902]: I1009 14:23:10.396117 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/813f74d2-a7a6-4e97-983b-544c38995262-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "813f74d2-a7a6-4e97-983b-544c38995262" (UID: "813f74d2-a7a6-4e97-983b-544c38995262"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 14:23:10 crc kubenswrapper[4902]: I1009 14:23:10.397216 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/813f74d2-a7a6-4e97-983b-544c38995262-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "813f74d2-a7a6-4e97-983b-544c38995262" (UID: "813f74d2-a7a6-4e97-983b-544c38995262"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:23:10 crc kubenswrapper[4902]: I1009 14:23:10.418914 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/813f74d2-a7a6-4e97-983b-544c38995262-inventory" (OuterVolumeSpecName: "inventory") pod "813f74d2-a7a6-4e97-983b-544c38995262" (UID: "813f74d2-a7a6-4e97-983b-544c38995262"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:23:10 crc kubenswrapper[4902]: I1009 14:23:10.420463 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/813f74d2-a7a6-4e97-983b-544c38995262-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "813f74d2-a7a6-4e97-983b-544c38995262" (UID: "813f74d2-a7a6-4e97-983b-544c38995262"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:23:10 crc kubenswrapper[4902]: I1009 14:23:10.485315 4902 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/813f74d2-a7a6-4e97-983b-544c38995262-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Oct 09 14:23:10 crc kubenswrapper[4902]: I1009 14:23:10.485382 4902 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/813f74d2-a7a6-4e97-983b-544c38995262-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 14:23:10 crc kubenswrapper[4902]: I1009 14:23:10.485395 4902 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/813f74d2-a7a6-4e97-983b-544c38995262-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 14:23:10 crc kubenswrapper[4902]: I1009 14:23:10.485426 4902 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/813f74d2-a7a6-4e97-983b-544c38995262-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 14:23:10 crc kubenswrapper[4902]: I1009 14:23:10.485442 4902 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/813f74d2-a7a6-4e97-983b-544c38995262-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Oct 09 14:23:10 crc kubenswrapper[4902]: I1009 14:23:10.485455 4902 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/813f74d2-a7a6-4e97-983b-544c38995262-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 14:23:10 crc kubenswrapper[4902]: I1009 14:23:10.485464 4902 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/813f74d2-a7a6-4e97-983b-544c38995262-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 14:23:10 crc kubenswrapper[4902]: I1009 14:23:10.485476 4902 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/813f74d2-a7a6-4e97-983b-544c38995262-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\"" Oct 09 14:23:10 crc kubenswrapper[4902]: I1009 14:23:10.485484 4902 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/813f74d2-a7a6-4e97-983b-544c38995262-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 14:23:10 crc kubenswrapper[4902]: I1009 14:23:10.485494 4902 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/813f74d2-a7a6-4e97-983b-544c38995262-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 14:23:10 crc kubenswrapper[4902]: I1009 14:23:10.485508 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-67km6\" (UniqueName: \"kubernetes.io/projected/813f74d2-a7a6-4e97-983b-544c38995262-kube-api-access-67km6\") on node \"crc\" DevicePath \"\"" Oct 09 14:23:10 crc kubenswrapper[4902]: I1009 14:23:10.485519 4902 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: 
\"kubernetes.io/secret/813f74d2-a7a6-4e97-983b-544c38995262-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 09 14:23:10 crc kubenswrapper[4902]: I1009 14:23:10.485531 4902 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/813f74d2-a7a6-4e97-983b-544c38995262-inventory\") on node \"crc\" DevicePath \"\"" Oct 09 14:23:10 crc kubenswrapper[4902]: I1009 14:23:10.485543 4902 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/813f74d2-a7a6-4e97-983b-544c38995262-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Oct 09 14:23:10 crc kubenswrapper[4902]: I1009 14:23:10.907759 4902 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gp9mr" Oct 09 14:23:10 crc kubenswrapper[4902]: I1009 14:23:10.907835 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gp9mr" event={"ID":"813f74d2-a7a6-4e97-983b-544c38995262","Type":"ContainerDied","Data":"3cecf6d154498c6dd32f605be711484cd9675d8e3e121c41160274c9496e34b9"} Oct 09 14:23:10 crc kubenswrapper[4902]: I1009 14:23:10.908784 4902 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3cecf6d154498c6dd32f605be711484cd9675d8e3e121c41160274c9496e34b9" Oct 09 14:23:10 crc kubenswrapper[4902]: I1009 14:23:10.991423 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-ndvq8"] Oct 09 14:23:10 crc kubenswrapper[4902]: E1009 14:23:10.992177 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="813f74d2-a7a6-4e97-983b-544c38995262" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Oct 09 14:23:10 crc kubenswrapper[4902]: I1009 14:23:10.992204 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="813f74d2-a7a6-4e97-983b-544c38995262" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Oct 09 14:23:10 crc kubenswrapper[4902]: I1009 14:23:10.992451 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="813f74d2-a7a6-4e97-983b-544c38995262" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Oct 09 14:23:10 crc kubenswrapper[4902]: I1009 14:23:10.993110 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-ndvq8" Oct 09 14:23:10 crc kubenswrapper[4902]: I1009 14:23:10.995511 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 09 14:23:10 crc kubenswrapper[4902]: I1009 14:23:10.995650 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 09 14:23:10 crc kubenswrapper[4902]: I1009 14:23:10.996361 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 09 14:23:10 crc kubenswrapper[4902]: I1009 14:23:10.997085 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Oct 09 14:23:10 crc kubenswrapper[4902]: I1009 14:23:10.997688 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-kjff2" Oct 09 14:23:11 crc kubenswrapper[4902]: I1009 14:23:11.006034 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-ndvq8"] Oct 09 14:23:11 crc kubenswrapper[4902]: I1009 14:23:11.098512 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/9478bb2f-ce46-41f9-bfbd-e93ebcb437ed-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-ndvq8\" (UID: \"9478bb2f-ce46-41f9-bfbd-e93ebcb437ed\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-ndvq8" Oct 09 14:23:11 crc kubenswrapper[4902]: I1009 14:23:11.098574 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9478bb2f-ce46-41f9-bfbd-e93ebcb437ed-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-ndvq8\" (UID: \"9478bb2f-ce46-41f9-bfbd-e93ebcb437ed\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-ndvq8" Oct 09 14:23:11 crc kubenswrapper[4902]: I1009 14:23:11.098672 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9478bb2f-ce46-41f9-bfbd-e93ebcb437ed-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-ndvq8\" (UID: \"9478bb2f-ce46-41f9-bfbd-e93ebcb437ed\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-ndvq8" Oct 09 14:23:11 crc kubenswrapper[4902]: I1009 14:23:11.098726 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dtvl2\" (UniqueName: \"kubernetes.io/projected/9478bb2f-ce46-41f9-bfbd-e93ebcb437ed-kube-api-access-dtvl2\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-ndvq8\" (UID: \"9478bb2f-ce46-41f9-bfbd-e93ebcb437ed\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-ndvq8" Oct 09 14:23:11 crc kubenswrapper[4902]: I1009 14:23:11.098761 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9478bb2f-ce46-41f9-bfbd-e93ebcb437ed-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-ndvq8\" (UID: \"9478bb2f-ce46-41f9-bfbd-e93ebcb437ed\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-ndvq8" Oct 09 14:23:11 crc kubenswrapper[4902]: I1009 14:23:11.200196 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" 
(UniqueName: \"kubernetes.io/configmap/9478bb2f-ce46-41f9-bfbd-e93ebcb437ed-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-ndvq8\" (UID: \"9478bb2f-ce46-41f9-bfbd-e93ebcb437ed\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-ndvq8" Oct 09 14:23:11 crc kubenswrapper[4902]: I1009 14:23:11.200269 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9478bb2f-ce46-41f9-bfbd-e93ebcb437ed-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-ndvq8\" (UID: \"9478bb2f-ce46-41f9-bfbd-e93ebcb437ed\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-ndvq8" Oct 09 14:23:11 crc kubenswrapper[4902]: I1009 14:23:11.200321 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9478bb2f-ce46-41f9-bfbd-e93ebcb437ed-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-ndvq8\" (UID: \"9478bb2f-ce46-41f9-bfbd-e93ebcb437ed\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-ndvq8" Oct 09 14:23:11 crc kubenswrapper[4902]: I1009 14:23:11.200380 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dtvl2\" (UniqueName: \"kubernetes.io/projected/9478bb2f-ce46-41f9-bfbd-e93ebcb437ed-kube-api-access-dtvl2\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-ndvq8\" (UID: \"9478bb2f-ce46-41f9-bfbd-e93ebcb437ed\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-ndvq8" Oct 09 14:23:11 crc kubenswrapper[4902]: I1009 14:23:11.200445 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9478bb2f-ce46-41f9-bfbd-e93ebcb437ed-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-ndvq8\" (UID: \"9478bb2f-ce46-41f9-bfbd-e93ebcb437ed\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-ndvq8" Oct 09 14:23:11 crc kubenswrapper[4902]: I1009 14:23:11.201869 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/9478bb2f-ce46-41f9-bfbd-e93ebcb437ed-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-ndvq8\" (UID: \"9478bb2f-ce46-41f9-bfbd-e93ebcb437ed\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-ndvq8" Oct 09 14:23:11 crc kubenswrapper[4902]: I1009 14:23:11.205936 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9478bb2f-ce46-41f9-bfbd-e93ebcb437ed-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-ndvq8\" (UID: \"9478bb2f-ce46-41f9-bfbd-e93ebcb437ed\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-ndvq8" Oct 09 14:23:11 crc kubenswrapper[4902]: I1009 14:23:11.206428 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9478bb2f-ce46-41f9-bfbd-e93ebcb437ed-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-ndvq8\" (UID: \"9478bb2f-ce46-41f9-bfbd-e93ebcb437ed\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-ndvq8" Oct 09 14:23:11 crc kubenswrapper[4902]: I1009 14:23:11.207452 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9478bb2f-ce46-41f9-bfbd-e93ebcb437ed-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-ndvq8\" (UID: \"9478bb2f-ce46-41f9-bfbd-e93ebcb437ed\") " 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-ndvq8" Oct 09 14:23:11 crc kubenswrapper[4902]: I1009 14:23:11.224048 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dtvl2\" (UniqueName: \"kubernetes.io/projected/9478bb2f-ce46-41f9-bfbd-e93ebcb437ed-kube-api-access-dtvl2\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-ndvq8\" (UID: \"9478bb2f-ce46-41f9-bfbd-e93ebcb437ed\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-ndvq8" Oct 09 14:23:11 crc kubenswrapper[4902]: I1009 14:23:11.317599 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-ndvq8" Oct 09 14:23:11 crc kubenswrapper[4902]: I1009 14:23:11.808577 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-ndvq8"] Oct 09 14:23:11 crc kubenswrapper[4902]: I1009 14:23:11.917725 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-ndvq8" event={"ID":"9478bb2f-ce46-41f9-bfbd-e93ebcb437ed","Type":"ContainerStarted","Data":"21e4954491c85a9772dd5ad405f38aefd23d3734978feb76625e465baf821451"} Oct 09 14:23:11 crc kubenswrapper[4902]: I1009 14:23:11.917929 4902 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-4gqnp" podUID="50013476-0da3-4298-9d39-e309166c9914" containerName="registry-server" containerID="cri-o://ddd925288b51b1d34885140d0f32c23d0fa17461ba3bf9edb881d724a454cfdf" gracePeriod=2 Oct 09 14:23:12 crc kubenswrapper[4902]: I1009 14:23:12.309837 4902 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4gqnp" Oct 09 14:23:12 crc kubenswrapper[4902]: I1009 14:23:12.322482 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/50013476-0da3-4298-9d39-e309166c9914-catalog-content\") pod \"50013476-0da3-4298-9d39-e309166c9914\" (UID: \"50013476-0da3-4298-9d39-e309166c9914\") " Oct 09 14:23:12 crc kubenswrapper[4902]: I1009 14:23:12.322837 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fct74\" (UniqueName: \"kubernetes.io/projected/50013476-0da3-4298-9d39-e309166c9914-kube-api-access-fct74\") pod \"50013476-0da3-4298-9d39-e309166c9914\" (UID: \"50013476-0da3-4298-9d39-e309166c9914\") " Oct 09 14:23:12 crc kubenswrapper[4902]: I1009 14:23:12.322948 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/50013476-0da3-4298-9d39-e309166c9914-utilities\") pod \"50013476-0da3-4298-9d39-e309166c9914\" (UID: \"50013476-0da3-4298-9d39-e309166c9914\") " Oct 09 14:23:12 crc kubenswrapper[4902]: I1009 14:23:12.323769 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/50013476-0da3-4298-9d39-e309166c9914-utilities" (OuterVolumeSpecName: "utilities") pod "50013476-0da3-4298-9d39-e309166c9914" (UID: "50013476-0da3-4298-9d39-e309166c9914"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 14:23:12 crc kubenswrapper[4902]: I1009 14:23:12.330667 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/50013476-0da3-4298-9d39-e309166c9914-kube-api-access-fct74" (OuterVolumeSpecName: "kube-api-access-fct74") pod "50013476-0da3-4298-9d39-e309166c9914" (UID: "50013476-0da3-4298-9d39-e309166c9914"). InnerVolumeSpecName "kube-api-access-fct74". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 14:23:12 crc kubenswrapper[4902]: I1009 14:23:12.392950 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/50013476-0da3-4298-9d39-e309166c9914-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "50013476-0da3-4298-9d39-e309166c9914" (UID: "50013476-0da3-4298-9d39-e309166c9914"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 14:23:12 crc kubenswrapper[4902]: I1009 14:23:12.426296 4902 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/50013476-0da3-4298-9d39-e309166c9914-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 09 14:23:12 crc kubenswrapper[4902]: I1009 14:23:12.426365 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fct74\" (UniqueName: \"kubernetes.io/projected/50013476-0da3-4298-9d39-e309166c9914-kube-api-access-fct74\") on node \"crc\" DevicePath \"\"" Oct 09 14:23:12 crc kubenswrapper[4902]: I1009 14:23:12.426379 4902 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/50013476-0da3-4298-9d39-e309166c9914-utilities\") on node \"crc\" DevicePath \"\"" Oct 09 14:23:12 crc kubenswrapper[4902]: I1009 14:23:12.930269 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-ndvq8" event={"ID":"9478bb2f-ce46-41f9-bfbd-e93ebcb437ed","Type":"ContainerStarted","Data":"bc3449f9fe6e17e0f3f8e41f6e302487ad30d8e9297d29f16b141b0c88a719cd"} Oct 09 14:23:12 crc kubenswrapper[4902]: I1009 14:23:12.933837 4902 generic.go:334] "Generic (PLEG): container finished" podID="50013476-0da3-4298-9d39-e309166c9914" containerID="ddd925288b51b1d34885140d0f32c23d0fa17461ba3bf9edb881d724a454cfdf" exitCode=0 Oct 09 14:23:12 crc kubenswrapper[4902]: I1009 14:23:12.933971 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4gqnp" event={"ID":"50013476-0da3-4298-9d39-e309166c9914","Type":"ContainerDied","Data":"ddd925288b51b1d34885140d0f32c23d0fa17461ba3bf9edb881d724a454cfdf"} Oct 09 14:23:12 crc kubenswrapper[4902]: I1009 14:23:12.934042 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4gqnp" event={"ID":"50013476-0da3-4298-9d39-e309166c9914","Type":"ContainerDied","Data":"3f1e897e6fa6b943983e4b57d2ff5ca3ba729918ffbaa5404145b636b4c2e8e9"} Oct 09 14:23:12 crc kubenswrapper[4902]: I1009 14:23:12.934140 4902 scope.go:117] "RemoveContainer" containerID="ddd925288b51b1d34885140d0f32c23d0fa17461ba3bf9edb881d724a454cfdf" Oct 09 14:23:12 crc kubenswrapper[4902]: I1009 14:23:12.934351 4902 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-4gqnp" Oct 09 14:23:12 crc kubenswrapper[4902]: I1009 14:23:12.963960 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-ndvq8" podStartSLOduration=2.29103056 podStartE2EDuration="2.963939292s" podCreationTimestamp="2025-10-09 14:23:10 +0000 UTC" firstStartedPulling="2025-10-09 14:23:11.813442955 +0000 UTC m=+1939.011302019" lastFinishedPulling="2025-10-09 14:23:12.486351687 +0000 UTC m=+1939.684210751" observedRunningTime="2025-10-09 14:23:12.961319956 +0000 UTC m=+1940.159179040" watchObservedRunningTime="2025-10-09 14:23:12.963939292 +0000 UTC m=+1940.161798356" Oct 09 14:23:12 crc kubenswrapper[4902]: I1009 14:23:12.988118 4902 scope.go:117] "RemoveContainer" containerID="0c0b1b5f7aca44a6d071828a6e2fb9e5d727326ee7c098b2aafb7ad59c071109" Oct 09 14:23:12 crc kubenswrapper[4902]: I1009 14:23:12.995046 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4gqnp"] Oct 09 14:23:13 crc kubenswrapper[4902]: I1009 14:23:13.004205 4902 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-4gqnp"] Oct 09 14:23:13 crc kubenswrapper[4902]: I1009 14:23:13.010565 4902 scope.go:117] "RemoveContainer" containerID="b58cd46afefb043a90c598d63d1ca50c54cffa02f6e2cfe759457726dc9bb9c6" Oct 09 14:23:13 crc kubenswrapper[4902]: I1009 14:23:13.031054 4902 scope.go:117] "RemoveContainer" containerID="ddd925288b51b1d34885140d0f32c23d0fa17461ba3bf9edb881d724a454cfdf" Oct 09 14:23:13 crc kubenswrapper[4902]: E1009 14:23:13.031467 4902 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ddd925288b51b1d34885140d0f32c23d0fa17461ba3bf9edb881d724a454cfdf\": container with ID starting with ddd925288b51b1d34885140d0f32c23d0fa17461ba3bf9edb881d724a454cfdf not found: ID does not exist" containerID="ddd925288b51b1d34885140d0f32c23d0fa17461ba3bf9edb881d724a454cfdf" Oct 09 14:23:13 crc kubenswrapper[4902]: I1009 14:23:13.031581 4902 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ddd925288b51b1d34885140d0f32c23d0fa17461ba3bf9edb881d724a454cfdf"} err="failed to get container status \"ddd925288b51b1d34885140d0f32c23d0fa17461ba3bf9edb881d724a454cfdf\": rpc error: code = NotFound desc = could not find container \"ddd925288b51b1d34885140d0f32c23d0fa17461ba3bf9edb881d724a454cfdf\": container with ID starting with ddd925288b51b1d34885140d0f32c23d0fa17461ba3bf9edb881d724a454cfdf not found: ID does not exist" Oct 09 14:23:13 crc kubenswrapper[4902]: I1009 14:23:13.031705 4902 scope.go:117] "RemoveContainer" containerID="0c0b1b5f7aca44a6d071828a6e2fb9e5d727326ee7c098b2aafb7ad59c071109" Oct 09 14:23:13 crc kubenswrapper[4902]: E1009 14:23:13.032077 4902 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0c0b1b5f7aca44a6d071828a6e2fb9e5d727326ee7c098b2aafb7ad59c071109\": container with ID starting with 0c0b1b5f7aca44a6d071828a6e2fb9e5d727326ee7c098b2aafb7ad59c071109 not found: ID does not exist" containerID="0c0b1b5f7aca44a6d071828a6e2fb9e5d727326ee7c098b2aafb7ad59c071109" Oct 09 14:23:13 crc kubenswrapper[4902]: I1009 14:23:13.032121 4902 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c0b1b5f7aca44a6d071828a6e2fb9e5d727326ee7c098b2aafb7ad59c071109"} err="failed to 
get container status \"0c0b1b5f7aca44a6d071828a6e2fb9e5d727326ee7c098b2aafb7ad59c071109\": rpc error: code = NotFound desc = could not find container \"0c0b1b5f7aca44a6d071828a6e2fb9e5d727326ee7c098b2aafb7ad59c071109\": container with ID starting with 0c0b1b5f7aca44a6d071828a6e2fb9e5d727326ee7c098b2aafb7ad59c071109 not found: ID does not exist" Oct 09 14:23:13 crc kubenswrapper[4902]: I1009 14:23:13.032151 4902 scope.go:117] "RemoveContainer" containerID="b58cd46afefb043a90c598d63d1ca50c54cffa02f6e2cfe759457726dc9bb9c6" Oct 09 14:23:13 crc kubenswrapper[4902]: E1009 14:23:13.032537 4902 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b58cd46afefb043a90c598d63d1ca50c54cffa02f6e2cfe759457726dc9bb9c6\": container with ID starting with b58cd46afefb043a90c598d63d1ca50c54cffa02f6e2cfe759457726dc9bb9c6 not found: ID does not exist" containerID="b58cd46afefb043a90c598d63d1ca50c54cffa02f6e2cfe759457726dc9bb9c6" Oct 09 14:23:13 crc kubenswrapper[4902]: I1009 14:23:13.032619 4902 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b58cd46afefb043a90c598d63d1ca50c54cffa02f6e2cfe759457726dc9bb9c6"} err="failed to get container status \"b58cd46afefb043a90c598d63d1ca50c54cffa02f6e2cfe759457726dc9bb9c6\": rpc error: code = NotFound desc = could not find container \"b58cd46afefb043a90c598d63d1ca50c54cffa02f6e2cfe759457726dc9bb9c6\": container with ID starting with b58cd46afefb043a90c598d63d1ca50c54cffa02f6e2cfe759457726dc9bb9c6 not found: ID does not exist" Oct 09 14:23:13 crc kubenswrapper[4902]: I1009 14:23:13.532866 4902 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="50013476-0da3-4298-9d39-e309166c9914" path="/var/lib/kubelet/pods/50013476-0da3-4298-9d39-e309166c9914/volumes" Oct 09 14:23:50 crc kubenswrapper[4902]: I1009 14:23:50.078328 4902 patch_prober.go:28] interesting pod/machine-config-daemon-gbt7s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 14:23:50 crc kubenswrapper[4902]: I1009 14:23:50.078788 4902 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" podUID="6cfbac91-e798-4e5e-9f3c-f454ea6f457e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 14:24:17 crc kubenswrapper[4902]: I1009 14:24:17.511571 4902 generic.go:334] "Generic (PLEG): container finished" podID="9478bb2f-ce46-41f9-bfbd-e93ebcb437ed" containerID="bc3449f9fe6e17e0f3f8e41f6e302487ad30d8e9297d29f16b141b0c88a719cd" exitCode=0 Oct 09 14:24:17 crc kubenswrapper[4902]: I1009 14:24:17.511654 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-ndvq8" event={"ID":"9478bb2f-ce46-41f9-bfbd-e93ebcb437ed","Type":"ContainerDied","Data":"bc3449f9fe6e17e0f3f8e41f6e302487ad30d8e9297d29f16b141b0c88a719cd"} Oct 09 14:24:18 crc kubenswrapper[4902]: I1009 14:24:18.886158 4902 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-ndvq8" Oct 09 14:24:19 crc kubenswrapper[4902]: I1009 14:24:19.040691 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9478bb2f-ce46-41f9-bfbd-e93ebcb437ed-ssh-key\") pod \"9478bb2f-ce46-41f9-bfbd-e93ebcb437ed\" (UID: \"9478bb2f-ce46-41f9-bfbd-e93ebcb437ed\") " Oct 09 14:24:19 crc kubenswrapper[4902]: I1009 14:24:19.040767 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/9478bb2f-ce46-41f9-bfbd-e93ebcb437ed-ovncontroller-config-0\") pod \"9478bb2f-ce46-41f9-bfbd-e93ebcb437ed\" (UID: \"9478bb2f-ce46-41f9-bfbd-e93ebcb437ed\") " Oct 09 14:24:19 crc kubenswrapper[4902]: I1009 14:24:19.040814 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9478bb2f-ce46-41f9-bfbd-e93ebcb437ed-ovn-combined-ca-bundle\") pod \"9478bb2f-ce46-41f9-bfbd-e93ebcb437ed\" (UID: \"9478bb2f-ce46-41f9-bfbd-e93ebcb437ed\") " Oct 09 14:24:19 crc kubenswrapper[4902]: I1009 14:24:19.041695 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9478bb2f-ce46-41f9-bfbd-e93ebcb437ed-inventory\") pod \"9478bb2f-ce46-41f9-bfbd-e93ebcb437ed\" (UID: \"9478bb2f-ce46-41f9-bfbd-e93ebcb437ed\") " Oct 09 14:24:19 crc kubenswrapper[4902]: I1009 14:24:19.041749 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dtvl2\" (UniqueName: \"kubernetes.io/projected/9478bb2f-ce46-41f9-bfbd-e93ebcb437ed-kube-api-access-dtvl2\") pod \"9478bb2f-ce46-41f9-bfbd-e93ebcb437ed\" (UID: \"9478bb2f-ce46-41f9-bfbd-e93ebcb437ed\") " Oct 09 14:24:19 crc kubenswrapper[4902]: I1009 14:24:19.051648 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9478bb2f-ce46-41f9-bfbd-e93ebcb437ed-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "9478bb2f-ce46-41f9-bfbd-e93ebcb437ed" (UID: "9478bb2f-ce46-41f9-bfbd-e93ebcb437ed"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:24:19 crc kubenswrapper[4902]: I1009 14:24:19.051673 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9478bb2f-ce46-41f9-bfbd-e93ebcb437ed-kube-api-access-dtvl2" (OuterVolumeSpecName: "kube-api-access-dtvl2") pod "9478bb2f-ce46-41f9-bfbd-e93ebcb437ed" (UID: "9478bb2f-ce46-41f9-bfbd-e93ebcb437ed"). InnerVolumeSpecName "kube-api-access-dtvl2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 14:24:19 crc kubenswrapper[4902]: I1009 14:24:19.069810 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9478bb2f-ce46-41f9-bfbd-e93ebcb437ed-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "9478bb2f-ce46-41f9-bfbd-e93ebcb437ed" (UID: "9478bb2f-ce46-41f9-bfbd-e93ebcb437ed"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:24:19 crc kubenswrapper[4902]: I1009 14:24:19.071615 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9478bb2f-ce46-41f9-bfbd-e93ebcb437ed-inventory" (OuterVolumeSpecName: "inventory") pod "9478bb2f-ce46-41f9-bfbd-e93ebcb437ed" (UID: "9478bb2f-ce46-41f9-bfbd-e93ebcb437ed"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:24:19 crc kubenswrapper[4902]: I1009 14:24:19.072246 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9478bb2f-ce46-41f9-bfbd-e93ebcb437ed-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "9478bb2f-ce46-41f9-bfbd-e93ebcb437ed" (UID: "9478bb2f-ce46-41f9-bfbd-e93ebcb437ed"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 14:24:19 crc kubenswrapper[4902]: I1009 14:24:19.144315 4902 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9478bb2f-ce46-41f9-bfbd-e93ebcb437ed-inventory\") on node \"crc\" DevicePath \"\"" Oct 09 14:24:19 crc kubenswrapper[4902]: I1009 14:24:19.144353 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dtvl2\" (UniqueName: \"kubernetes.io/projected/9478bb2f-ce46-41f9-bfbd-e93ebcb437ed-kube-api-access-dtvl2\") on node \"crc\" DevicePath \"\"" Oct 09 14:24:19 crc kubenswrapper[4902]: I1009 14:24:19.144363 4902 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9478bb2f-ce46-41f9-bfbd-e93ebcb437ed-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 09 14:24:19 crc kubenswrapper[4902]: I1009 14:24:19.144372 4902 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/9478bb2f-ce46-41f9-bfbd-e93ebcb437ed-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Oct 09 14:24:19 crc kubenswrapper[4902]: I1009 14:24:19.144381 4902 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9478bb2f-ce46-41f9-bfbd-e93ebcb437ed-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 14:24:19 crc kubenswrapper[4902]: I1009 14:24:19.529431 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-ndvq8" event={"ID":"9478bb2f-ce46-41f9-bfbd-e93ebcb437ed","Type":"ContainerDied","Data":"21e4954491c85a9772dd5ad405f38aefd23d3734978feb76625e465baf821451"} Oct 09 14:24:19 crc kubenswrapper[4902]: I1009 14:24:19.529969 4902 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="21e4954491c85a9772dd5ad405f38aefd23d3734978feb76625e465baf821451" Oct 09 14:24:19 crc kubenswrapper[4902]: I1009 14:24:19.529529 4902 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-ndvq8" Oct 09 14:24:19 crc kubenswrapper[4902]: I1009 14:24:19.635060 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zb49n"] Oct 09 14:24:19 crc kubenswrapper[4902]: E1009 14:24:19.636080 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50013476-0da3-4298-9d39-e309166c9914" containerName="extract-utilities" Oct 09 14:24:19 crc kubenswrapper[4902]: I1009 14:24:19.636122 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="50013476-0da3-4298-9d39-e309166c9914" containerName="extract-utilities" Oct 09 14:24:19 crc kubenswrapper[4902]: E1009 14:24:19.636142 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50013476-0da3-4298-9d39-e309166c9914" containerName="registry-server" Oct 09 14:24:19 crc kubenswrapper[4902]: I1009 14:24:19.636150 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="50013476-0da3-4298-9d39-e309166c9914" containerName="registry-server" Oct 09 14:24:19 crc kubenswrapper[4902]: E1009 14:24:19.636168 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50013476-0da3-4298-9d39-e309166c9914" containerName="extract-content" Oct 09 14:24:19 crc kubenswrapper[4902]: I1009 14:24:19.636175 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="50013476-0da3-4298-9d39-e309166c9914" containerName="extract-content" Oct 09 14:24:19 crc kubenswrapper[4902]: E1009 14:24:19.636196 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9478bb2f-ce46-41f9-bfbd-e93ebcb437ed" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Oct 09 14:24:19 crc kubenswrapper[4902]: I1009 14:24:19.636204 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="9478bb2f-ce46-41f9-bfbd-e93ebcb437ed" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Oct 09 14:24:19 crc kubenswrapper[4902]: I1009 14:24:19.636535 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="50013476-0da3-4298-9d39-e309166c9914" containerName="registry-server" Oct 09 14:24:19 crc kubenswrapper[4902]: I1009 14:24:19.636563 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="9478bb2f-ce46-41f9-bfbd-e93ebcb437ed" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Oct 09 14:24:19 crc kubenswrapper[4902]: I1009 14:24:19.637368 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zb49n" Oct 09 14:24:19 crc kubenswrapper[4902]: I1009 14:24:19.642565 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Oct 09 14:24:19 crc kubenswrapper[4902]: I1009 14:24:19.642780 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-kjff2" Oct 09 14:24:19 crc kubenswrapper[4902]: I1009 14:24:19.642929 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Oct 09 14:24:19 crc kubenswrapper[4902]: I1009 14:24:19.643070 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 09 14:24:19 crc kubenswrapper[4902]: I1009 14:24:19.649349 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 09 14:24:19 crc kubenswrapper[4902]: I1009 14:24:19.658718 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zb49n"] Oct 09 14:24:19 crc kubenswrapper[4902]: I1009 14:24:19.660845 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 09 14:24:19 crc kubenswrapper[4902]: I1009 14:24:19.757988 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c9a61d2-e7f4-4e22-8b3b-18263b72df09-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zb49n\" (UID: \"1c9a61d2-e7f4-4e22-8b3b-18263b72df09\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zb49n" Oct 09 14:24:19 crc kubenswrapper[4902]: I1009 14:24:19.758039 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4f7q7\" (UniqueName: \"kubernetes.io/projected/1c9a61d2-e7f4-4e22-8b3b-18263b72df09-kube-api-access-4f7q7\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zb49n\" (UID: \"1c9a61d2-e7f4-4e22-8b3b-18263b72df09\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zb49n" Oct 09 14:24:19 crc kubenswrapper[4902]: I1009 14:24:19.758061 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/1c9a61d2-e7f4-4e22-8b3b-18263b72df09-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zb49n\" (UID: \"1c9a61d2-e7f4-4e22-8b3b-18263b72df09\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zb49n" Oct 09 14:24:19 crc kubenswrapper[4902]: I1009 14:24:19.758362 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/1c9a61d2-e7f4-4e22-8b3b-18263b72df09-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zb49n\" (UID: \"1c9a61d2-e7f4-4e22-8b3b-18263b72df09\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zb49n" Oct 09 14:24:19 crc kubenswrapper[4902]: I1009 14:24:19.758926 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"inventory\" (UniqueName: \"kubernetes.io/secret/1c9a61d2-e7f4-4e22-8b3b-18263b72df09-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zb49n\" (UID: \"1c9a61d2-e7f4-4e22-8b3b-18263b72df09\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zb49n" Oct 09 14:24:19 crc kubenswrapper[4902]: I1009 14:24:19.759000 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1c9a61d2-e7f4-4e22-8b3b-18263b72df09-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zb49n\" (UID: \"1c9a61d2-e7f4-4e22-8b3b-18263b72df09\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zb49n" Oct 09 14:24:19 crc kubenswrapper[4902]: I1009 14:24:19.860225 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/1c9a61d2-e7f4-4e22-8b3b-18263b72df09-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zb49n\" (UID: \"1c9a61d2-e7f4-4e22-8b3b-18263b72df09\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zb49n" Oct 09 14:24:19 crc kubenswrapper[4902]: I1009 14:24:19.860355 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1c9a61d2-e7f4-4e22-8b3b-18263b72df09-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zb49n\" (UID: \"1c9a61d2-e7f4-4e22-8b3b-18263b72df09\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zb49n" Oct 09 14:24:19 crc kubenswrapper[4902]: I1009 14:24:19.860396 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1c9a61d2-e7f4-4e22-8b3b-18263b72df09-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zb49n\" (UID: \"1c9a61d2-e7f4-4e22-8b3b-18263b72df09\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zb49n" Oct 09 14:24:19 crc kubenswrapper[4902]: I1009 14:24:19.860662 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c9a61d2-e7f4-4e22-8b3b-18263b72df09-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zb49n\" (UID: \"1c9a61d2-e7f4-4e22-8b3b-18263b72df09\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zb49n" Oct 09 14:24:19 crc kubenswrapper[4902]: I1009 14:24:19.860702 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4f7q7\" (UniqueName: \"kubernetes.io/projected/1c9a61d2-e7f4-4e22-8b3b-18263b72df09-kube-api-access-4f7q7\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zb49n\" (UID: \"1c9a61d2-e7f4-4e22-8b3b-18263b72df09\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zb49n" Oct 09 14:24:19 crc kubenswrapper[4902]: I1009 14:24:19.860721 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/1c9a61d2-e7f4-4e22-8b3b-18263b72df09-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zb49n\" (UID: \"1c9a61d2-e7f4-4e22-8b3b-18263b72df09\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zb49n" Oct 09 
14:24:19 crc kubenswrapper[4902]: I1009 14:24:19.865470 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1c9a61d2-e7f4-4e22-8b3b-18263b72df09-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zb49n\" (UID: \"1c9a61d2-e7f4-4e22-8b3b-18263b72df09\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zb49n" Oct 09 14:24:19 crc kubenswrapper[4902]: I1009 14:24:19.865534 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/1c9a61d2-e7f4-4e22-8b3b-18263b72df09-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zb49n\" (UID: \"1c9a61d2-e7f4-4e22-8b3b-18263b72df09\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zb49n" Oct 09 14:24:19 crc kubenswrapper[4902]: I1009 14:24:19.875605 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1c9a61d2-e7f4-4e22-8b3b-18263b72df09-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zb49n\" (UID: \"1c9a61d2-e7f4-4e22-8b3b-18263b72df09\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zb49n" Oct 09 14:24:19 crc kubenswrapper[4902]: I1009 14:24:19.875873 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/1c9a61d2-e7f4-4e22-8b3b-18263b72df09-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zb49n\" (UID: \"1c9a61d2-e7f4-4e22-8b3b-18263b72df09\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zb49n" Oct 09 14:24:19 crc kubenswrapper[4902]: I1009 14:24:19.876769 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c9a61d2-e7f4-4e22-8b3b-18263b72df09-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zb49n\" (UID: \"1c9a61d2-e7f4-4e22-8b3b-18263b72df09\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zb49n" Oct 09 14:24:19 crc kubenswrapper[4902]: I1009 14:24:19.878609 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4f7q7\" (UniqueName: \"kubernetes.io/projected/1c9a61d2-e7f4-4e22-8b3b-18263b72df09-kube-api-access-4f7q7\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zb49n\" (UID: \"1c9a61d2-e7f4-4e22-8b3b-18263b72df09\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zb49n" Oct 09 14:24:19 crc kubenswrapper[4902]: I1009 14:24:19.969083 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zb49n" Oct 09 14:24:20 crc kubenswrapper[4902]: I1009 14:24:20.078231 4902 patch_prober.go:28] interesting pod/machine-config-daemon-gbt7s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 14:24:20 crc kubenswrapper[4902]: I1009 14:24:20.078446 4902 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" podUID="6cfbac91-e798-4e5e-9f3c-f454ea6f457e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 14:24:20 crc kubenswrapper[4902]: I1009 14:24:20.447623 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zb49n"] Oct 09 14:24:20 crc kubenswrapper[4902]: I1009 14:24:20.538752 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zb49n" event={"ID":"1c9a61d2-e7f4-4e22-8b3b-18263b72df09","Type":"ContainerStarted","Data":"6ab4bd55dcef1293941f740b4eb1c265db274f45c20aa9efbec1a63260fa1d59"} Oct 09 14:24:21 crc kubenswrapper[4902]: I1009 14:24:21.548804 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zb49n" event={"ID":"1c9a61d2-e7f4-4e22-8b3b-18263b72df09","Type":"ContainerStarted","Data":"205dbf10db09a0e7bbf7ee9768e53da46779af066e564411af4e9a395f9424ce"} Oct 09 14:24:21 crc kubenswrapper[4902]: I1009 14:24:21.571632 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zb49n" podStartSLOduration=2.126974159 podStartE2EDuration="2.571615597s" podCreationTimestamp="2025-10-09 14:24:19 +0000 UTC" firstStartedPulling="2025-10-09 14:24:20.45429995 +0000 UTC m=+2007.652159014" lastFinishedPulling="2025-10-09 14:24:20.898941388 +0000 UTC m=+2008.096800452" observedRunningTime="2025-10-09 14:24:21.56795111 +0000 UTC m=+2008.765810184" watchObservedRunningTime="2025-10-09 14:24:21.571615597 +0000 UTC m=+2008.769474661" Oct 09 14:24:36 crc kubenswrapper[4902]: I1009 14:24:36.110279 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-5f2nv"] Oct 09 14:24:36 crc kubenswrapper[4902]: I1009 14:24:36.113887 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5f2nv" Oct 09 14:24:36 crc kubenswrapper[4902]: I1009 14:24:36.123740 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5f2nv"] Oct 09 14:24:36 crc kubenswrapper[4902]: I1009 14:24:36.284135 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9v9lf\" (UniqueName: \"kubernetes.io/projected/babe331a-433c-4ef7-bc32-dfec9f1f48c0-kube-api-access-9v9lf\") pod \"redhat-marketplace-5f2nv\" (UID: \"babe331a-433c-4ef7-bc32-dfec9f1f48c0\") " pod="openshift-marketplace/redhat-marketplace-5f2nv" Oct 09 14:24:36 crc kubenswrapper[4902]: I1009 14:24:36.284196 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/babe331a-433c-4ef7-bc32-dfec9f1f48c0-catalog-content\") pod \"redhat-marketplace-5f2nv\" (UID: \"babe331a-433c-4ef7-bc32-dfec9f1f48c0\") " pod="openshift-marketplace/redhat-marketplace-5f2nv" Oct 09 14:24:36 crc kubenswrapper[4902]: I1009 14:24:36.284216 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/babe331a-433c-4ef7-bc32-dfec9f1f48c0-utilities\") pod \"redhat-marketplace-5f2nv\" (UID: \"babe331a-433c-4ef7-bc32-dfec9f1f48c0\") " pod="openshift-marketplace/redhat-marketplace-5f2nv" Oct 09 14:24:36 crc kubenswrapper[4902]: I1009 14:24:36.386841 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9v9lf\" (UniqueName: \"kubernetes.io/projected/babe331a-433c-4ef7-bc32-dfec9f1f48c0-kube-api-access-9v9lf\") pod \"redhat-marketplace-5f2nv\" (UID: \"babe331a-433c-4ef7-bc32-dfec9f1f48c0\") " pod="openshift-marketplace/redhat-marketplace-5f2nv" Oct 09 14:24:36 crc kubenswrapper[4902]: I1009 14:24:36.386913 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/babe331a-433c-4ef7-bc32-dfec9f1f48c0-catalog-content\") pod \"redhat-marketplace-5f2nv\" (UID: \"babe331a-433c-4ef7-bc32-dfec9f1f48c0\") " pod="openshift-marketplace/redhat-marketplace-5f2nv" Oct 09 14:24:36 crc kubenswrapper[4902]: I1009 14:24:36.386939 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/babe331a-433c-4ef7-bc32-dfec9f1f48c0-utilities\") pod \"redhat-marketplace-5f2nv\" (UID: \"babe331a-433c-4ef7-bc32-dfec9f1f48c0\") " pod="openshift-marketplace/redhat-marketplace-5f2nv" Oct 09 14:24:36 crc kubenswrapper[4902]: I1009 14:24:36.387847 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/babe331a-433c-4ef7-bc32-dfec9f1f48c0-catalog-content\") pod \"redhat-marketplace-5f2nv\" (UID: \"babe331a-433c-4ef7-bc32-dfec9f1f48c0\") " pod="openshift-marketplace/redhat-marketplace-5f2nv" Oct 09 14:24:36 crc kubenswrapper[4902]: I1009 14:24:36.387907 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/babe331a-433c-4ef7-bc32-dfec9f1f48c0-utilities\") pod \"redhat-marketplace-5f2nv\" (UID: \"babe331a-433c-4ef7-bc32-dfec9f1f48c0\") " pod="openshift-marketplace/redhat-marketplace-5f2nv" Oct 09 14:24:36 crc kubenswrapper[4902]: I1009 14:24:36.417409 4902 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-9v9lf\" (UniqueName: \"kubernetes.io/projected/babe331a-433c-4ef7-bc32-dfec9f1f48c0-kube-api-access-9v9lf\") pod \"redhat-marketplace-5f2nv\" (UID: \"babe331a-433c-4ef7-bc32-dfec9f1f48c0\") " pod="openshift-marketplace/redhat-marketplace-5f2nv" Oct 09 14:24:36 crc kubenswrapper[4902]: I1009 14:24:36.450754 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5f2nv" Oct 09 14:24:37 crc kubenswrapper[4902]: I1009 14:24:37.006287 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5f2nv"] Oct 09 14:24:37 crc kubenswrapper[4902]: I1009 14:24:37.687202 4902 generic.go:334] "Generic (PLEG): container finished" podID="babe331a-433c-4ef7-bc32-dfec9f1f48c0" containerID="892cba0ed0138b517a02b4a62619329cb245ae321459c7d6bd6d696b6f67e220" exitCode=0 Oct 09 14:24:37 crc kubenswrapper[4902]: I1009 14:24:37.687248 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5f2nv" event={"ID":"babe331a-433c-4ef7-bc32-dfec9f1f48c0","Type":"ContainerDied","Data":"892cba0ed0138b517a02b4a62619329cb245ae321459c7d6bd6d696b6f67e220"} Oct 09 14:24:37 crc kubenswrapper[4902]: I1009 14:24:37.687502 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5f2nv" event={"ID":"babe331a-433c-4ef7-bc32-dfec9f1f48c0","Type":"ContainerStarted","Data":"04ddda6647f83b1fd19b261a0c9233b539897b0a0ed29ade9bb97e3e561bdc34"} Oct 09 14:24:39 crc kubenswrapper[4902]: I1009 14:24:39.706530 4902 generic.go:334] "Generic (PLEG): container finished" podID="babe331a-433c-4ef7-bc32-dfec9f1f48c0" containerID="43a72974a578d58305506f978cae98f759b405c8a6db8b4200f8dc73d66ff361" exitCode=0 Oct 09 14:24:39 crc kubenswrapper[4902]: I1009 14:24:39.706615 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5f2nv" event={"ID":"babe331a-433c-4ef7-bc32-dfec9f1f48c0","Type":"ContainerDied","Data":"43a72974a578d58305506f978cae98f759b405c8a6db8b4200f8dc73d66ff361"} Oct 09 14:24:40 crc kubenswrapper[4902]: I1009 14:24:40.719729 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5f2nv" event={"ID":"babe331a-433c-4ef7-bc32-dfec9f1f48c0","Type":"ContainerStarted","Data":"4961a5f8fd90cbfdc2e3b5085b2263e8c8d14fed85507776b22aaa33d221df1e"} Oct 09 14:24:40 crc kubenswrapper[4902]: I1009 14:24:40.743496 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-5f2nv" podStartSLOduration=2.3036422 podStartE2EDuration="4.74346811s" podCreationTimestamp="2025-10-09 14:24:36 +0000 UTC" firstStartedPulling="2025-10-09 14:24:37.688945231 +0000 UTC m=+2024.886804295" lastFinishedPulling="2025-10-09 14:24:40.128771141 +0000 UTC m=+2027.326630205" observedRunningTime="2025-10-09 14:24:40.739281308 +0000 UTC m=+2027.937140372" watchObservedRunningTime="2025-10-09 14:24:40.74346811 +0000 UTC m=+2027.941327184" Oct 09 14:24:44 crc kubenswrapper[4902]: I1009 14:24:44.662339 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-h6q7f"] Oct 09 14:24:44 crc kubenswrapper[4902]: I1009 14:24:44.664706 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-h6q7f" Oct 09 14:24:44 crc kubenswrapper[4902]: I1009 14:24:44.674053 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-h6q7f"] Oct 09 14:24:44 crc kubenswrapper[4902]: I1009 14:24:44.766916 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9stms\" (UniqueName: \"kubernetes.io/projected/6f1e0561-a370-447d-b280-82e539671958-kube-api-access-9stms\") pod \"redhat-operators-h6q7f\" (UID: \"6f1e0561-a370-447d-b280-82e539671958\") " pod="openshift-marketplace/redhat-operators-h6q7f" Oct 09 14:24:44 crc kubenswrapper[4902]: I1009 14:24:44.767058 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f1e0561-a370-447d-b280-82e539671958-utilities\") pod \"redhat-operators-h6q7f\" (UID: \"6f1e0561-a370-447d-b280-82e539671958\") " pod="openshift-marketplace/redhat-operators-h6q7f" Oct 09 14:24:44 crc kubenswrapper[4902]: I1009 14:24:44.767123 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f1e0561-a370-447d-b280-82e539671958-catalog-content\") pod \"redhat-operators-h6q7f\" (UID: \"6f1e0561-a370-447d-b280-82e539671958\") " pod="openshift-marketplace/redhat-operators-h6q7f" Oct 09 14:24:44 crc kubenswrapper[4902]: I1009 14:24:44.869325 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f1e0561-a370-447d-b280-82e539671958-utilities\") pod \"redhat-operators-h6q7f\" (UID: \"6f1e0561-a370-447d-b280-82e539671958\") " pod="openshift-marketplace/redhat-operators-h6q7f" Oct 09 14:24:44 crc kubenswrapper[4902]: I1009 14:24:44.869426 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f1e0561-a370-447d-b280-82e539671958-catalog-content\") pod \"redhat-operators-h6q7f\" (UID: \"6f1e0561-a370-447d-b280-82e539671958\") " pod="openshift-marketplace/redhat-operators-h6q7f" Oct 09 14:24:44 crc kubenswrapper[4902]: I1009 14:24:44.869496 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9stms\" (UniqueName: \"kubernetes.io/projected/6f1e0561-a370-447d-b280-82e539671958-kube-api-access-9stms\") pod \"redhat-operators-h6q7f\" (UID: \"6f1e0561-a370-447d-b280-82e539671958\") " pod="openshift-marketplace/redhat-operators-h6q7f" Oct 09 14:24:44 crc kubenswrapper[4902]: I1009 14:24:44.870084 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f1e0561-a370-447d-b280-82e539671958-catalog-content\") pod \"redhat-operators-h6q7f\" (UID: \"6f1e0561-a370-447d-b280-82e539671958\") " pod="openshift-marketplace/redhat-operators-h6q7f" Oct 09 14:24:44 crc kubenswrapper[4902]: I1009 14:24:44.870453 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f1e0561-a370-447d-b280-82e539671958-utilities\") pod \"redhat-operators-h6q7f\" (UID: \"6f1e0561-a370-447d-b280-82e539671958\") " pod="openshift-marketplace/redhat-operators-h6q7f" Oct 09 14:24:44 crc kubenswrapper[4902]: I1009 14:24:44.889239 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-9stms\" (UniqueName: \"kubernetes.io/projected/6f1e0561-a370-447d-b280-82e539671958-kube-api-access-9stms\") pod \"redhat-operators-h6q7f\" (UID: \"6f1e0561-a370-447d-b280-82e539671958\") " pod="openshift-marketplace/redhat-operators-h6q7f" Oct 09 14:24:44 crc kubenswrapper[4902]: I1009 14:24:44.997922 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-h6q7f" Oct 09 14:24:45 crc kubenswrapper[4902]: I1009 14:24:45.540461 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-h6q7f"] Oct 09 14:24:45 crc kubenswrapper[4902]: I1009 14:24:45.772592 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h6q7f" event={"ID":"6f1e0561-a370-447d-b280-82e539671958","Type":"ContainerStarted","Data":"2bfe976e6d1cd804f2a6646c5a210ec679eeafe01dcb7c9d45f0d2f115e7c0bc"} Oct 09 14:24:46 crc kubenswrapper[4902]: I1009 14:24:46.451218 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-5f2nv" Oct 09 14:24:46 crc kubenswrapper[4902]: I1009 14:24:46.451271 4902 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-5f2nv" Oct 09 14:24:46 crc kubenswrapper[4902]: I1009 14:24:46.498973 4902 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-5f2nv" Oct 09 14:24:46 crc kubenswrapper[4902]: I1009 14:24:46.784755 4902 generic.go:334] "Generic (PLEG): container finished" podID="6f1e0561-a370-447d-b280-82e539671958" containerID="b102564249dc29701512638327cc0187f8f6028132382ecc9e1ffc8ef4cb764d" exitCode=0 Oct 09 14:24:46 crc kubenswrapper[4902]: I1009 14:24:46.784880 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h6q7f" event={"ID":"6f1e0561-a370-447d-b280-82e539671958","Type":"ContainerDied","Data":"b102564249dc29701512638327cc0187f8f6028132382ecc9e1ffc8ef4cb764d"} Oct 09 14:24:46 crc kubenswrapper[4902]: I1009 14:24:46.838221 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-5f2nv" Oct 09 14:24:47 crc kubenswrapper[4902]: I1009 14:24:47.795225 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h6q7f" event={"ID":"6f1e0561-a370-447d-b280-82e539671958","Type":"ContainerStarted","Data":"00a5c35934fab5d2da572f516047eb97af190d1c80b0e3ef4377801a581665f6"} Oct 09 14:24:48 crc kubenswrapper[4902]: I1009 14:24:48.835163 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5f2nv"] Oct 09 14:24:48 crc kubenswrapper[4902]: I1009 14:24:48.835801 4902 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-5f2nv" podUID="babe331a-433c-4ef7-bc32-dfec9f1f48c0" containerName="registry-server" containerID="cri-o://4961a5f8fd90cbfdc2e3b5085b2263e8c8d14fed85507776b22aaa33d221df1e" gracePeriod=2 Oct 09 14:24:49 crc kubenswrapper[4902]: I1009 14:24:49.813996 4902 generic.go:334] "Generic (PLEG): container finished" podID="6f1e0561-a370-447d-b280-82e539671958" containerID="00a5c35934fab5d2da572f516047eb97af190d1c80b0e3ef4377801a581665f6" exitCode=0 Oct 09 14:24:49 crc kubenswrapper[4902]: I1009 14:24:49.814069 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-h6q7f" event={"ID":"6f1e0561-a370-447d-b280-82e539671958","Type":"ContainerDied","Data":"00a5c35934fab5d2da572f516047eb97af190d1c80b0e3ef4377801a581665f6"} Oct 09 14:24:50 crc kubenswrapper[4902]: I1009 14:24:50.077769 4902 patch_prober.go:28] interesting pod/machine-config-daemon-gbt7s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 14:24:50 crc kubenswrapper[4902]: I1009 14:24:50.078105 4902 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" podUID="6cfbac91-e798-4e5e-9f3c-f454ea6f457e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 14:24:50 crc kubenswrapper[4902]: I1009 14:24:50.078156 4902 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" Oct 09 14:24:50 crc kubenswrapper[4902]: I1009 14:24:50.078993 4902 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"52d1d3ae5d5e216daf1363f27c9b7492293258b48764c1d6ab264dab27f82e7e"} pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 09 14:24:50 crc kubenswrapper[4902]: I1009 14:24:50.079119 4902 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" podUID="6cfbac91-e798-4e5e-9f3c-f454ea6f457e" containerName="machine-config-daemon" containerID="cri-o://52d1d3ae5d5e216daf1363f27c9b7492293258b48764c1d6ab264dab27f82e7e" gracePeriod=600 Oct 09 14:24:50 crc kubenswrapper[4902]: I1009 14:24:50.458515 4902 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5f2nv" Oct 09 14:24:50 crc kubenswrapper[4902]: I1009 14:24:50.608854 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/babe331a-433c-4ef7-bc32-dfec9f1f48c0-utilities\") pod \"babe331a-433c-4ef7-bc32-dfec9f1f48c0\" (UID: \"babe331a-433c-4ef7-bc32-dfec9f1f48c0\") " Oct 09 14:24:50 crc kubenswrapper[4902]: I1009 14:24:50.609078 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9v9lf\" (UniqueName: \"kubernetes.io/projected/babe331a-433c-4ef7-bc32-dfec9f1f48c0-kube-api-access-9v9lf\") pod \"babe331a-433c-4ef7-bc32-dfec9f1f48c0\" (UID: \"babe331a-433c-4ef7-bc32-dfec9f1f48c0\") " Oct 09 14:24:50 crc kubenswrapper[4902]: I1009 14:24:50.609110 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/babe331a-433c-4ef7-bc32-dfec9f1f48c0-catalog-content\") pod \"babe331a-433c-4ef7-bc32-dfec9f1f48c0\" (UID: \"babe331a-433c-4ef7-bc32-dfec9f1f48c0\") " Oct 09 14:24:50 crc kubenswrapper[4902]: I1009 14:24:50.609678 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/babe331a-433c-4ef7-bc32-dfec9f1f48c0-utilities" (OuterVolumeSpecName: "utilities") pod "babe331a-433c-4ef7-bc32-dfec9f1f48c0" (UID: "babe331a-433c-4ef7-bc32-dfec9f1f48c0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 14:24:50 crc kubenswrapper[4902]: I1009 14:24:50.614439 4902 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/babe331a-433c-4ef7-bc32-dfec9f1f48c0-utilities\") on node \"crc\" DevicePath \"\"" Oct 09 14:24:50 crc kubenswrapper[4902]: I1009 14:24:50.615753 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/babe331a-433c-4ef7-bc32-dfec9f1f48c0-kube-api-access-9v9lf" (OuterVolumeSpecName: "kube-api-access-9v9lf") pod "babe331a-433c-4ef7-bc32-dfec9f1f48c0" (UID: "babe331a-433c-4ef7-bc32-dfec9f1f48c0"). InnerVolumeSpecName "kube-api-access-9v9lf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 14:24:50 crc kubenswrapper[4902]: I1009 14:24:50.621262 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/babe331a-433c-4ef7-bc32-dfec9f1f48c0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "babe331a-433c-4ef7-bc32-dfec9f1f48c0" (UID: "babe331a-433c-4ef7-bc32-dfec9f1f48c0"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 14:24:50 crc kubenswrapper[4902]: I1009 14:24:50.715820 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9v9lf\" (UniqueName: \"kubernetes.io/projected/babe331a-433c-4ef7-bc32-dfec9f1f48c0-kube-api-access-9v9lf\") on node \"crc\" DevicePath \"\"" Oct 09 14:24:50 crc kubenswrapper[4902]: I1009 14:24:50.715853 4902 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/babe331a-433c-4ef7-bc32-dfec9f1f48c0-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 09 14:24:50 crc kubenswrapper[4902]: I1009 14:24:50.826061 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h6q7f" event={"ID":"6f1e0561-a370-447d-b280-82e539671958","Type":"ContainerStarted","Data":"ee325763dddc72814b537f42ee9bfd0992758a2a5d3a430b965b1658ba76291a"} Oct 09 14:24:50 crc kubenswrapper[4902]: I1009 14:24:50.829993 4902 generic.go:334] "Generic (PLEG): container finished" podID="6cfbac91-e798-4e5e-9f3c-f454ea6f457e" containerID="52d1d3ae5d5e216daf1363f27c9b7492293258b48764c1d6ab264dab27f82e7e" exitCode=0 Oct 09 14:24:50 crc kubenswrapper[4902]: I1009 14:24:50.830065 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" event={"ID":"6cfbac91-e798-4e5e-9f3c-f454ea6f457e","Type":"ContainerDied","Data":"52d1d3ae5d5e216daf1363f27c9b7492293258b48764c1d6ab264dab27f82e7e"} Oct 09 14:24:50 crc kubenswrapper[4902]: I1009 14:24:50.830122 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" event={"ID":"6cfbac91-e798-4e5e-9f3c-f454ea6f457e","Type":"ContainerStarted","Data":"a801328970550531ee1b5bd6e247b502db3c5fb79b1bd36128afb401a712b319"} Oct 09 14:24:50 crc kubenswrapper[4902]: I1009 14:24:50.830162 4902 scope.go:117] "RemoveContainer" containerID="03a73372b10bf2278335edb0134be424dfca37b6e17ab0c52bfb42dc57295c45" Oct 09 14:24:50 crc kubenswrapper[4902]: I1009 14:24:50.832982 4902 generic.go:334] "Generic (PLEG): container finished" podID="babe331a-433c-4ef7-bc32-dfec9f1f48c0" containerID="4961a5f8fd90cbfdc2e3b5085b2263e8c8d14fed85507776b22aaa33d221df1e" exitCode=0 Oct 09 14:24:50 crc kubenswrapper[4902]: I1009 14:24:50.833017 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5f2nv" event={"ID":"babe331a-433c-4ef7-bc32-dfec9f1f48c0","Type":"ContainerDied","Data":"4961a5f8fd90cbfdc2e3b5085b2263e8c8d14fed85507776b22aaa33d221df1e"} Oct 09 14:24:50 crc kubenswrapper[4902]: I1009 14:24:50.833036 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5f2nv" event={"ID":"babe331a-433c-4ef7-bc32-dfec9f1f48c0","Type":"ContainerDied","Data":"04ddda6647f83b1fd19b261a0c9233b539897b0a0ed29ade9bb97e3e561bdc34"} Oct 09 14:24:50 crc kubenswrapper[4902]: I1009 14:24:50.833135 4902 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5f2nv" Oct 09 14:24:50 crc kubenswrapper[4902]: I1009 14:24:50.852471 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-h6q7f" podStartSLOduration=3.322086414 podStartE2EDuration="6.8524552s" podCreationTimestamp="2025-10-09 14:24:44 +0000 UTC" firstStartedPulling="2025-10-09 14:24:46.787252989 +0000 UTC m=+2033.985112053" lastFinishedPulling="2025-10-09 14:24:50.317621775 +0000 UTC m=+2037.515480839" observedRunningTime="2025-10-09 14:24:50.850612276 +0000 UTC m=+2038.048471350" watchObservedRunningTime="2025-10-09 14:24:50.8524552 +0000 UTC m=+2038.050314264" Oct 09 14:24:50 crc kubenswrapper[4902]: I1009 14:24:50.865165 4902 scope.go:117] "RemoveContainer" containerID="4961a5f8fd90cbfdc2e3b5085b2263e8c8d14fed85507776b22aaa33d221df1e" Oct 09 14:24:50 crc kubenswrapper[4902]: I1009 14:24:50.890645 4902 scope.go:117] "RemoveContainer" containerID="43a72974a578d58305506f978cae98f759b405c8a6db8b4200f8dc73d66ff361" Oct 09 14:24:50 crc kubenswrapper[4902]: I1009 14:24:50.900005 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5f2nv"] Oct 09 14:24:50 crc kubenswrapper[4902]: I1009 14:24:50.908307 4902 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-5f2nv"] Oct 09 14:24:50 crc kubenswrapper[4902]: I1009 14:24:50.912326 4902 scope.go:117] "RemoveContainer" containerID="892cba0ed0138b517a02b4a62619329cb245ae321459c7d6bd6d696b6f67e220" Oct 09 14:24:50 crc kubenswrapper[4902]: I1009 14:24:50.933792 4902 scope.go:117] "RemoveContainer" containerID="4961a5f8fd90cbfdc2e3b5085b2263e8c8d14fed85507776b22aaa33d221df1e" Oct 09 14:24:50 crc kubenswrapper[4902]: E1009 14:24:50.934265 4902 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4961a5f8fd90cbfdc2e3b5085b2263e8c8d14fed85507776b22aaa33d221df1e\": container with ID starting with 4961a5f8fd90cbfdc2e3b5085b2263e8c8d14fed85507776b22aaa33d221df1e not found: ID does not exist" containerID="4961a5f8fd90cbfdc2e3b5085b2263e8c8d14fed85507776b22aaa33d221df1e" Oct 09 14:24:50 crc kubenswrapper[4902]: I1009 14:24:50.934307 4902 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4961a5f8fd90cbfdc2e3b5085b2263e8c8d14fed85507776b22aaa33d221df1e"} err="failed to get container status \"4961a5f8fd90cbfdc2e3b5085b2263e8c8d14fed85507776b22aaa33d221df1e\": rpc error: code = NotFound desc = could not find container \"4961a5f8fd90cbfdc2e3b5085b2263e8c8d14fed85507776b22aaa33d221df1e\": container with ID starting with 4961a5f8fd90cbfdc2e3b5085b2263e8c8d14fed85507776b22aaa33d221df1e not found: ID does not exist" Oct 09 14:24:50 crc kubenswrapper[4902]: I1009 14:24:50.934333 4902 scope.go:117] "RemoveContainer" containerID="43a72974a578d58305506f978cae98f759b405c8a6db8b4200f8dc73d66ff361" Oct 09 14:24:50 crc kubenswrapper[4902]: E1009 14:24:50.934714 4902 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"43a72974a578d58305506f978cae98f759b405c8a6db8b4200f8dc73d66ff361\": container with ID starting with 43a72974a578d58305506f978cae98f759b405c8a6db8b4200f8dc73d66ff361 not found: ID does not exist" containerID="43a72974a578d58305506f978cae98f759b405c8a6db8b4200f8dc73d66ff361" Oct 09 14:24:50 crc kubenswrapper[4902]: I1009 14:24:50.934750 4902 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43a72974a578d58305506f978cae98f759b405c8a6db8b4200f8dc73d66ff361"} err="failed to get container status \"43a72974a578d58305506f978cae98f759b405c8a6db8b4200f8dc73d66ff361\": rpc error: code = NotFound desc = could not find container \"43a72974a578d58305506f978cae98f759b405c8a6db8b4200f8dc73d66ff361\": container with ID starting with 43a72974a578d58305506f978cae98f759b405c8a6db8b4200f8dc73d66ff361 not found: ID does not exist" Oct 09 14:24:50 crc kubenswrapper[4902]: I1009 14:24:50.934773 4902 scope.go:117] "RemoveContainer" containerID="892cba0ed0138b517a02b4a62619329cb245ae321459c7d6bd6d696b6f67e220" Oct 09 14:24:50 crc kubenswrapper[4902]: E1009 14:24:50.935001 4902 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"892cba0ed0138b517a02b4a62619329cb245ae321459c7d6bd6d696b6f67e220\": container with ID starting with 892cba0ed0138b517a02b4a62619329cb245ae321459c7d6bd6d696b6f67e220 not found: ID does not exist" containerID="892cba0ed0138b517a02b4a62619329cb245ae321459c7d6bd6d696b6f67e220" Oct 09 14:24:50 crc kubenswrapper[4902]: I1009 14:24:50.935028 4902 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"892cba0ed0138b517a02b4a62619329cb245ae321459c7d6bd6d696b6f67e220"} err="failed to get container status \"892cba0ed0138b517a02b4a62619329cb245ae321459c7d6bd6d696b6f67e220\": rpc error: code = NotFound desc = could not find container \"892cba0ed0138b517a02b4a62619329cb245ae321459c7d6bd6d696b6f67e220\": container with ID starting with 892cba0ed0138b517a02b4a62619329cb245ae321459c7d6bd6d696b6f67e220 not found: ID does not exist" Oct 09 14:24:51 crc kubenswrapper[4902]: I1009 14:24:51.524100 4902 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="babe331a-433c-4ef7-bc32-dfec9f1f48c0" path="/var/lib/kubelet/pods/babe331a-433c-4ef7-bc32-dfec9f1f48c0/volumes" Oct 09 14:24:54 crc kubenswrapper[4902]: I1009 14:24:54.998467 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-h6q7f" Oct 09 14:24:54 crc kubenswrapper[4902]: I1009 14:24:54.999211 4902 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-h6q7f" Oct 09 14:24:56 crc kubenswrapper[4902]: I1009 14:24:56.051380 4902 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-h6q7f" podUID="6f1e0561-a370-447d-b280-82e539671958" containerName="registry-server" probeResult="failure" output=< Oct 09 14:24:56 crc kubenswrapper[4902]: timeout: failed to connect service ":50051" within 1s Oct 09 14:24:56 crc kubenswrapper[4902]: > Oct 09 14:25:05 crc kubenswrapper[4902]: I1009 14:25:05.047125 4902 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-h6q7f" Oct 09 14:25:05 crc kubenswrapper[4902]: I1009 14:25:05.099645 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-h6q7f" Oct 09 14:25:05 crc kubenswrapper[4902]: I1009 14:25:05.280330 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-h6q7f"] Oct 09 14:25:06 crc kubenswrapper[4902]: I1009 14:25:06.975028 4902 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-h6q7f" 
podUID="6f1e0561-a370-447d-b280-82e539671958" containerName="registry-server" containerID="cri-o://ee325763dddc72814b537f42ee9bfd0992758a2a5d3a430b965b1658ba76291a" gracePeriod=2 Oct 09 14:25:07 crc kubenswrapper[4902]: I1009 14:25:07.442024 4902 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-h6q7f" Oct 09 14:25:07 crc kubenswrapper[4902]: I1009 14:25:07.634307 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f1e0561-a370-447d-b280-82e539671958-catalog-content\") pod \"6f1e0561-a370-447d-b280-82e539671958\" (UID: \"6f1e0561-a370-447d-b280-82e539671958\") " Oct 09 14:25:07 crc kubenswrapper[4902]: I1009 14:25:07.634492 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f1e0561-a370-447d-b280-82e539671958-utilities\") pod \"6f1e0561-a370-447d-b280-82e539671958\" (UID: \"6f1e0561-a370-447d-b280-82e539671958\") " Oct 09 14:25:07 crc kubenswrapper[4902]: I1009 14:25:07.634875 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9stms\" (UniqueName: \"kubernetes.io/projected/6f1e0561-a370-447d-b280-82e539671958-kube-api-access-9stms\") pod \"6f1e0561-a370-447d-b280-82e539671958\" (UID: \"6f1e0561-a370-447d-b280-82e539671958\") " Oct 09 14:25:07 crc kubenswrapper[4902]: I1009 14:25:07.635380 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6f1e0561-a370-447d-b280-82e539671958-utilities" (OuterVolumeSpecName: "utilities") pod "6f1e0561-a370-447d-b280-82e539671958" (UID: "6f1e0561-a370-447d-b280-82e539671958"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 14:25:07 crc kubenswrapper[4902]: I1009 14:25:07.644684 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f1e0561-a370-447d-b280-82e539671958-kube-api-access-9stms" (OuterVolumeSpecName: "kube-api-access-9stms") pod "6f1e0561-a370-447d-b280-82e539671958" (UID: "6f1e0561-a370-447d-b280-82e539671958"). InnerVolumeSpecName "kube-api-access-9stms". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 14:25:07 crc kubenswrapper[4902]: I1009 14:25:07.726427 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6f1e0561-a370-447d-b280-82e539671958-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6f1e0561-a370-447d-b280-82e539671958" (UID: "6f1e0561-a370-447d-b280-82e539671958"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 14:25:07 crc kubenswrapper[4902]: I1009 14:25:07.737741 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9stms\" (UniqueName: \"kubernetes.io/projected/6f1e0561-a370-447d-b280-82e539671958-kube-api-access-9stms\") on node \"crc\" DevicePath \"\"" Oct 09 14:25:07 crc kubenswrapper[4902]: I1009 14:25:07.737782 4902 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f1e0561-a370-447d-b280-82e539671958-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 09 14:25:07 crc kubenswrapper[4902]: I1009 14:25:07.737792 4902 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f1e0561-a370-447d-b280-82e539671958-utilities\") on node \"crc\" DevicePath \"\"" Oct 09 14:25:08 crc kubenswrapper[4902]: I1009 14:25:08.012127 4902 generic.go:334] "Generic (PLEG): container finished" podID="6f1e0561-a370-447d-b280-82e539671958" containerID="ee325763dddc72814b537f42ee9bfd0992758a2a5d3a430b965b1658ba76291a" exitCode=0 Oct 09 14:25:08 crc kubenswrapper[4902]: I1009 14:25:08.012200 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h6q7f" event={"ID":"6f1e0561-a370-447d-b280-82e539671958","Type":"ContainerDied","Data":"ee325763dddc72814b537f42ee9bfd0992758a2a5d3a430b965b1658ba76291a"} Oct 09 14:25:08 crc kubenswrapper[4902]: I1009 14:25:08.012243 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h6q7f" event={"ID":"6f1e0561-a370-447d-b280-82e539671958","Type":"ContainerDied","Data":"2bfe976e6d1cd804f2a6646c5a210ec679eeafe01dcb7c9d45f0d2f115e7c0bc"} Oct 09 14:25:08 crc kubenswrapper[4902]: I1009 14:25:08.012252 4902 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-h6q7f" Oct 09 14:25:08 crc kubenswrapper[4902]: I1009 14:25:08.012273 4902 scope.go:117] "RemoveContainer" containerID="ee325763dddc72814b537f42ee9bfd0992758a2a5d3a430b965b1658ba76291a" Oct 09 14:25:08 crc kubenswrapper[4902]: I1009 14:25:08.054737 4902 scope.go:117] "RemoveContainer" containerID="00a5c35934fab5d2da572f516047eb97af190d1c80b0e3ef4377801a581665f6" Oct 09 14:25:08 crc kubenswrapper[4902]: I1009 14:25:08.055994 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-h6q7f"] Oct 09 14:25:08 crc kubenswrapper[4902]: I1009 14:25:08.063376 4902 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-h6q7f"] Oct 09 14:25:08 crc kubenswrapper[4902]: I1009 14:25:08.080437 4902 scope.go:117] "RemoveContainer" containerID="b102564249dc29701512638327cc0187f8f6028132382ecc9e1ffc8ef4cb764d" Oct 09 14:25:08 crc kubenswrapper[4902]: I1009 14:25:08.121254 4902 scope.go:117] "RemoveContainer" containerID="ee325763dddc72814b537f42ee9bfd0992758a2a5d3a430b965b1658ba76291a" Oct 09 14:25:08 crc kubenswrapper[4902]: E1009 14:25:08.121683 4902 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ee325763dddc72814b537f42ee9bfd0992758a2a5d3a430b965b1658ba76291a\": container with ID starting with ee325763dddc72814b537f42ee9bfd0992758a2a5d3a430b965b1658ba76291a not found: ID does not exist" containerID="ee325763dddc72814b537f42ee9bfd0992758a2a5d3a430b965b1658ba76291a" Oct 09 14:25:08 crc kubenswrapper[4902]: I1009 14:25:08.121726 4902 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee325763dddc72814b537f42ee9bfd0992758a2a5d3a430b965b1658ba76291a"} err="failed to get container status \"ee325763dddc72814b537f42ee9bfd0992758a2a5d3a430b965b1658ba76291a\": rpc error: code = NotFound desc = could not find container \"ee325763dddc72814b537f42ee9bfd0992758a2a5d3a430b965b1658ba76291a\": container with ID starting with ee325763dddc72814b537f42ee9bfd0992758a2a5d3a430b965b1658ba76291a not found: ID does not exist" Oct 09 14:25:08 crc kubenswrapper[4902]: I1009 14:25:08.121748 4902 scope.go:117] "RemoveContainer" containerID="00a5c35934fab5d2da572f516047eb97af190d1c80b0e3ef4377801a581665f6" Oct 09 14:25:08 crc kubenswrapper[4902]: E1009 14:25:08.121982 4902 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"00a5c35934fab5d2da572f516047eb97af190d1c80b0e3ef4377801a581665f6\": container with ID starting with 00a5c35934fab5d2da572f516047eb97af190d1c80b0e3ef4377801a581665f6 not found: ID does not exist" containerID="00a5c35934fab5d2da572f516047eb97af190d1c80b0e3ef4377801a581665f6" Oct 09 14:25:08 crc kubenswrapper[4902]: I1009 14:25:08.122021 4902 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"00a5c35934fab5d2da572f516047eb97af190d1c80b0e3ef4377801a581665f6"} err="failed to get container status \"00a5c35934fab5d2da572f516047eb97af190d1c80b0e3ef4377801a581665f6\": rpc error: code = NotFound desc = could not find container \"00a5c35934fab5d2da572f516047eb97af190d1c80b0e3ef4377801a581665f6\": container with ID starting with 00a5c35934fab5d2da572f516047eb97af190d1c80b0e3ef4377801a581665f6 not found: ID does not exist" Oct 09 14:25:08 crc kubenswrapper[4902]: I1009 14:25:08.122038 4902 scope.go:117] "RemoveContainer" 
containerID="b102564249dc29701512638327cc0187f8f6028132382ecc9e1ffc8ef4cb764d" Oct 09 14:25:08 crc kubenswrapper[4902]: E1009 14:25:08.122321 4902 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b102564249dc29701512638327cc0187f8f6028132382ecc9e1ffc8ef4cb764d\": container with ID starting with b102564249dc29701512638327cc0187f8f6028132382ecc9e1ffc8ef4cb764d not found: ID does not exist" containerID="b102564249dc29701512638327cc0187f8f6028132382ecc9e1ffc8ef4cb764d" Oct 09 14:25:08 crc kubenswrapper[4902]: I1009 14:25:08.122349 4902 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b102564249dc29701512638327cc0187f8f6028132382ecc9e1ffc8ef4cb764d"} err="failed to get container status \"b102564249dc29701512638327cc0187f8f6028132382ecc9e1ffc8ef4cb764d\": rpc error: code = NotFound desc = could not find container \"b102564249dc29701512638327cc0187f8f6028132382ecc9e1ffc8ef4cb764d\": container with ID starting with b102564249dc29701512638327cc0187f8f6028132382ecc9e1ffc8ef4cb764d not found: ID does not exist" Oct 09 14:25:09 crc kubenswrapper[4902]: I1009 14:25:09.522582 4902 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f1e0561-a370-447d-b280-82e539671958" path="/var/lib/kubelet/pods/6f1e0561-a370-447d-b280-82e539671958/volumes" Oct 09 14:25:10 crc kubenswrapper[4902]: I1009 14:25:10.032927 4902 generic.go:334] "Generic (PLEG): container finished" podID="1c9a61d2-e7f4-4e22-8b3b-18263b72df09" containerID="205dbf10db09a0e7bbf7ee9768e53da46779af066e564411af4e9a395f9424ce" exitCode=0 Oct 09 14:25:10 crc kubenswrapper[4902]: I1009 14:25:10.032981 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zb49n" event={"ID":"1c9a61d2-e7f4-4e22-8b3b-18263b72df09","Type":"ContainerDied","Data":"205dbf10db09a0e7bbf7ee9768e53da46779af066e564411af4e9a395f9424ce"} Oct 09 14:25:11 crc kubenswrapper[4902]: I1009 14:25:11.434992 4902 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zb49n" Oct 09 14:25:11 crc kubenswrapper[4902]: I1009 14:25:11.608193 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/1c9a61d2-e7f4-4e22-8b3b-18263b72df09-neutron-ovn-metadata-agent-neutron-config-0\") pod \"1c9a61d2-e7f4-4e22-8b3b-18263b72df09\" (UID: \"1c9a61d2-e7f4-4e22-8b3b-18263b72df09\") " Oct 09 14:25:11 crc kubenswrapper[4902]: I1009 14:25:11.608266 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4f7q7\" (UniqueName: \"kubernetes.io/projected/1c9a61d2-e7f4-4e22-8b3b-18263b72df09-kube-api-access-4f7q7\") pod \"1c9a61d2-e7f4-4e22-8b3b-18263b72df09\" (UID: \"1c9a61d2-e7f4-4e22-8b3b-18263b72df09\") " Oct 09 14:25:11 crc kubenswrapper[4902]: I1009 14:25:11.608295 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1c9a61d2-e7f4-4e22-8b3b-18263b72df09-ssh-key\") pod \"1c9a61d2-e7f4-4e22-8b3b-18263b72df09\" (UID: \"1c9a61d2-e7f4-4e22-8b3b-18263b72df09\") " Oct 09 14:25:11 crc kubenswrapper[4902]: I1009 14:25:11.608459 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/1c9a61d2-e7f4-4e22-8b3b-18263b72df09-nova-metadata-neutron-config-0\") pod \"1c9a61d2-e7f4-4e22-8b3b-18263b72df09\" (UID: \"1c9a61d2-e7f4-4e22-8b3b-18263b72df09\") " Oct 09 14:25:11 crc kubenswrapper[4902]: I1009 14:25:11.608491 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c9a61d2-e7f4-4e22-8b3b-18263b72df09-neutron-metadata-combined-ca-bundle\") pod \"1c9a61d2-e7f4-4e22-8b3b-18263b72df09\" (UID: \"1c9a61d2-e7f4-4e22-8b3b-18263b72df09\") " Oct 09 14:25:11 crc kubenswrapper[4902]: I1009 14:25:11.608577 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1c9a61d2-e7f4-4e22-8b3b-18263b72df09-inventory\") pod \"1c9a61d2-e7f4-4e22-8b3b-18263b72df09\" (UID: \"1c9a61d2-e7f4-4e22-8b3b-18263b72df09\") " Oct 09 14:25:11 crc kubenswrapper[4902]: I1009 14:25:11.614961 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c9a61d2-e7f4-4e22-8b3b-18263b72df09-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "1c9a61d2-e7f4-4e22-8b3b-18263b72df09" (UID: "1c9a61d2-e7f4-4e22-8b3b-18263b72df09"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:25:11 crc kubenswrapper[4902]: I1009 14:25:11.615333 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c9a61d2-e7f4-4e22-8b3b-18263b72df09-kube-api-access-4f7q7" (OuterVolumeSpecName: "kube-api-access-4f7q7") pod "1c9a61d2-e7f4-4e22-8b3b-18263b72df09" (UID: "1c9a61d2-e7f4-4e22-8b3b-18263b72df09"). InnerVolumeSpecName "kube-api-access-4f7q7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 14:25:11 crc kubenswrapper[4902]: I1009 14:25:11.637124 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c9a61d2-e7f4-4e22-8b3b-18263b72df09-inventory" (OuterVolumeSpecName: "inventory") pod "1c9a61d2-e7f4-4e22-8b3b-18263b72df09" (UID: "1c9a61d2-e7f4-4e22-8b3b-18263b72df09"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:25:11 crc kubenswrapper[4902]: I1009 14:25:11.637622 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c9a61d2-e7f4-4e22-8b3b-18263b72df09-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "1c9a61d2-e7f4-4e22-8b3b-18263b72df09" (UID: "1c9a61d2-e7f4-4e22-8b3b-18263b72df09"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:25:11 crc kubenswrapper[4902]: I1009 14:25:11.643642 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c9a61d2-e7f4-4e22-8b3b-18263b72df09-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "1c9a61d2-e7f4-4e22-8b3b-18263b72df09" (UID: "1c9a61d2-e7f4-4e22-8b3b-18263b72df09"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:25:11 crc kubenswrapper[4902]: I1009 14:25:11.649249 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c9a61d2-e7f4-4e22-8b3b-18263b72df09-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "1c9a61d2-e7f4-4e22-8b3b-18263b72df09" (UID: "1c9a61d2-e7f4-4e22-8b3b-18263b72df09"). InnerVolumeSpecName "nova-metadata-neutron-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:25:11 crc kubenswrapper[4902]: I1009 14:25:11.711095 4902 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/1c9a61d2-e7f4-4e22-8b3b-18263b72df09-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Oct 09 14:25:11 crc kubenswrapper[4902]: I1009 14:25:11.711129 4902 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c9a61d2-e7f4-4e22-8b3b-18263b72df09-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 14:25:11 crc kubenswrapper[4902]: I1009 14:25:11.711143 4902 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1c9a61d2-e7f4-4e22-8b3b-18263b72df09-inventory\") on node \"crc\" DevicePath \"\"" Oct 09 14:25:11 crc kubenswrapper[4902]: I1009 14:25:11.711156 4902 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/1c9a61d2-e7f4-4e22-8b3b-18263b72df09-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Oct 09 14:25:11 crc kubenswrapper[4902]: I1009 14:25:11.711169 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4f7q7\" (UniqueName: \"kubernetes.io/projected/1c9a61d2-e7f4-4e22-8b3b-18263b72df09-kube-api-access-4f7q7\") on node \"crc\" DevicePath \"\"" Oct 09 14:25:11 crc kubenswrapper[4902]: I1009 14:25:11.711180 4902 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1c9a61d2-e7f4-4e22-8b3b-18263b72df09-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 09 14:25:12 crc kubenswrapper[4902]: I1009 14:25:12.051271 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zb49n" event={"ID":"1c9a61d2-e7f4-4e22-8b3b-18263b72df09","Type":"ContainerDied","Data":"6ab4bd55dcef1293941f740b4eb1c265db274f45c20aa9efbec1a63260fa1d59"} Oct 09 14:25:12 crc kubenswrapper[4902]: I1009 14:25:12.051320 4902 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6ab4bd55dcef1293941f740b4eb1c265db274f45c20aa9efbec1a63260fa1d59" Oct 09 14:25:12 crc kubenswrapper[4902]: I1009 14:25:12.051358 4902 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zb49n" Oct 09 14:25:12 crc kubenswrapper[4902]: I1009 14:25:12.163529 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-tpxhw"] Oct 09 14:25:12 crc kubenswrapper[4902]: E1009 14:25:12.164141 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f1e0561-a370-447d-b280-82e539671958" containerName="extract-content" Oct 09 14:25:12 crc kubenswrapper[4902]: I1009 14:25:12.164208 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f1e0561-a370-447d-b280-82e539671958" containerName="extract-content" Oct 09 14:25:12 crc kubenswrapper[4902]: E1009 14:25:12.164272 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="babe331a-433c-4ef7-bc32-dfec9f1f48c0" containerName="registry-server" Oct 09 14:25:12 crc kubenswrapper[4902]: I1009 14:25:12.164390 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="babe331a-433c-4ef7-bc32-dfec9f1f48c0" containerName="registry-server" Oct 09 14:25:12 crc kubenswrapper[4902]: E1009 14:25:12.164500 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="babe331a-433c-4ef7-bc32-dfec9f1f48c0" containerName="extract-content" Oct 09 14:25:12 crc kubenswrapper[4902]: I1009 14:25:12.164566 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="babe331a-433c-4ef7-bc32-dfec9f1f48c0" containerName="extract-content" Oct 09 14:25:12 crc kubenswrapper[4902]: E1009 14:25:12.164625 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f1e0561-a370-447d-b280-82e539671958" containerName="extract-utilities" Oct 09 14:25:12 crc kubenswrapper[4902]: I1009 14:25:12.164675 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f1e0561-a370-447d-b280-82e539671958" containerName="extract-utilities" Oct 09 14:25:12 crc kubenswrapper[4902]: E1009 14:25:12.164730 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f1e0561-a370-447d-b280-82e539671958" containerName="registry-server" Oct 09 14:25:12 crc kubenswrapper[4902]: I1009 14:25:12.164778 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f1e0561-a370-447d-b280-82e539671958" containerName="registry-server" Oct 09 14:25:12 crc kubenswrapper[4902]: E1009 14:25:12.164853 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="babe331a-433c-4ef7-bc32-dfec9f1f48c0" containerName="extract-utilities" Oct 09 14:25:12 crc kubenswrapper[4902]: I1009 14:25:12.164906 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="babe331a-433c-4ef7-bc32-dfec9f1f48c0" containerName="extract-utilities" Oct 09 14:25:12 crc kubenswrapper[4902]: E1009 14:25:12.164966 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c9a61d2-e7f4-4e22-8b3b-18263b72df09" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Oct 09 14:25:12 crc kubenswrapper[4902]: I1009 14:25:12.165019 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c9a61d2-e7f4-4e22-8b3b-18263b72df09" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Oct 09 14:25:12 crc kubenswrapper[4902]: I1009 14:25:12.165246 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="babe331a-433c-4ef7-bc32-dfec9f1f48c0" containerName="registry-server" Oct 09 14:25:12 crc kubenswrapper[4902]: I1009 14:25:12.165322 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f1e0561-a370-447d-b280-82e539671958" containerName="registry-server" Oct 
09 14:25:12 crc kubenswrapper[4902]: I1009 14:25:12.165386 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c9a61d2-e7f4-4e22-8b3b-18263b72df09" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Oct 09 14:25:12 crc kubenswrapper[4902]: I1009 14:25:12.166068 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-tpxhw" Oct 09 14:25:12 crc kubenswrapper[4902]: I1009 14:25:12.168418 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 09 14:25:12 crc kubenswrapper[4902]: I1009 14:25:12.168846 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 09 14:25:12 crc kubenswrapper[4902]: I1009 14:25:12.168878 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 09 14:25:12 crc kubenswrapper[4902]: I1009 14:25:12.169019 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Oct 09 14:25:12 crc kubenswrapper[4902]: I1009 14:25:12.169020 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-kjff2" Oct 09 14:25:12 crc kubenswrapper[4902]: I1009 14:25:12.173550 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-tpxhw"] Oct 09 14:25:12 crc kubenswrapper[4902]: I1009 14:25:12.323389 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/99af5091-b31f-45c0-abcf-882b0159219f-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-tpxhw\" (UID: \"99af5091-b31f-45c0-abcf-882b0159219f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-tpxhw" Oct 09 14:25:12 crc kubenswrapper[4902]: I1009 14:25:12.323477 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/99af5091-b31f-45c0-abcf-882b0159219f-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-tpxhw\" (UID: \"99af5091-b31f-45c0-abcf-882b0159219f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-tpxhw" Oct 09 14:25:12 crc kubenswrapper[4902]: I1009 14:25:12.323508 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vbj7h\" (UniqueName: \"kubernetes.io/projected/99af5091-b31f-45c0-abcf-882b0159219f-kube-api-access-vbj7h\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-tpxhw\" (UID: \"99af5091-b31f-45c0-abcf-882b0159219f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-tpxhw" Oct 09 14:25:12 crc kubenswrapper[4902]: I1009 14:25:12.323748 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99af5091-b31f-45c0-abcf-882b0159219f-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-tpxhw\" (UID: \"99af5091-b31f-45c0-abcf-882b0159219f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-tpxhw" Oct 09 14:25:12 crc kubenswrapper[4902]: I1009 14:25:12.323875 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: 
\"kubernetes.io/secret/99af5091-b31f-45c0-abcf-882b0159219f-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-tpxhw\" (UID: \"99af5091-b31f-45c0-abcf-882b0159219f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-tpxhw" Oct 09 14:25:12 crc kubenswrapper[4902]: I1009 14:25:12.426688 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99af5091-b31f-45c0-abcf-882b0159219f-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-tpxhw\" (UID: \"99af5091-b31f-45c0-abcf-882b0159219f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-tpxhw" Oct 09 14:25:12 crc kubenswrapper[4902]: I1009 14:25:12.426760 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/99af5091-b31f-45c0-abcf-882b0159219f-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-tpxhw\" (UID: \"99af5091-b31f-45c0-abcf-882b0159219f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-tpxhw" Oct 09 14:25:12 crc kubenswrapper[4902]: I1009 14:25:12.426883 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/99af5091-b31f-45c0-abcf-882b0159219f-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-tpxhw\" (UID: \"99af5091-b31f-45c0-abcf-882b0159219f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-tpxhw" Oct 09 14:25:12 crc kubenswrapper[4902]: I1009 14:25:12.426945 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/99af5091-b31f-45c0-abcf-882b0159219f-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-tpxhw\" (UID: \"99af5091-b31f-45c0-abcf-882b0159219f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-tpxhw" Oct 09 14:25:12 crc kubenswrapper[4902]: I1009 14:25:12.426972 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vbj7h\" (UniqueName: \"kubernetes.io/projected/99af5091-b31f-45c0-abcf-882b0159219f-kube-api-access-vbj7h\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-tpxhw\" (UID: \"99af5091-b31f-45c0-abcf-882b0159219f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-tpxhw" Oct 09 14:25:12 crc kubenswrapper[4902]: I1009 14:25:12.432594 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/99af5091-b31f-45c0-abcf-882b0159219f-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-tpxhw\" (UID: \"99af5091-b31f-45c0-abcf-882b0159219f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-tpxhw" Oct 09 14:25:12 crc kubenswrapper[4902]: I1009 14:25:12.432836 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99af5091-b31f-45c0-abcf-882b0159219f-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-tpxhw\" (UID: \"99af5091-b31f-45c0-abcf-882b0159219f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-tpxhw" Oct 09 14:25:12 crc kubenswrapper[4902]: I1009 14:25:12.434601 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/99af5091-b31f-45c0-abcf-882b0159219f-ssh-key\") pod 
\"libvirt-edpm-deployment-openstack-edpm-ipam-tpxhw\" (UID: \"99af5091-b31f-45c0-abcf-882b0159219f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-tpxhw" Oct 09 14:25:12 crc kubenswrapper[4902]: I1009 14:25:12.438110 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/99af5091-b31f-45c0-abcf-882b0159219f-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-tpxhw\" (UID: \"99af5091-b31f-45c0-abcf-882b0159219f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-tpxhw" Oct 09 14:25:12 crc kubenswrapper[4902]: I1009 14:25:12.447037 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vbj7h\" (UniqueName: \"kubernetes.io/projected/99af5091-b31f-45c0-abcf-882b0159219f-kube-api-access-vbj7h\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-tpxhw\" (UID: \"99af5091-b31f-45c0-abcf-882b0159219f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-tpxhw" Oct 09 14:25:12 crc kubenswrapper[4902]: I1009 14:25:12.492815 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-tpxhw" Oct 09 14:25:12 crc kubenswrapper[4902]: I1009 14:25:12.881351 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-tpxhw"] Oct 09 14:25:13 crc kubenswrapper[4902]: I1009 14:25:13.060775 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-tpxhw" event={"ID":"99af5091-b31f-45c0-abcf-882b0159219f","Type":"ContainerStarted","Data":"cbf74faaad4bd5e8df9912ff6741f3730fe24ac87c98f75f1227fb7d74986218"} Oct 09 14:25:14 crc kubenswrapper[4902]: I1009 14:25:14.076375 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-tpxhw" event={"ID":"99af5091-b31f-45c0-abcf-882b0159219f","Type":"ContainerStarted","Data":"f978a0e9dff36201c3cfd5268941d51f918106db16ec6eed7c74d53918aa1665"} Oct 09 14:25:14 crc kubenswrapper[4902]: I1009 14:25:14.105860 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-tpxhw" podStartSLOduration=1.709505168 podStartE2EDuration="2.105814119s" podCreationTimestamp="2025-10-09 14:25:12 +0000 UTC" firstStartedPulling="2025-10-09 14:25:12.888571452 +0000 UTC m=+2060.086430516" lastFinishedPulling="2025-10-09 14:25:13.284880403 +0000 UTC m=+2060.482739467" observedRunningTime="2025-10-09 14:25:14.095662174 +0000 UTC m=+2061.293521258" watchObservedRunningTime="2025-10-09 14:25:14.105814119 +0000 UTC m=+2061.303673193" Oct 09 14:26:50 crc kubenswrapper[4902]: I1009 14:26:50.077867 4902 patch_prober.go:28] interesting pod/machine-config-daemon-gbt7s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 14:26:50 crc kubenswrapper[4902]: I1009 14:26:50.078427 4902 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" podUID="6cfbac91-e798-4e5e-9f3c-f454ea6f457e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 14:27:20 crc kubenswrapper[4902]: I1009 14:27:20.078148 4902 
patch_prober.go:28] interesting pod/machine-config-daemon-gbt7s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 14:27:20 crc kubenswrapper[4902]: I1009 14:27:20.078795 4902 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" podUID="6cfbac91-e798-4e5e-9f3c-f454ea6f457e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 14:27:50 crc kubenswrapper[4902]: I1009 14:27:50.077935 4902 patch_prober.go:28] interesting pod/machine-config-daemon-gbt7s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 14:27:50 crc kubenswrapper[4902]: I1009 14:27:50.080125 4902 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" podUID="6cfbac91-e798-4e5e-9f3c-f454ea6f457e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 14:27:50 crc kubenswrapper[4902]: I1009 14:27:50.080263 4902 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" Oct 09 14:27:50 crc kubenswrapper[4902]: I1009 14:27:50.081202 4902 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a801328970550531ee1b5bd6e247b502db3c5fb79b1bd36128afb401a712b319"} pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 09 14:27:50 crc kubenswrapper[4902]: I1009 14:27:50.081403 4902 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" podUID="6cfbac91-e798-4e5e-9f3c-f454ea6f457e" containerName="machine-config-daemon" containerID="cri-o://a801328970550531ee1b5bd6e247b502db3c5fb79b1bd36128afb401a712b319" gracePeriod=600 Oct 09 14:27:50 crc kubenswrapper[4902]: E1009 14:27:50.204578 4902 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gbt7s_openshift-machine-config-operator(6cfbac91-e798-4e5e-9f3c-f454ea6f457e)\"" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" podUID="6cfbac91-e798-4e5e-9f3c-f454ea6f457e" Oct 09 14:27:50 crc kubenswrapper[4902]: I1009 14:27:50.389537 4902 generic.go:334] "Generic (PLEG): container finished" podID="6cfbac91-e798-4e5e-9f3c-f454ea6f457e" containerID="a801328970550531ee1b5bd6e247b502db3c5fb79b1bd36128afb401a712b319" exitCode=0 Oct 09 14:27:50 crc kubenswrapper[4902]: I1009 14:27:50.389620 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" event={"ID":"6cfbac91-e798-4e5e-9f3c-f454ea6f457e","Type":"ContainerDied","Data":"a801328970550531ee1b5bd6e247b502db3c5fb79b1bd36128afb401a712b319"} Oct 09 
14:27:50 crc kubenswrapper[4902]: I1009 14:27:50.389660 4902 scope.go:117] "RemoveContainer" containerID="52d1d3ae5d5e216daf1363f27c9b7492293258b48764c1d6ab264dab27f82e7e" Oct 09 14:27:50 crc kubenswrapper[4902]: I1009 14:27:50.390477 4902 scope.go:117] "RemoveContainer" containerID="a801328970550531ee1b5bd6e247b502db3c5fb79b1bd36128afb401a712b319" Oct 09 14:27:50 crc kubenswrapper[4902]: E1009 14:27:50.392719 4902 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gbt7s_openshift-machine-config-operator(6cfbac91-e798-4e5e-9f3c-f454ea6f457e)\"" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" podUID="6cfbac91-e798-4e5e-9f3c-f454ea6f457e" Oct 09 14:28:04 crc kubenswrapper[4902]: I1009 14:28:04.513148 4902 scope.go:117] "RemoveContainer" containerID="a801328970550531ee1b5bd6e247b502db3c5fb79b1bd36128afb401a712b319" Oct 09 14:28:04 crc kubenswrapper[4902]: E1009 14:28:04.515288 4902 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gbt7s_openshift-machine-config-operator(6cfbac91-e798-4e5e-9f3c-f454ea6f457e)\"" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" podUID="6cfbac91-e798-4e5e-9f3c-f454ea6f457e" Oct 09 14:28:16 crc kubenswrapper[4902]: I1009 14:28:16.513740 4902 scope.go:117] "RemoveContainer" containerID="a801328970550531ee1b5bd6e247b502db3c5fb79b1bd36128afb401a712b319" Oct 09 14:28:16 crc kubenswrapper[4902]: E1009 14:28:16.514488 4902 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gbt7s_openshift-machine-config-operator(6cfbac91-e798-4e5e-9f3c-f454ea6f457e)\"" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" podUID="6cfbac91-e798-4e5e-9f3c-f454ea6f457e" Oct 09 14:28:31 crc kubenswrapper[4902]: I1009 14:28:31.512720 4902 scope.go:117] "RemoveContainer" containerID="a801328970550531ee1b5bd6e247b502db3c5fb79b1bd36128afb401a712b319" Oct 09 14:28:31 crc kubenswrapper[4902]: E1009 14:28:31.513440 4902 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gbt7s_openshift-machine-config-operator(6cfbac91-e798-4e5e-9f3c-f454ea6f457e)\"" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" podUID="6cfbac91-e798-4e5e-9f3c-f454ea6f457e" Oct 09 14:28:44 crc kubenswrapper[4902]: I1009 14:28:44.513383 4902 scope.go:117] "RemoveContainer" containerID="a801328970550531ee1b5bd6e247b502db3c5fb79b1bd36128afb401a712b319" Oct 09 14:28:44 crc kubenswrapper[4902]: E1009 14:28:44.514182 4902 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gbt7s_openshift-machine-config-operator(6cfbac91-e798-4e5e-9f3c-f454ea6f457e)\"" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" podUID="6cfbac91-e798-4e5e-9f3c-f454ea6f457e" Oct 09 14:28:58 crc 
kubenswrapper[4902]: I1009 14:28:58.513279 4902 scope.go:117] "RemoveContainer" containerID="a801328970550531ee1b5bd6e247b502db3c5fb79b1bd36128afb401a712b319" Oct 09 14:28:58 crc kubenswrapper[4902]: E1009 14:28:58.514197 4902 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gbt7s_openshift-machine-config-operator(6cfbac91-e798-4e5e-9f3c-f454ea6f457e)\"" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" podUID="6cfbac91-e798-4e5e-9f3c-f454ea6f457e" Oct 09 14:29:09 crc kubenswrapper[4902]: I1009 14:29:09.514388 4902 scope.go:117] "RemoveContainer" containerID="a801328970550531ee1b5bd6e247b502db3c5fb79b1bd36128afb401a712b319" Oct 09 14:29:09 crc kubenswrapper[4902]: E1009 14:29:09.516120 4902 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gbt7s_openshift-machine-config-operator(6cfbac91-e798-4e5e-9f3c-f454ea6f457e)\"" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" podUID="6cfbac91-e798-4e5e-9f3c-f454ea6f457e" Oct 09 14:29:23 crc kubenswrapper[4902]: I1009 14:29:23.520485 4902 scope.go:117] "RemoveContainer" containerID="a801328970550531ee1b5bd6e247b502db3c5fb79b1bd36128afb401a712b319" Oct 09 14:29:23 crc kubenswrapper[4902]: E1009 14:29:23.521300 4902 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gbt7s_openshift-machine-config-operator(6cfbac91-e798-4e5e-9f3c-f454ea6f457e)\"" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" podUID="6cfbac91-e798-4e5e-9f3c-f454ea6f457e" Oct 09 14:29:31 crc kubenswrapper[4902]: I1009 14:29:31.329504 4902 generic.go:334] "Generic (PLEG): container finished" podID="99af5091-b31f-45c0-abcf-882b0159219f" containerID="f978a0e9dff36201c3cfd5268941d51f918106db16ec6eed7c74d53918aa1665" exitCode=0 Oct 09 14:29:31 crc kubenswrapper[4902]: I1009 14:29:31.329641 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-tpxhw" event={"ID":"99af5091-b31f-45c0-abcf-882b0159219f","Type":"ContainerDied","Data":"f978a0e9dff36201c3cfd5268941d51f918106db16ec6eed7c74d53918aa1665"} Oct 09 14:29:32 crc kubenswrapper[4902]: I1009 14:29:32.754752 4902 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-tpxhw" Oct 09 14:29:32 crc kubenswrapper[4902]: I1009 14:29:32.866627 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vbj7h\" (UniqueName: \"kubernetes.io/projected/99af5091-b31f-45c0-abcf-882b0159219f-kube-api-access-vbj7h\") pod \"99af5091-b31f-45c0-abcf-882b0159219f\" (UID: \"99af5091-b31f-45c0-abcf-882b0159219f\") " Oct 09 14:29:32 crc kubenswrapper[4902]: I1009 14:29:32.866708 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/99af5091-b31f-45c0-abcf-882b0159219f-libvirt-secret-0\") pod \"99af5091-b31f-45c0-abcf-882b0159219f\" (UID: \"99af5091-b31f-45c0-abcf-882b0159219f\") " Oct 09 14:29:32 crc kubenswrapper[4902]: I1009 14:29:32.866808 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/99af5091-b31f-45c0-abcf-882b0159219f-ssh-key\") pod \"99af5091-b31f-45c0-abcf-882b0159219f\" (UID: \"99af5091-b31f-45c0-abcf-882b0159219f\") " Oct 09 14:29:32 crc kubenswrapper[4902]: I1009 14:29:32.867092 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99af5091-b31f-45c0-abcf-882b0159219f-libvirt-combined-ca-bundle\") pod \"99af5091-b31f-45c0-abcf-882b0159219f\" (UID: \"99af5091-b31f-45c0-abcf-882b0159219f\") " Oct 09 14:29:32 crc kubenswrapper[4902]: I1009 14:29:32.867142 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/99af5091-b31f-45c0-abcf-882b0159219f-inventory\") pod \"99af5091-b31f-45c0-abcf-882b0159219f\" (UID: \"99af5091-b31f-45c0-abcf-882b0159219f\") " Oct 09 14:29:32 crc kubenswrapper[4902]: I1009 14:29:32.872361 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99af5091-b31f-45c0-abcf-882b0159219f-kube-api-access-vbj7h" (OuterVolumeSpecName: "kube-api-access-vbj7h") pod "99af5091-b31f-45c0-abcf-882b0159219f" (UID: "99af5091-b31f-45c0-abcf-882b0159219f"). InnerVolumeSpecName "kube-api-access-vbj7h". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 14:29:32 crc kubenswrapper[4902]: I1009 14:29:32.875995 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99af5091-b31f-45c0-abcf-882b0159219f-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "99af5091-b31f-45c0-abcf-882b0159219f" (UID: "99af5091-b31f-45c0-abcf-882b0159219f"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:29:32 crc kubenswrapper[4902]: I1009 14:29:32.895388 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99af5091-b31f-45c0-abcf-882b0159219f-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "99af5091-b31f-45c0-abcf-882b0159219f" (UID: "99af5091-b31f-45c0-abcf-882b0159219f"). InnerVolumeSpecName "libvirt-secret-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:29:32 crc kubenswrapper[4902]: I1009 14:29:32.896292 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99af5091-b31f-45c0-abcf-882b0159219f-inventory" (OuterVolumeSpecName: "inventory") pod "99af5091-b31f-45c0-abcf-882b0159219f" (UID: "99af5091-b31f-45c0-abcf-882b0159219f"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:29:32 crc kubenswrapper[4902]: I1009 14:29:32.896879 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99af5091-b31f-45c0-abcf-882b0159219f-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "99af5091-b31f-45c0-abcf-882b0159219f" (UID: "99af5091-b31f-45c0-abcf-882b0159219f"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:29:32 crc kubenswrapper[4902]: I1009 14:29:32.970177 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vbj7h\" (UniqueName: \"kubernetes.io/projected/99af5091-b31f-45c0-abcf-882b0159219f-kube-api-access-vbj7h\") on node \"crc\" DevicePath \"\"" Oct 09 14:29:32 crc kubenswrapper[4902]: I1009 14:29:32.970244 4902 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/99af5091-b31f-45c0-abcf-882b0159219f-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Oct 09 14:29:32 crc kubenswrapper[4902]: I1009 14:29:32.970257 4902 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/99af5091-b31f-45c0-abcf-882b0159219f-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 09 14:29:32 crc kubenswrapper[4902]: I1009 14:29:32.970291 4902 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99af5091-b31f-45c0-abcf-882b0159219f-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 14:29:32 crc kubenswrapper[4902]: I1009 14:29:32.970309 4902 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/99af5091-b31f-45c0-abcf-882b0159219f-inventory\") on node \"crc\" DevicePath \"\"" Oct 09 14:29:33 crc kubenswrapper[4902]: I1009 14:29:33.350867 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-tpxhw" event={"ID":"99af5091-b31f-45c0-abcf-882b0159219f","Type":"ContainerDied","Data":"cbf74faaad4bd5e8df9912ff6741f3730fe24ac87c98f75f1227fb7d74986218"} Oct 09 14:29:33 crc kubenswrapper[4902]: I1009 14:29:33.350907 4902 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cbf74faaad4bd5e8df9912ff6741f3730fe24ac87c98f75f1227fb7d74986218" Oct 09 14:29:33 crc kubenswrapper[4902]: I1009 14:29:33.350926 4902 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-tpxhw" Oct 09 14:29:33 crc kubenswrapper[4902]: I1009 14:29:33.447376 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-f9m9b"] Oct 09 14:29:33 crc kubenswrapper[4902]: E1009 14:29:33.448171 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99af5091-b31f-45c0-abcf-882b0159219f" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Oct 09 14:29:33 crc kubenswrapper[4902]: I1009 14:29:33.448196 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="99af5091-b31f-45c0-abcf-882b0159219f" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Oct 09 14:29:33 crc kubenswrapper[4902]: I1009 14:29:33.462854 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="99af5091-b31f-45c0-abcf-882b0159219f" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Oct 09 14:29:33 crc kubenswrapper[4902]: I1009 14:29:33.464075 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-f9m9b"] Oct 09 14:29:33 crc kubenswrapper[4902]: I1009 14:29:33.464217 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-f9m9b" Oct 09 14:29:33 crc kubenswrapper[4902]: I1009 14:29:33.467507 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 09 14:29:33 crc kubenswrapper[4902]: I1009 14:29:33.470841 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Oct 09 14:29:33 crc kubenswrapper[4902]: I1009 14:29:33.471033 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Oct 09 14:29:33 crc kubenswrapper[4902]: I1009 14:29:33.471185 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 09 14:29:33 crc kubenswrapper[4902]: I1009 14:29:33.471322 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Oct 09 14:29:33 crc kubenswrapper[4902]: I1009 14:29:33.471458 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 09 14:29:33 crc kubenswrapper[4902]: I1009 14:29:33.471787 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-kjff2" Oct 09 14:29:33 crc kubenswrapper[4902]: I1009 14:29:33.583531 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-79nww\" (UniqueName: \"kubernetes.io/projected/a5fc156b-09f2-4647-a2df-73877fb9db6f-kube-api-access-79nww\") pod \"nova-edpm-deployment-openstack-edpm-ipam-f9m9b\" (UID: \"a5fc156b-09f2-4647-a2df-73877fb9db6f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-f9m9b" Oct 09 14:29:33 crc kubenswrapper[4902]: I1009 14:29:33.583620 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/a5fc156b-09f2-4647-a2df-73877fb9db6f-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-f9m9b\" (UID: \"a5fc156b-09f2-4647-a2df-73877fb9db6f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-f9m9b" Oct 09 14:29:33 crc kubenswrapper[4902]: I1009 14:29:33.583648 4902 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/a5fc156b-09f2-4647-a2df-73877fb9db6f-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-f9m9b\" (UID: \"a5fc156b-09f2-4647-a2df-73877fb9db6f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-f9m9b" Oct 09 14:29:33 crc kubenswrapper[4902]: I1009 14:29:33.583698 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5fc156b-09f2-4647-a2df-73877fb9db6f-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-f9m9b\" (UID: \"a5fc156b-09f2-4647-a2df-73877fb9db6f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-f9m9b" Oct 09 14:29:33 crc kubenswrapper[4902]: I1009 14:29:33.583718 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/a5fc156b-09f2-4647-a2df-73877fb9db6f-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-f9m9b\" (UID: \"a5fc156b-09f2-4647-a2df-73877fb9db6f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-f9m9b" Oct 09 14:29:33 crc kubenswrapper[4902]: I1009 14:29:33.583737 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/a5fc156b-09f2-4647-a2df-73877fb9db6f-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-f9m9b\" (UID: \"a5fc156b-09f2-4647-a2df-73877fb9db6f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-f9m9b" Oct 09 14:29:33 crc kubenswrapper[4902]: I1009 14:29:33.583756 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/a5fc156b-09f2-4647-a2df-73877fb9db6f-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-f9m9b\" (UID: \"a5fc156b-09f2-4647-a2df-73877fb9db6f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-f9m9b" Oct 09 14:29:33 crc kubenswrapper[4902]: I1009 14:29:33.583788 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a5fc156b-09f2-4647-a2df-73877fb9db6f-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-f9m9b\" (UID: \"a5fc156b-09f2-4647-a2df-73877fb9db6f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-f9m9b" Oct 09 14:29:33 crc kubenswrapper[4902]: I1009 14:29:33.583824 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a5fc156b-09f2-4647-a2df-73877fb9db6f-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-f9m9b\" (UID: \"a5fc156b-09f2-4647-a2df-73877fb9db6f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-f9m9b" Oct 09 14:29:33 crc kubenswrapper[4902]: I1009 14:29:33.685431 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-79nww\" (UniqueName: \"kubernetes.io/projected/a5fc156b-09f2-4647-a2df-73877fb9db6f-kube-api-access-79nww\") pod \"nova-edpm-deployment-openstack-edpm-ipam-f9m9b\" (UID: \"a5fc156b-09f2-4647-a2df-73877fb9db6f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-f9m9b" Oct 09 
14:29:33 crc kubenswrapper[4902]: I1009 14:29:33.685791 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/a5fc156b-09f2-4647-a2df-73877fb9db6f-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-f9m9b\" (UID: \"a5fc156b-09f2-4647-a2df-73877fb9db6f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-f9m9b" Oct 09 14:29:33 crc kubenswrapper[4902]: I1009 14:29:33.685923 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/a5fc156b-09f2-4647-a2df-73877fb9db6f-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-f9m9b\" (UID: \"a5fc156b-09f2-4647-a2df-73877fb9db6f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-f9m9b" Oct 09 14:29:33 crc kubenswrapper[4902]: I1009 14:29:33.686106 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5fc156b-09f2-4647-a2df-73877fb9db6f-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-f9m9b\" (UID: \"a5fc156b-09f2-4647-a2df-73877fb9db6f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-f9m9b" Oct 09 14:29:33 crc kubenswrapper[4902]: I1009 14:29:33.686194 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/a5fc156b-09f2-4647-a2df-73877fb9db6f-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-f9m9b\" (UID: \"a5fc156b-09f2-4647-a2df-73877fb9db6f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-f9m9b" Oct 09 14:29:33 crc kubenswrapper[4902]: I1009 14:29:33.686277 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/a5fc156b-09f2-4647-a2df-73877fb9db6f-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-f9m9b\" (UID: \"a5fc156b-09f2-4647-a2df-73877fb9db6f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-f9m9b" Oct 09 14:29:33 crc kubenswrapper[4902]: I1009 14:29:33.686369 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/a5fc156b-09f2-4647-a2df-73877fb9db6f-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-f9m9b\" (UID: \"a5fc156b-09f2-4647-a2df-73877fb9db6f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-f9m9b" Oct 09 14:29:33 crc kubenswrapper[4902]: I1009 14:29:33.686478 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a5fc156b-09f2-4647-a2df-73877fb9db6f-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-f9m9b\" (UID: \"a5fc156b-09f2-4647-a2df-73877fb9db6f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-f9m9b" Oct 09 14:29:33 crc kubenswrapper[4902]: I1009 14:29:33.686730 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a5fc156b-09f2-4647-a2df-73877fb9db6f-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-f9m9b\" (UID: \"a5fc156b-09f2-4647-a2df-73877fb9db6f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-f9m9b" Oct 09 14:29:33 crc kubenswrapper[4902]: I1009 14:29:33.687392 4902 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/a5fc156b-09f2-4647-a2df-73877fb9db6f-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-f9m9b\" (UID: \"a5fc156b-09f2-4647-a2df-73877fb9db6f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-f9m9b" Oct 09 14:29:33 crc kubenswrapper[4902]: I1009 14:29:33.690025 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/a5fc156b-09f2-4647-a2df-73877fb9db6f-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-f9m9b\" (UID: \"a5fc156b-09f2-4647-a2df-73877fb9db6f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-f9m9b" Oct 09 14:29:33 crc kubenswrapper[4902]: I1009 14:29:33.690737 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/a5fc156b-09f2-4647-a2df-73877fb9db6f-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-f9m9b\" (UID: \"a5fc156b-09f2-4647-a2df-73877fb9db6f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-f9m9b" Oct 09 14:29:33 crc kubenswrapper[4902]: I1009 14:29:33.691521 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a5fc156b-09f2-4647-a2df-73877fb9db6f-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-f9m9b\" (UID: \"a5fc156b-09f2-4647-a2df-73877fb9db6f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-f9m9b" Oct 09 14:29:33 crc kubenswrapper[4902]: I1009 14:29:33.691629 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5fc156b-09f2-4647-a2df-73877fb9db6f-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-f9m9b\" (UID: \"a5fc156b-09f2-4647-a2df-73877fb9db6f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-f9m9b" Oct 09 14:29:33 crc kubenswrapper[4902]: I1009 14:29:33.692019 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/a5fc156b-09f2-4647-a2df-73877fb9db6f-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-f9m9b\" (UID: \"a5fc156b-09f2-4647-a2df-73877fb9db6f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-f9m9b" Oct 09 14:29:33 crc kubenswrapper[4902]: I1009 14:29:33.696095 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/a5fc156b-09f2-4647-a2df-73877fb9db6f-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-f9m9b\" (UID: \"a5fc156b-09f2-4647-a2df-73877fb9db6f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-f9m9b" Oct 09 14:29:33 crc kubenswrapper[4902]: I1009 14:29:33.699266 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a5fc156b-09f2-4647-a2df-73877fb9db6f-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-f9m9b\" (UID: \"a5fc156b-09f2-4647-a2df-73877fb9db6f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-f9m9b" Oct 09 14:29:33 crc kubenswrapper[4902]: I1009 14:29:33.707782 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-79nww\" (UniqueName: 
\"kubernetes.io/projected/a5fc156b-09f2-4647-a2df-73877fb9db6f-kube-api-access-79nww\") pod \"nova-edpm-deployment-openstack-edpm-ipam-f9m9b\" (UID: \"a5fc156b-09f2-4647-a2df-73877fb9db6f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-f9m9b" Oct 09 14:29:33 crc kubenswrapper[4902]: I1009 14:29:33.786889 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-f9m9b" Oct 09 14:29:34 crc kubenswrapper[4902]: I1009 14:29:34.451437 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-f9m9b"] Oct 09 14:29:34 crc kubenswrapper[4902]: I1009 14:29:34.456266 4902 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 09 14:29:35 crc kubenswrapper[4902]: I1009 14:29:35.371022 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-f9m9b" event={"ID":"a5fc156b-09f2-4647-a2df-73877fb9db6f","Type":"ContainerStarted","Data":"fd52fa44e0dac6b5c5ff50b707043d0b3d97dbbf58e8df9a5c07609b4a1a3c49"} Oct 09 14:29:35 crc kubenswrapper[4902]: I1009 14:29:35.372440 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-f9m9b" event={"ID":"a5fc156b-09f2-4647-a2df-73877fb9db6f","Type":"ContainerStarted","Data":"e3bb4c186613d6f956ea895c5e34966986e4db461ad89da29693927a18a69c73"} Oct 09 14:29:35 crc kubenswrapper[4902]: I1009 14:29:35.392003 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-f9m9b" podStartSLOduration=1.96752875 podStartE2EDuration="2.391982136s" podCreationTimestamp="2025-10-09 14:29:33 +0000 UTC" firstStartedPulling="2025-10-09 14:29:34.456037794 +0000 UTC m=+2321.653896858" lastFinishedPulling="2025-10-09 14:29:34.88049118 +0000 UTC m=+2322.078350244" observedRunningTime="2025-10-09 14:29:35.386688242 +0000 UTC m=+2322.584547326" watchObservedRunningTime="2025-10-09 14:29:35.391982136 +0000 UTC m=+2322.589841190" Oct 09 14:29:37 crc kubenswrapper[4902]: I1009 14:29:37.514260 4902 scope.go:117] "RemoveContainer" containerID="a801328970550531ee1b5bd6e247b502db3c5fb79b1bd36128afb401a712b319" Oct 09 14:29:37 crc kubenswrapper[4902]: E1009 14:29:37.515017 4902 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gbt7s_openshift-machine-config-operator(6cfbac91-e798-4e5e-9f3c-f454ea6f457e)\"" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" podUID="6cfbac91-e798-4e5e-9f3c-f454ea6f457e" Oct 09 14:29:50 crc kubenswrapper[4902]: I1009 14:29:50.512760 4902 scope.go:117] "RemoveContainer" containerID="a801328970550531ee1b5bd6e247b502db3c5fb79b1bd36128afb401a712b319" Oct 09 14:29:50 crc kubenswrapper[4902]: E1009 14:29:50.513595 4902 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gbt7s_openshift-machine-config-operator(6cfbac91-e798-4e5e-9f3c-f454ea6f457e)\"" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" podUID="6cfbac91-e798-4e5e-9f3c-f454ea6f457e" Oct 09 14:30:00 crc kubenswrapper[4902]: I1009 14:30:00.163875 4902 kubelet.go:2421] "SyncLoop 
ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29333670-wf4qw"] Oct 09 14:30:00 crc kubenswrapper[4902]: I1009 14:30:00.167361 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29333670-wf4qw" Oct 09 14:30:00 crc kubenswrapper[4902]: I1009 14:30:00.169623 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 09 14:30:00 crc kubenswrapper[4902]: I1009 14:30:00.169889 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 09 14:30:00 crc kubenswrapper[4902]: I1009 14:30:00.173571 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29333670-wf4qw"] Oct 09 14:30:00 crc kubenswrapper[4902]: I1009 14:30:00.308537 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4b2hl\" (UniqueName: \"kubernetes.io/projected/e9f3ad51-e7dd-4fda-a7c1-807a9e326ef7-kube-api-access-4b2hl\") pod \"collect-profiles-29333670-wf4qw\" (UID: \"e9f3ad51-e7dd-4fda-a7c1-807a9e326ef7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333670-wf4qw" Oct 09 14:30:00 crc kubenswrapper[4902]: I1009 14:30:00.309748 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e9f3ad51-e7dd-4fda-a7c1-807a9e326ef7-config-volume\") pod \"collect-profiles-29333670-wf4qw\" (UID: \"e9f3ad51-e7dd-4fda-a7c1-807a9e326ef7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333670-wf4qw" Oct 09 14:30:00 crc kubenswrapper[4902]: I1009 14:30:00.309918 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e9f3ad51-e7dd-4fda-a7c1-807a9e326ef7-secret-volume\") pod \"collect-profiles-29333670-wf4qw\" (UID: \"e9f3ad51-e7dd-4fda-a7c1-807a9e326ef7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333670-wf4qw" Oct 09 14:30:00 crc kubenswrapper[4902]: I1009 14:30:00.411891 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4b2hl\" (UniqueName: \"kubernetes.io/projected/e9f3ad51-e7dd-4fda-a7c1-807a9e326ef7-kube-api-access-4b2hl\") pod \"collect-profiles-29333670-wf4qw\" (UID: \"e9f3ad51-e7dd-4fda-a7c1-807a9e326ef7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333670-wf4qw" Oct 09 14:30:00 crc kubenswrapper[4902]: I1009 14:30:00.412318 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e9f3ad51-e7dd-4fda-a7c1-807a9e326ef7-config-volume\") pod \"collect-profiles-29333670-wf4qw\" (UID: \"e9f3ad51-e7dd-4fda-a7c1-807a9e326ef7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333670-wf4qw" Oct 09 14:30:00 crc kubenswrapper[4902]: I1009 14:30:00.412354 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e9f3ad51-e7dd-4fda-a7c1-807a9e326ef7-secret-volume\") pod \"collect-profiles-29333670-wf4qw\" (UID: \"e9f3ad51-e7dd-4fda-a7c1-807a9e326ef7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333670-wf4qw" Oct 09 14:30:00 crc kubenswrapper[4902]: 
I1009 14:30:00.413618 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e9f3ad51-e7dd-4fda-a7c1-807a9e326ef7-config-volume\") pod \"collect-profiles-29333670-wf4qw\" (UID: \"e9f3ad51-e7dd-4fda-a7c1-807a9e326ef7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333670-wf4qw" Oct 09 14:30:00 crc kubenswrapper[4902]: I1009 14:30:00.432290 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e9f3ad51-e7dd-4fda-a7c1-807a9e326ef7-secret-volume\") pod \"collect-profiles-29333670-wf4qw\" (UID: \"e9f3ad51-e7dd-4fda-a7c1-807a9e326ef7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333670-wf4qw" Oct 09 14:30:00 crc kubenswrapper[4902]: I1009 14:30:00.435306 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4b2hl\" (UniqueName: \"kubernetes.io/projected/e9f3ad51-e7dd-4fda-a7c1-807a9e326ef7-kube-api-access-4b2hl\") pod \"collect-profiles-29333670-wf4qw\" (UID: \"e9f3ad51-e7dd-4fda-a7c1-807a9e326ef7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333670-wf4qw" Oct 09 14:30:00 crc kubenswrapper[4902]: I1009 14:30:00.495997 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29333670-wf4qw" Oct 09 14:30:00 crc kubenswrapper[4902]: I1009 14:30:00.960008 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29333670-wf4qw"] Oct 09 14:30:01 crc kubenswrapper[4902]: I1009 14:30:01.622139 4902 generic.go:334] "Generic (PLEG): container finished" podID="e9f3ad51-e7dd-4fda-a7c1-807a9e326ef7" containerID="4fdae2855ea868425baf802454634becd9968e0c7284d4c37e99520bf25f91ac" exitCode=0 Oct 09 14:30:01 crc kubenswrapper[4902]: I1009 14:30:01.622246 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29333670-wf4qw" event={"ID":"e9f3ad51-e7dd-4fda-a7c1-807a9e326ef7","Type":"ContainerDied","Data":"4fdae2855ea868425baf802454634becd9968e0c7284d4c37e99520bf25f91ac"} Oct 09 14:30:01 crc kubenswrapper[4902]: I1009 14:30:01.622491 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29333670-wf4qw" event={"ID":"e9f3ad51-e7dd-4fda-a7c1-807a9e326ef7","Type":"ContainerStarted","Data":"f60a90f33d8aa6c9806d141afcd858c183319e251ee0e55a8dba4d8a726ab0d0"} Oct 09 14:30:02 crc kubenswrapper[4902]: I1009 14:30:02.949443 4902 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29333670-wf4qw" Oct 09 14:30:03 crc kubenswrapper[4902]: I1009 14:30:03.063686 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e9f3ad51-e7dd-4fda-a7c1-807a9e326ef7-config-volume\") pod \"e9f3ad51-e7dd-4fda-a7c1-807a9e326ef7\" (UID: \"e9f3ad51-e7dd-4fda-a7c1-807a9e326ef7\") " Oct 09 14:30:03 crc kubenswrapper[4902]: I1009 14:30:03.063769 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4b2hl\" (UniqueName: \"kubernetes.io/projected/e9f3ad51-e7dd-4fda-a7c1-807a9e326ef7-kube-api-access-4b2hl\") pod \"e9f3ad51-e7dd-4fda-a7c1-807a9e326ef7\" (UID: \"e9f3ad51-e7dd-4fda-a7c1-807a9e326ef7\") " Oct 09 14:30:03 crc kubenswrapper[4902]: I1009 14:30:03.063849 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e9f3ad51-e7dd-4fda-a7c1-807a9e326ef7-secret-volume\") pod \"e9f3ad51-e7dd-4fda-a7c1-807a9e326ef7\" (UID: \"e9f3ad51-e7dd-4fda-a7c1-807a9e326ef7\") " Oct 09 14:30:03 crc kubenswrapper[4902]: I1009 14:30:03.064721 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e9f3ad51-e7dd-4fda-a7c1-807a9e326ef7-config-volume" (OuterVolumeSpecName: "config-volume") pod "e9f3ad51-e7dd-4fda-a7c1-807a9e326ef7" (UID: "e9f3ad51-e7dd-4fda-a7c1-807a9e326ef7"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 14:30:03 crc kubenswrapper[4902]: I1009 14:30:03.070870 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9f3ad51-e7dd-4fda-a7c1-807a9e326ef7-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "e9f3ad51-e7dd-4fda-a7c1-807a9e326ef7" (UID: "e9f3ad51-e7dd-4fda-a7c1-807a9e326ef7"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:30:03 crc kubenswrapper[4902]: I1009 14:30:03.070882 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e9f3ad51-e7dd-4fda-a7c1-807a9e326ef7-kube-api-access-4b2hl" (OuterVolumeSpecName: "kube-api-access-4b2hl") pod "e9f3ad51-e7dd-4fda-a7c1-807a9e326ef7" (UID: "e9f3ad51-e7dd-4fda-a7c1-807a9e326ef7"). InnerVolumeSpecName "kube-api-access-4b2hl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 14:30:03 crc kubenswrapper[4902]: I1009 14:30:03.166051 4902 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e9f3ad51-e7dd-4fda-a7c1-807a9e326ef7-config-volume\") on node \"crc\" DevicePath \"\"" Oct 09 14:30:03 crc kubenswrapper[4902]: I1009 14:30:03.166103 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4b2hl\" (UniqueName: \"kubernetes.io/projected/e9f3ad51-e7dd-4fda-a7c1-807a9e326ef7-kube-api-access-4b2hl\") on node \"crc\" DevicePath \"\"" Oct 09 14:30:03 crc kubenswrapper[4902]: I1009 14:30:03.166118 4902 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e9f3ad51-e7dd-4fda-a7c1-807a9e326ef7-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 09 14:30:03 crc kubenswrapper[4902]: I1009 14:30:03.640173 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29333670-wf4qw" event={"ID":"e9f3ad51-e7dd-4fda-a7c1-807a9e326ef7","Type":"ContainerDied","Data":"f60a90f33d8aa6c9806d141afcd858c183319e251ee0e55a8dba4d8a726ab0d0"} Oct 09 14:30:03 crc kubenswrapper[4902]: I1009 14:30:03.640217 4902 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f60a90f33d8aa6c9806d141afcd858c183319e251ee0e55a8dba4d8a726ab0d0" Oct 09 14:30:03 crc kubenswrapper[4902]: I1009 14:30:03.640271 4902 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29333670-wf4qw" Oct 09 14:30:04 crc kubenswrapper[4902]: I1009 14:30:04.024847 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29333625-8hwfc"] Oct 09 14:30:04 crc kubenswrapper[4902]: I1009 14:30:04.033084 4902 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29333625-8hwfc"] Oct 09 14:30:05 crc kubenswrapper[4902]: I1009 14:30:05.515349 4902 scope.go:117] "RemoveContainer" containerID="a801328970550531ee1b5bd6e247b502db3c5fb79b1bd36128afb401a712b319" Oct 09 14:30:05 crc kubenswrapper[4902]: E1009 14:30:05.515785 4902 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gbt7s_openshift-machine-config-operator(6cfbac91-e798-4e5e-9f3c-f454ea6f457e)\"" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" podUID="6cfbac91-e798-4e5e-9f3c-f454ea6f457e" Oct 09 14:30:05 crc kubenswrapper[4902]: I1009 14:30:05.534725 4902 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="472a3084-0b59-487f-b179-bfe5fa35f4a9" path="/var/lib/kubelet/pods/472a3084-0b59-487f-b179-bfe5fa35f4a9/volumes" Oct 09 14:30:17 crc kubenswrapper[4902]: I1009 14:30:17.515630 4902 scope.go:117] "RemoveContainer" containerID="a801328970550531ee1b5bd6e247b502db3c5fb79b1bd36128afb401a712b319" Oct 09 14:30:17 crc kubenswrapper[4902]: E1009 14:30:17.517332 4902 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gbt7s_openshift-machine-config-operator(6cfbac91-e798-4e5e-9f3c-f454ea6f457e)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" podUID="6cfbac91-e798-4e5e-9f3c-f454ea6f457e" Oct 09 14:30:31 crc kubenswrapper[4902]: I1009 14:30:31.513749 4902 scope.go:117] "RemoveContainer" containerID="a801328970550531ee1b5bd6e247b502db3c5fb79b1bd36128afb401a712b319" Oct 09 14:30:31 crc kubenswrapper[4902]: E1009 14:30:31.516927 4902 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gbt7s_openshift-machine-config-operator(6cfbac91-e798-4e5e-9f3c-f454ea6f457e)\"" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" podUID="6cfbac91-e798-4e5e-9f3c-f454ea6f457e" Oct 09 14:30:46 crc kubenswrapper[4902]: I1009 14:30:46.512990 4902 scope.go:117] "RemoveContainer" containerID="a801328970550531ee1b5bd6e247b502db3c5fb79b1bd36128afb401a712b319" Oct 09 14:30:46 crc kubenswrapper[4902]: E1009 14:30:46.513800 4902 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gbt7s_openshift-machine-config-operator(6cfbac91-e798-4e5e-9f3c-f454ea6f457e)\"" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" podUID="6cfbac91-e798-4e5e-9f3c-f454ea6f457e" Oct 09 14:30:58 crc kubenswrapper[4902]: I1009 14:30:58.513290 4902 scope.go:117] "RemoveContainer" containerID="a801328970550531ee1b5bd6e247b502db3c5fb79b1bd36128afb401a712b319" Oct 09 14:30:58 crc kubenswrapper[4902]: E1009 14:30:58.514708 4902 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gbt7s_openshift-machine-config-operator(6cfbac91-e798-4e5e-9f3c-f454ea6f457e)\"" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" podUID="6cfbac91-e798-4e5e-9f3c-f454ea6f457e" Oct 09 14:31:04 crc kubenswrapper[4902]: I1009 14:31:04.719142 4902 scope.go:117] "RemoveContainer" containerID="435a2b57957251a2cdf6ece37fdecfb6425022ee4dc2dd66661fd282f4db9451" Oct 09 14:31:11 crc kubenswrapper[4902]: I1009 14:31:11.513092 4902 scope.go:117] "RemoveContainer" containerID="a801328970550531ee1b5bd6e247b502db3c5fb79b1bd36128afb401a712b319" Oct 09 14:31:11 crc kubenswrapper[4902]: E1009 14:31:11.513868 4902 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gbt7s_openshift-machine-config-operator(6cfbac91-e798-4e5e-9f3c-f454ea6f457e)\"" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" podUID="6cfbac91-e798-4e5e-9f3c-f454ea6f457e" Oct 09 14:31:24 crc kubenswrapper[4902]: I1009 14:31:24.514351 4902 scope.go:117] "RemoveContainer" containerID="a801328970550531ee1b5bd6e247b502db3c5fb79b1bd36128afb401a712b319" Oct 09 14:31:24 crc kubenswrapper[4902]: E1009 14:31:24.515453 4902 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gbt7s_openshift-machine-config-operator(6cfbac91-e798-4e5e-9f3c-f454ea6f457e)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" podUID="6cfbac91-e798-4e5e-9f3c-f454ea6f457e" Oct 09 14:31:37 crc kubenswrapper[4902]: I1009 14:31:37.513064 4902 scope.go:117] "RemoveContainer" containerID="a801328970550531ee1b5bd6e247b502db3c5fb79b1bd36128afb401a712b319" Oct 09 14:31:37 crc kubenswrapper[4902]: E1009 14:31:37.513901 4902 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gbt7s_openshift-machine-config-operator(6cfbac91-e798-4e5e-9f3c-f454ea6f457e)\"" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" podUID="6cfbac91-e798-4e5e-9f3c-f454ea6f457e" Oct 09 14:31:48 crc kubenswrapper[4902]: I1009 14:31:48.513326 4902 scope.go:117] "RemoveContainer" containerID="a801328970550531ee1b5bd6e247b502db3c5fb79b1bd36128afb401a712b319" Oct 09 14:31:48 crc kubenswrapper[4902]: E1009 14:31:48.514106 4902 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gbt7s_openshift-machine-config-operator(6cfbac91-e798-4e5e-9f3c-f454ea6f457e)\"" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" podUID="6cfbac91-e798-4e5e-9f3c-f454ea6f457e" Oct 09 14:31:49 crc kubenswrapper[4902]: I1009 14:31:49.227760 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-rfkm5"] Oct 09 14:31:49 crc kubenswrapper[4902]: E1009 14:31:49.228197 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9f3ad51-e7dd-4fda-a7c1-807a9e326ef7" containerName="collect-profiles" Oct 09 14:31:49 crc kubenswrapper[4902]: I1009 14:31:49.228216 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9f3ad51-e7dd-4fda-a7c1-807a9e326ef7" containerName="collect-profiles" Oct 09 14:31:49 crc kubenswrapper[4902]: I1009 14:31:49.229644 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9f3ad51-e7dd-4fda-a7c1-807a9e326ef7" containerName="collect-profiles" Oct 09 14:31:49 crc kubenswrapper[4902]: I1009 14:31:49.231548 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-rfkm5" Oct 09 14:31:49 crc kubenswrapper[4902]: I1009 14:31:49.242544 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rfkm5"] Oct 09 14:31:49 crc kubenswrapper[4902]: I1009 14:31:49.329580 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-98r58\" (UniqueName: \"kubernetes.io/projected/301c74c1-a231-4f9a-89d5-fad997dd1ded-kube-api-access-98r58\") pod \"certified-operators-rfkm5\" (UID: \"301c74c1-a231-4f9a-89d5-fad997dd1ded\") " pod="openshift-marketplace/certified-operators-rfkm5" Oct 09 14:31:49 crc kubenswrapper[4902]: I1009 14:31:49.329775 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/301c74c1-a231-4f9a-89d5-fad997dd1ded-utilities\") pod \"certified-operators-rfkm5\" (UID: \"301c74c1-a231-4f9a-89d5-fad997dd1ded\") " pod="openshift-marketplace/certified-operators-rfkm5" Oct 09 14:31:49 crc kubenswrapper[4902]: I1009 14:31:49.329838 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/301c74c1-a231-4f9a-89d5-fad997dd1ded-catalog-content\") pod \"certified-operators-rfkm5\" (UID: \"301c74c1-a231-4f9a-89d5-fad997dd1ded\") " pod="openshift-marketplace/certified-operators-rfkm5" Oct 09 14:31:49 crc kubenswrapper[4902]: I1009 14:31:49.431922 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/301c74c1-a231-4f9a-89d5-fad997dd1ded-utilities\") pod \"certified-operators-rfkm5\" (UID: \"301c74c1-a231-4f9a-89d5-fad997dd1ded\") " pod="openshift-marketplace/certified-operators-rfkm5" Oct 09 14:31:49 crc kubenswrapper[4902]: I1009 14:31:49.432020 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/301c74c1-a231-4f9a-89d5-fad997dd1ded-catalog-content\") pod \"certified-operators-rfkm5\" (UID: \"301c74c1-a231-4f9a-89d5-fad997dd1ded\") " pod="openshift-marketplace/certified-operators-rfkm5" Oct 09 14:31:49 crc kubenswrapper[4902]: I1009 14:31:49.432072 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-98r58\" (UniqueName: \"kubernetes.io/projected/301c74c1-a231-4f9a-89d5-fad997dd1ded-kube-api-access-98r58\") pod \"certified-operators-rfkm5\" (UID: \"301c74c1-a231-4f9a-89d5-fad997dd1ded\") " pod="openshift-marketplace/certified-operators-rfkm5" Oct 09 14:31:49 crc kubenswrapper[4902]: I1009 14:31:49.432464 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/301c74c1-a231-4f9a-89d5-fad997dd1ded-utilities\") pod \"certified-operators-rfkm5\" (UID: \"301c74c1-a231-4f9a-89d5-fad997dd1ded\") " pod="openshift-marketplace/certified-operators-rfkm5" Oct 09 14:31:49 crc kubenswrapper[4902]: I1009 14:31:49.432575 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/301c74c1-a231-4f9a-89d5-fad997dd1ded-catalog-content\") pod \"certified-operators-rfkm5\" (UID: \"301c74c1-a231-4f9a-89d5-fad997dd1ded\") " pod="openshift-marketplace/certified-operators-rfkm5" Oct 09 14:31:49 crc kubenswrapper[4902]: I1009 14:31:49.453412 4902 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-98r58\" (UniqueName: \"kubernetes.io/projected/301c74c1-a231-4f9a-89d5-fad997dd1ded-kube-api-access-98r58\") pod \"certified-operators-rfkm5\" (UID: \"301c74c1-a231-4f9a-89d5-fad997dd1ded\") " pod="openshift-marketplace/certified-operators-rfkm5" Oct 09 14:31:49 crc kubenswrapper[4902]: I1009 14:31:49.556032 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rfkm5" Oct 09 14:31:50 crc kubenswrapper[4902]: I1009 14:31:50.099671 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rfkm5"] Oct 09 14:31:50 crc kubenswrapper[4902]: W1009 14:31:50.107439 4902 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod301c74c1_a231_4f9a_89d5_fad997dd1ded.slice/crio-6447d0f53acfbe0202683aef9f395c9f00846e41634930f38259207af3a3dae8 WatchSource:0}: Error finding container 6447d0f53acfbe0202683aef9f395c9f00846e41634930f38259207af3a3dae8: Status 404 returned error can't find the container with id 6447d0f53acfbe0202683aef9f395c9f00846e41634930f38259207af3a3dae8 Oct 09 14:31:50 crc kubenswrapper[4902]: I1009 14:31:50.527984 4902 generic.go:334] "Generic (PLEG): container finished" podID="301c74c1-a231-4f9a-89d5-fad997dd1ded" containerID="bc8649eb0b288abae6db43b7fbe8d2f7eb1d34475c5fa8fdd1d6990808b91987" exitCode=0 Oct 09 14:31:50 crc kubenswrapper[4902]: I1009 14:31:50.528080 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rfkm5" event={"ID":"301c74c1-a231-4f9a-89d5-fad997dd1ded","Type":"ContainerDied","Data":"bc8649eb0b288abae6db43b7fbe8d2f7eb1d34475c5fa8fdd1d6990808b91987"} Oct 09 14:31:50 crc kubenswrapper[4902]: I1009 14:31:50.528313 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rfkm5" event={"ID":"301c74c1-a231-4f9a-89d5-fad997dd1ded","Type":"ContainerStarted","Data":"6447d0f53acfbe0202683aef9f395c9f00846e41634930f38259207af3a3dae8"} Oct 09 14:31:52 crc kubenswrapper[4902]: I1009 14:31:52.548225 4902 generic.go:334] "Generic (PLEG): container finished" podID="301c74c1-a231-4f9a-89d5-fad997dd1ded" containerID="13a021b7beea382d52302eaafa088a61169f63db9c35c43981aec78494c8a58b" exitCode=0 Oct 09 14:31:52 crc kubenswrapper[4902]: I1009 14:31:52.548338 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rfkm5" event={"ID":"301c74c1-a231-4f9a-89d5-fad997dd1ded","Type":"ContainerDied","Data":"13a021b7beea382d52302eaafa088a61169f63db9c35c43981aec78494c8a58b"} Oct 09 14:31:53 crc kubenswrapper[4902]: I1009 14:31:53.559074 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rfkm5" event={"ID":"301c74c1-a231-4f9a-89d5-fad997dd1ded","Type":"ContainerStarted","Data":"6106aa94a404b6a56345a02dada99d10604ece402a23d729ed21a1e3d6b28af6"} Oct 09 14:31:53 crc kubenswrapper[4902]: I1009 14:31:53.581464 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-rfkm5" podStartSLOduration=1.935343201 podStartE2EDuration="4.581442066s" podCreationTimestamp="2025-10-09 14:31:49 +0000 UTC" firstStartedPulling="2025-10-09 14:31:50.530147316 +0000 UTC m=+2457.728006380" lastFinishedPulling="2025-10-09 14:31:53.176246181 +0000 UTC m=+2460.374105245" observedRunningTime="2025-10-09 14:31:53.576318187 +0000 UTC 
m=+2460.774177261" watchObservedRunningTime="2025-10-09 14:31:53.581442066 +0000 UTC m=+2460.779301120" Oct 09 14:31:59 crc kubenswrapper[4902]: I1009 14:31:59.556934 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-rfkm5" Oct 09 14:31:59 crc kubenswrapper[4902]: I1009 14:31:59.557661 4902 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-rfkm5" Oct 09 14:31:59 crc kubenswrapper[4902]: I1009 14:31:59.611161 4902 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-rfkm5" Oct 09 14:31:59 crc kubenswrapper[4902]: I1009 14:31:59.666116 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-rfkm5" Oct 09 14:31:59 crc kubenswrapper[4902]: I1009 14:31:59.846265 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-rfkm5"] Oct 09 14:32:01 crc kubenswrapper[4902]: I1009 14:32:01.625343 4902 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-rfkm5" podUID="301c74c1-a231-4f9a-89d5-fad997dd1ded" containerName="registry-server" containerID="cri-o://6106aa94a404b6a56345a02dada99d10604ece402a23d729ed21a1e3d6b28af6" gracePeriod=2 Oct 09 14:32:02 crc kubenswrapper[4902]: I1009 14:32:02.059162 4902 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rfkm5" Oct 09 14:32:02 crc kubenswrapper[4902]: I1009 14:32:02.208696 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/301c74c1-a231-4f9a-89d5-fad997dd1ded-catalog-content\") pod \"301c74c1-a231-4f9a-89d5-fad997dd1ded\" (UID: \"301c74c1-a231-4f9a-89d5-fad997dd1ded\") " Oct 09 14:32:02 crc kubenswrapper[4902]: I1009 14:32:02.208840 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-98r58\" (UniqueName: \"kubernetes.io/projected/301c74c1-a231-4f9a-89d5-fad997dd1ded-kube-api-access-98r58\") pod \"301c74c1-a231-4f9a-89d5-fad997dd1ded\" (UID: \"301c74c1-a231-4f9a-89d5-fad997dd1ded\") " Oct 09 14:32:02 crc kubenswrapper[4902]: I1009 14:32:02.209108 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/301c74c1-a231-4f9a-89d5-fad997dd1ded-utilities\") pod \"301c74c1-a231-4f9a-89d5-fad997dd1ded\" (UID: \"301c74c1-a231-4f9a-89d5-fad997dd1ded\") " Oct 09 14:32:02 crc kubenswrapper[4902]: I1009 14:32:02.209905 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/301c74c1-a231-4f9a-89d5-fad997dd1ded-utilities" (OuterVolumeSpecName: "utilities") pod "301c74c1-a231-4f9a-89d5-fad997dd1ded" (UID: "301c74c1-a231-4f9a-89d5-fad997dd1ded"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 14:32:02 crc kubenswrapper[4902]: I1009 14:32:02.215722 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/301c74c1-a231-4f9a-89d5-fad997dd1ded-kube-api-access-98r58" (OuterVolumeSpecName: "kube-api-access-98r58") pod "301c74c1-a231-4f9a-89d5-fad997dd1ded" (UID: "301c74c1-a231-4f9a-89d5-fad997dd1ded"). InnerVolumeSpecName "kube-api-access-98r58". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 14:32:02 crc kubenswrapper[4902]: I1009 14:32:02.251486 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/301c74c1-a231-4f9a-89d5-fad997dd1ded-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "301c74c1-a231-4f9a-89d5-fad997dd1ded" (UID: "301c74c1-a231-4f9a-89d5-fad997dd1ded"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 14:32:02 crc kubenswrapper[4902]: I1009 14:32:02.311936 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-98r58\" (UniqueName: \"kubernetes.io/projected/301c74c1-a231-4f9a-89d5-fad997dd1ded-kube-api-access-98r58\") on node \"crc\" DevicePath \"\"" Oct 09 14:32:02 crc kubenswrapper[4902]: I1009 14:32:02.311978 4902 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/301c74c1-a231-4f9a-89d5-fad997dd1ded-utilities\") on node \"crc\" DevicePath \"\"" Oct 09 14:32:02 crc kubenswrapper[4902]: I1009 14:32:02.311991 4902 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/301c74c1-a231-4f9a-89d5-fad997dd1ded-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 09 14:32:02 crc kubenswrapper[4902]: I1009 14:32:02.637305 4902 generic.go:334] "Generic (PLEG): container finished" podID="301c74c1-a231-4f9a-89d5-fad997dd1ded" containerID="6106aa94a404b6a56345a02dada99d10604ece402a23d729ed21a1e3d6b28af6" exitCode=0 Oct 09 14:32:02 crc kubenswrapper[4902]: I1009 14:32:02.637369 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rfkm5" event={"ID":"301c74c1-a231-4f9a-89d5-fad997dd1ded","Type":"ContainerDied","Data":"6106aa94a404b6a56345a02dada99d10604ece402a23d729ed21a1e3d6b28af6"} Oct 09 14:32:02 crc kubenswrapper[4902]: I1009 14:32:02.637452 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rfkm5" event={"ID":"301c74c1-a231-4f9a-89d5-fad997dd1ded","Type":"ContainerDied","Data":"6447d0f53acfbe0202683aef9f395c9f00846e41634930f38259207af3a3dae8"} Oct 09 14:32:02 crc kubenswrapper[4902]: I1009 14:32:02.637483 4902 scope.go:117] "RemoveContainer" containerID="6106aa94a404b6a56345a02dada99d10604ece402a23d729ed21a1e3d6b28af6" Oct 09 14:32:02 crc kubenswrapper[4902]: I1009 14:32:02.637679 4902 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-rfkm5" Oct 09 14:32:02 crc kubenswrapper[4902]: I1009 14:32:02.681818 4902 scope.go:117] "RemoveContainer" containerID="13a021b7beea382d52302eaafa088a61169f63db9c35c43981aec78494c8a58b" Oct 09 14:32:02 crc kubenswrapper[4902]: I1009 14:32:02.688392 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-rfkm5"] Oct 09 14:32:02 crc kubenswrapper[4902]: I1009 14:32:02.696741 4902 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-rfkm5"] Oct 09 14:32:02 crc kubenswrapper[4902]: I1009 14:32:02.707930 4902 scope.go:117] "RemoveContainer" containerID="bc8649eb0b288abae6db43b7fbe8d2f7eb1d34475c5fa8fdd1d6990808b91987" Oct 09 14:32:02 crc kubenswrapper[4902]: I1009 14:32:02.747114 4902 scope.go:117] "RemoveContainer" containerID="6106aa94a404b6a56345a02dada99d10604ece402a23d729ed21a1e3d6b28af6" Oct 09 14:32:02 crc kubenswrapper[4902]: E1009 14:32:02.747740 4902 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6106aa94a404b6a56345a02dada99d10604ece402a23d729ed21a1e3d6b28af6\": container with ID starting with 6106aa94a404b6a56345a02dada99d10604ece402a23d729ed21a1e3d6b28af6 not found: ID does not exist" containerID="6106aa94a404b6a56345a02dada99d10604ece402a23d729ed21a1e3d6b28af6" Oct 09 14:32:02 crc kubenswrapper[4902]: I1009 14:32:02.747779 4902 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6106aa94a404b6a56345a02dada99d10604ece402a23d729ed21a1e3d6b28af6"} err="failed to get container status \"6106aa94a404b6a56345a02dada99d10604ece402a23d729ed21a1e3d6b28af6\": rpc error: code = NotFound desc = could not find container \"6106aa94a404b6a56345a02dada99d10604ece402a23d729ed21a1e3d6b28af6\": container with ID starting with 6106aa94a404b6a56345a02dada99d10604ece402a23d729ed21a1e3d6b28af6 not found: ID does not exist" Oct 09 14:32:02 crc kubenswrapper[4902]: I1009 14:32:02.747808 4902 scope.go:117] "RemoveContainer" containerID="13a021b7beea382d52302eaafa088a61169f63db9c35c43981aec78494c8a58b" Oct 09 14:32:02 crc kubenswrapper[4902]: E1009 14:32:02.748498 4902 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"13a021b7beea382d52302eaafa088a61169f63db9c35c43981aec78494c8a58b\": container with ID starting with 13a021b7beea382d52302eaafa088a61169f63db9c35c43981aec78494c8a58b not found: ID does not exist" containerID="13a021b7beea382d52302eaafa088a61169f63db9c35c43981aec78494c8a58b" Oct 09 14:32:02 crc kubenswrapper[4902]: I1009 14:32:02.748551 4902 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13a021b7beea382d52302eaafa088a61169f63db9c35c43981aec78494c8a58b"} err="failed to get container status \"13a021b7beea382d52302eaafa088a61169f63db9c35c43981aec78494c8a58b\": rpc error: code = NotFound desc = could not find container \"13a021b7beea382d52302eaafa088a61169f63db9c35c43981aec78494c8a58b\": container with ID starting with 13a021b7beea382d52302eaafa088a61169f63db9c35c43981aec78494c8a58b not found: ID does not exist" Oct 09 14:32:02 crc kubenswrapper[4902]: I1009 14:32:02.748583 4902 scope.go:117] "RemoveContainer" containerID="bc8649eb0b288abae6db43b7fbe8d2f7eb1d34475c5fa8fdd1d6990808b91987" Oct 09 14:32:02 crc kubenswrapper[4902]: E1009 14:32:02.749203 4902 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"bc8649eb0b288abae6db43b7fbe8d2f7eb1d34475c5fa8fdd1d6990808b91987\": container with ID starting with bc8649eb0b288abae6db43b7fbe8d2f7eb1d34475c5fa8fdd1d6990808b91987 not found: ID does not exist" containerID="bc8649eb0b288abae6db43b7fbe8d2f7eb1d34475c5fa8fdd1d6990808b91987" Oct 09 14:32:02 crc kubenswrapper[4902]: I1009 14:32:02.749354 4902 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc8649eb0b288abae6db43b7fbe8d2f7eb1d34475c5fa8fdd1d6990808b91987"} err="failed to get container status \"bc8649eb0b288abae6db43b7fbe8d2f7eb1d34475c5fa8fdd1d6990808b91987\": rpc error: code = NotFound desc = could not find container \"bc8649eb0b288abae6db43b7fbe8d2f7eb1d34475c5fa8fdd1d6990808b91987\": container with ID starting with bc8649eb0b288abae6db43b7fbe8d2f7eb1d34475c5fa8fdd1d6990808b91987 not found: ID does not exist" Oct 09 14:32:03 crc kubenswrapper[4902]: I1009 14:32:03.529823 4902 scope.go:117] "RemoveContainer" containerID="a801328970550531ee1b5bd6e247b502db3c5fb79b1bd36128afb401a712b319" Oct 09 14:32:03 crc kubenswrapper[4902]: E1009 14:32:03.530764 4902 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gbt7s_openshift-machine-config-operator(6cfbac91-e798-4e5e-9f3c-f454ea6f457e)\"" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" podUID="6cfbac91-e798-4e5e-9f3c-f454ea6f457e" Oct 09 14:32:03 crc kubenswrapper[4902]: I1009 14:32:03.544569 4902 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="301c74c1-a231-4f9a-89d5-fad997dd1ded" path="/var/lib/kubelet/pods/301c74c1-a231-4f9a-89d5-fad997dd1ded/volumes" Oct 09 14:32:16 crc kubenswrapper[4902]: I1009 14:32:16.513721 4902 scope.go:117] "RemoveContainer" containerID="a801328970550531ee1b5bd6e247b502db3c5fb79b1bd36128afb401a712b319" Oct 09 14:32:16 crc kubenswrapper[4902]: E1009 14:32:16.514567 4902 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gbt7s_openshift-machine-config-operator(6cfbac91-e798-4e5e-9f3c-f454ea6f457e)\"" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" podUID="6cfbac91-e798-4e5e-9f3c-f454ea6f457e" Oct 09 14:32:28 crc kubenswrapper[4902]: I1009 14:32:28.513680 4902 scope.go:117] "RemoveContainer" containerID="a801328970550531ee1b5bd6e247b502db3c5fb79b1bd36128afb401a712b319" Oct 09 14:32:28 crc kubenswrapper[4902]: E1009 14:32:28.514568 4902 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gbt7s_openshift-machine-config-operator(6cfbac91-e798-4e5e-9f3c-f454ea6f457e)\"" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" podUID="6cfbac91-e798-4e5e-9f3c-f454ea6f457e" Oct 09 14:32:39 crc kubenswrapper[4902]: I1009 14:32:39.513492 4902 scope.go:117] "RemoveContainer" containerID="a801328970550531ee1b5bd6e247b502db3c5fb79b1bd36128afb401a712b319" Oct 09 14:32:39 crc kubenswrapper[4902]: E1009 14:32:39.514188 4902 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gbt7s_openshift-machine-config-operator(6cfbac91-e798-4e5e-9f3c-f454ea6f457e)\"" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" podUID="6cfbac91-e798-4e5e-9f3c-f454ea6f457e" Oct 09 14:32:43 crc kubenswrapper[4902]: I1009 14:32:43.013193 4902 generic.go:334] "Generic (PLEG): container finished" podID="a5fc156b-09f2-4647-a2df-73877fb9db6f" containerID="fd52fa44e0dac6b5c5ff50b707043d0b3d97dbbf58e8df9a5c07609b4a1a3c49" exitCode=0 Oct 09 14:32:43 crc kubenswrapper[4902]: I1009 14:32:43.013312 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-f9m9b" event={"ID":"a5fc156b-09f2-4647-a2df-73877fb9db6f","Type":"ContainerDied","Data":"fd52fa44e0dac6b5c5ff50b707043d0b3d97dbbf58e8df9a5c07609b4a1a3c49"} Oct 09 14:32:44 crc kubenswrapper[4902]: I1009 14:32:44.431959 4902 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-f9m9b" Oct 09 14:32:44 crc kubenswrapper[4902]: I1009 14:32:44.616523 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/a5fc156b-09f2-4647-a2df-73877fb9db6f-nova-extra-config-0\") pod \"a5fc156b-09f2-4647-a2df-73877fb9db6f\" (UID: \"a5fc156b-09f2-4647-a2df-73877fb9db6f\") " Oct 09 14:32:44 crc kubenswrapper[4902]: I1009 14:32:44.616576 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/a5fc156b-09f2-4647-a2df-73877fb9db6f-nova-migration-ssh-key-1\") pod \"a5fc156b-09f2-4647-a2df-73877fb9db6f\" (UID: \"a5fc156b-09f2-4647-a2df-73877fb9db6f\") " Oct 09 14:32:44 crc kubenswrapper[4902]: I1009 14:32:44.616599 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/a5fc156b-09f2-4647-a2df-73877fb9db6f-nova-migration-ssh-key-0\") pod \"a5fc156b-09f2-4647-a2df-73877fb9db6f\" (UID: \"a5fc156b-09f2-4647-a2df-73877fb9db6f\") " Oct 09 14:32:44 crc kubenswrapper[4902]: I1009 14:32:44.616623 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-79nww\" (UniqueName: \"kubernetes.io/projected/a5fc156b-09f2-4647-a2df-73877fb9db6f-kube-api-access-79nww\") pod \"a5fc156b-09f2-4647-a2df-73877fb9db6f\" (UID: \"a5fc156b-09f2-4647-a2df-73877fb9db6f\") " Oct 09 14:32:44 crc kubenswrapper[4902]: I1009 14:32:44.617356 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5fc156b-09f2-4647-a2df-73877fb9db6f-nova-combined-ca-bundle\") pod \"a5fc156b-09f2-4647-a2df-73877fb9db6f\" (UID: \"a5fc156b-09f2-4647-a2df-73877fb9db6f\") " Oct 09 14:32:44 crc kubenswrapper[4902]: I1009 14:32:44.617591 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a5fc156b-09f2-4647-a2df-73877fb9db6f-ssh-key\") pod \"a5fc156b-09f2-4647-a2df-73877fb9db6f\" (UID: \"a5fc156b-09f2-4647-a2df-73877fb9db6f\") " Oct 09 14:32:44 crc kubenswrapper[4902]: I1009 14:32:44.617642 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/a5fc156b-09f2-4647-a2df-73877fb9db6f-inventory\") pod \"a5fc156b-09f2-4647-a2df-73877fb9db6f\" (UID: \"a5fc156b-09f2-4647-a2df-73877fb9db6f\") " Oct 09 14:32:44 crc kubenswrapper[4902]: I1009 14:32:44.617752 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/a5fc156b-09f2-4647-a2df-73877fb9db6f-nova-cell1-compute-config-1\") pod \"a5fc156b-09f2-4647-a2df-73877fb9db6f\" (UID: \"a5fc156b-09f2-4647-a2df-73877fb9db6f\") " Oct 09 14:32:44 crc kubenswrapper[4902]: I1009 14:32:44.617798 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/a5fc156b-09f2-4647-a2df-73877fb9db6f-nova-cell1-compute-config-0\") pod \"a5fc156b-09f2-4647-a2df-73877fb9db6f\" (UID: \"a5fc156b-09f2-4647-a2df-73877fb9db6f\") " Oct 09 14:32:44 crc kubenswrapper[4902]: I1009 14:32:44.622643 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5fc156b-09f2-4647-a2df-73877fb9db6f-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "a5fc156b-09f2-4647-a2df-73877fb9db6f" (UID: "a5fc156b-09f2-4647-a2df-73877fb9db6f"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:32:44 crc kubenswrapper[4902]: I1009 14:32:44.622704 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a5fc156b-09f2-4647-a2df-73877fb9db6f-kube-api-access-79nww" (OuterVolumeSpecName: "kube-api-access-79nww") pod "a5fc156b-09f2-4647-a2df-73877fb9db6f" (UID: "a5fc156b-09f2-4647-a2df-73877fb9db6f"). InnerVolumeSpecName "kube-api-access-79nww". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 14:32:44 crc kubenswrapper[4902]: I1009 14:32:44.642360 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a5fc156b-09f2-4647-a2df-73877fb9db6f-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "a5fc156b-09f2-4647-a2df-73877fb9db6f" (UID: "a5fc156b-09f2-4647-a2df-73877fb9db6f"). InnerVolumeSpecName "nova-extra-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 14:32:44 crc kubenswrapper[4902]: I1009 14:32:44.644622 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5fc156b-09f2-4647-a2df-73877fb9db6f-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "a5fc156b-09f2-4647-a2df-73877fb9db6f" (UID: "a5fc156b-09f2-4647-a2df-73877fb9db6f"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:32:44 crc kubenswrapper[4902]: I1009 14:32:44.649098 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5fc156b-09f2-4647-a2df-73877fb9db6f-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "a5fc156b-09f2-4647-a2df-73877fb9db6f" (UID: "a5fc156b-09f2-4647-a2df-73877fb9db6f"). InnerVolumeSpecName "nova-cell1-compute-config-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:32:44 crc kubenswrapper[4902]: I1009 14:32:44.654031 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5fc156b-09f2-4647-a2df-73877fb9db6f-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "a5fc156b-09f2-4647-a2df-73877fb9db6f" (UID: "a5fc156b-09f2-4647-a2df-73877fb9db6f"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:32:44 crc kubenswrapper[4902]: I1009 14:32:44.659312 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5fc156b-09f2-4647-a2df-73877fb9db6f-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "a5fc156b-09f2-4647-a2df-73877fb9db6f" (UID: "a5fc156b-09f2-4647-a2df-73877fb9db6f"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:32:44 crc kubenswrapper[4902]: I1009 14:32:44.659985 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5fc156b-09f2-4647-a2df-73877fb9db6f-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "a5fc156b-09f2-4647-a2df-73877fb9db6f" (UID: "a5fc156b-09f2-4647-a2df-73877fb9db6f"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:32:44 crc kubenswrapper[4902]: I1009 14:32:44.662618 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5fc156b-09f2-4647-a2df-73877fb9db6f-inventory" (OuterVolumeSpecName: "inventory") pod "a5fc156b-09f2-4647-a2df-73877fb9db6f" (UID: "a5fc156b-09f2-4647-a2df-73877fb9db6f"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:32:44 crc kubenswrapper[4902]: I1009 14:32:44.720331 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-79nww\" (UniqueName: \"kubernetes.io/projected/a5fc156b-09f2-4647-a2df-73877fb9db6f-kube-api-access-79nww\") on node \"crc\" DevicePath \"\"" Oct 09 14:32:44 crc kubenswrapper[4902]: I1009 14:32:44.720368 4902 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5fc156b-09f2-4647-a2df-73877fb9db6f-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 14:32:44 crc kubenswrapper[4902]: I1009 14:32:44.720379 4902 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a5fc156b-09f2-4647-a2df-73877fb9db6f-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 09 14:32:44 crc kubenswrapper[4902]: I1009 14:32:44.720389 4902 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a5fc156b-09f2-4647-a2df-73877fb9db6f-inventory\") on node \"crc\" DevicePath \"\"" Oct 09 14:32:44 crc kubenswrapper[4902]: I1009 14:32:44.720398 4902 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/a5fc156b-09f2-4647-a2df-73877fb9db6f-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Oct 09 14:32:44 crc kubenswrapper[4902]: I1009 14:32:44.720444 4902 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/a5fc156b-09f2-4647-a2df-73877fb9db6f-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Oct 09 14:32:44 crc kubenswrapper[4902]: I1009 14:32:44.720458 4902 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/a5fc156b-09f2-4647-a2df-73877fb9db6f-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Oct 09 14:32:44 crc kubenswrapper[4902]: I1009 14:32:44.720470 4902 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/a5fc156b-09f2-4647-a2df-73877fb9db6f-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Oct 09 14:32:44 crc kubenswrapper[4902]: I1009 14:32:44.720485 4902 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/a5fc156b-09f2-4647-a2df-73877fb9db6f-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Oct 09 14:32:45 crc kubenswrapper[4902]: I1009 14:32:45.032565 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-f9m9b" event={"ID":"a5fc156b-09f2-4647-a2df-73877fb9db6f","Type":"ContainerDied","Data":"e3bb4c186613d6f956ea895c5e34966986e4db461ad89da29693927a18a69c73"} Oct 09 14:32:45 crc kubenswrapper[4902]: I1009 14:32:45.032613 4902 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e3bb4c186613d6f956ea895c5e34966986e4db461ad89da29693927a18a69c73" Oct 09 14:32:45 crc kubenswrapper[4902]: I1009 14:32:45.032619 4902 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-f9m9b" Oct 09 14:32:45 crc kubenswrapper[4902]: I1009 14:32:45.138731 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-p8rnm"] Oct 09 14:32:45 crc kubenswrapper[4902]: E1009 14:32:45.139204 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5fc156b-09f2-4647-a2df-73877fb9db6f" containerName="nova-edpm-deployment-openstack-edpm-ipam" Oct 09 14:32:45 crc kubenswrapper[4902]: I1009 14:32:45.139222 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5fc156b-09f2-4647-a2df-73877fb9db6f" containerName="nova-edpm-deployment-openstack-edpm-ipam" Oct 09 14:32:45 crc kubenswrapper[4902]: E1009 14:32:45.139252 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="301c74c1-a231-4f9a-89d5-fad997dd1ded" containerName="extract-utilities" Oct 09 14:32:45 crc kubenswrapper[4902]: I1009 14:32:45.139261 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="301c74c1-a231-4f9a-89d5-fad997dd1ded" containerName="extract-utilities" Oct 09 14:32:45 crc kubenswrapper[4902]: E1009 14:32:45.139280 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="301c74c1-a231-4f9a-89d5-fad997dd1ded" containerName="registry-server" Oct 09 14:32:45 crc kubenswrapper[4902]: I1009 14:32:45.139289 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="301c74c1-a231-4f9a-89d5-fad997dd1ded" containerName="registry-server" Oct 09 14:32:45 crc kubenswrapper[4902]: E1009 14:32:45.139304 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="301c74c1-a231-4f9a-89d5-fad997dd1ded" containerName="extract-content" Oct 09 14:32:45 crc kubenswrapper[4902]: I1009 14:32:45.139310 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="301c74c1-a231-4f9a-89d5-fad997dd1ded" containerName="extract-content" Oct 09 14:32:45 crc kubenswrapper[4902]: I1009 14:32:45.139521 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5fc156b-09f2-4647-a2df-73877fb9db6f" containerName="nova-edpm-deployment-openstack-edpm-ipam" Oct 09 14:32:45 crc kubenswrapper[4902]: I1009 14:32:45.139555 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="301c74c1-a231-4f9a-89d5-fad997dd1ded" containerName="registry-server" Oct 09 14:32:45 crc kubenswrapper[4902]: I1009 14:32:45.140268 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-p8rnm" Oct 09 14:32:45 crc kubenswrapper[4902]: I1009 14:32:45.144216 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-kjff2" Oct 09 14:32:45 crc kubenswrapper[4902]: I1009 14:32:45.145041 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 09 14:32:45 crc kubenswrapper[4902]: I1009 14:32:45.145100 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Oct 09 14:32:45 crc kubenswrapper[4902]: I1009 14:32:45.145295 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 09 14:32:45 crc kubenswrapper[4902]: I1009 14:32:45.145462 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 09 14:32:45 crc kubenswrapper[4902]: I1009 14:32:45.152750 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-p8rnm"] Oct 09 14:32:45 crc kubenswrapper[4902]: I1009 14:32:45.328130 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/40011150-b1be-4ddc-8ecf-b70c54c98b9c-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-p8rnm\" (UID: \"40011150-b1be-4ddc-8ecf-b70c54c98b9c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-p8rnm" Oct 09 14:32:45 crc kubenswrapper[4902]: I1009 14:32:45.328206 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/40011150-b1be-4ddc-8ecf-b70c54c98b9c-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-p8rnm\" (UID: \"40011150-b1be-4ddc-8ecf-b70c54c98b9c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-p8rnm" Oct 09 14:32:45 crc kubenswrapper[4902]: I1009 14:32:45.328980 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40011150-b1be-4ddc-8ecf-b70c54c98b9c-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-p8rnm\" (UID: \"40011150-b1be-4ddc-8ecf-b70c54c98b9c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-p8rnm" Oct 09 14:32:45 crc kubenswrapper[4902]: I1009 14:32:45.329045 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/40011150-b1be-4ddc-8ecf-b70c54c98b9c-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-p8rnm\" (UID: \"40011150-b1be-4ddc-8ecf-b70c54c98b9c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-p8rnm" Oct 09 14:32:45 crc kubenswrapper[4902]: I1009 14:32:45.329086 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/40011150-b1be-4ddc-8ecf-b70c54c98b9c-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-p8rnm\" (UID: \"40011150-b1be-4ddc-8ecf-b70c54c98b9c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-p8rnm" Oct 09 14:32:45 crc kubenswrapper[4902]: I1009 
14:32:45.329107 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l8bl6\" (UniqueName: \"kubernetes.io/projected/40011150-b1be-4ddc-8ecf-b70c54c98b9c-kube-api-access-l8bl6\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-p8rnm\" (UID: \"40011150-b1be-4ddc-8ecf-b70c54c98b9c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-p8rnm" Oct 09 14:32:45 crc kubenswrapper[4902]: I1009 14:32:45.329331 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/40011150-b1be-4ddc-8ecf-b70c54c98b9c-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-p8rnm\" (UID: \"40011150-b1be-4ddc-8ecf-b70c54c98b9c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-p8rnm" Oct 09 14:32:45 crc kubenswrapper[4902]: I1009 14:32:45.431243 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/40011150-b1be-4ddc-8ecf-b70c54c98b9c-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-p8rnm\" (UID: \"40011150-b1be-4ddc-8ecf-b70c54c98b9c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-p8rnm" Oct 09 14:32:45 crc kubenswrapper[4902]: I1009 14:32:45.431333 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40011150-b1be-4ddc-8ecf-b70c54c98b9c-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-p8rnm\" (UID: \"40011150-b1be-4ddc-8ecf-b70c54c98b9c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-p8rnm" Oct 09 14:32:45 crc kubenswrapper[4902]: I1009 14:32:45.431386 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/40011150-b1be-4ddc-8ecf-b70c54c98b9c-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-p8rnm\" (UID: \"40011150-b1be-4ddc-8ecf-b70c54c98b9c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-p8rnm" Oct 09 14:32:45 crc kubenswrapper[4902]: I1009 14:32:45.431442 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/40011150-b1be-4ddc-8ecf-b70c54c98b9c-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-p8rnm\" (UID: \"40011150-b1be-4ddc-8ecf-b70c54c98b9c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-p8rnm" Oct 09 14:32:45 crc kubenswrapper[4902]: I1009 14:32:45.431475 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l8bl6\" (UniqueName: \"kubernetes.io/projected/40011150-b1be-4ddc-8ecf-b70c54c98b9c-kube-api-access-l8bl6\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-p8rnm\" (UID: \"40011150-b1be-4ddc-8ecf-b70c54c98b9c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-p8rnm" Oct 09 14:32:45 crc kubenswrapper[4902]: I1009 14:32:45.431550 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/40011150-b1be-4ddc-8ecf-b70c54c98b9c-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-p8rnm\" (UID: 
\"40011150-b1be-4ddc-8ecf-b70c54c98b9c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-p8rnm" Oct 09 14:32:45 crc kubenswrapper[4902]: I1009 14:32:45.431599 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/40011150-b1be-4ddc-8ecf-b70c54c98b9c-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-p8rnm\" (UID: \"40011150-b1be-4ddc-8ecf-b70c54c98b9c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-p8rnm" Oct 09 14:32:45 crc kubenswrapper[4902]: I1009 14:32:45.435633 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/40011150-b1be-4ddc-8ecf-b70c54c98b9c-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-p8rnm\" (UID: \"40011150-b1be-4ddc-8ecf-b70c54c98b9c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-p8rnm" Oct 09 14:32:45 crc kubenswrapper[4902]: I1009 14:32:45.435900 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/40011150-b1be-4ddc-8ecf-b70c54c98b9c-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-p8rnm\" (UID: \"40011150-b1be-4ddc-8ecf-b70c54c98b9c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-p8rnm" Oct 09 14:32:45 crc kubenswrapper[4902]: I1009 14:32:45.436705 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40011150-b1be-4ddc-8ecf-b70c54c98b9c-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-p8rnm\" (UID: \"40011150-b1be-4ddc-8ecf-b70c54c98b9c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-p8rnm" Oct 09 14:32:45 crc kubenswrapper[4902]: I1009 14:32:45.437152 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/40011150-b1be-4ddc-8ecf-b70c54c98b9c-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-p8rnm\" (UID: \"40011150-b1be-4ddc-8ecf-b70c54c98b9c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-p8rnm" Oct 09 14:32:45 crc kubenswrapper[4902]: I1009 14:32:45.437505 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/40011150-b1be-4ddc-8ecf-b70c54c98b9c-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-p8rnm\" (UID: \"40011150-b1be-4ddc-8ecf-b70c54c98b9c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-p8rnm" Oct 09 14:32:45 crc kubenswrapper[4902]: I1009 14:32:45.439215 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/40011150-b1be-4ddc-8ecf-b70c54c98b9c-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-p8rnm\" (UID: \"40011150-b1be-4ddc-8ecf-b70c54c98b9c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-p8rnm" Oct 09 14:32:45 crc kubenswrapper[4902]: I1009 14:32:45.449162 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l8bl6\" (UniqueName: \"kubernetes.io/projected/40011150-b1be-4ddc-8ecf-b70c54c98b9c-kube-api-access-l8bl6\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-p8rnm\" (UID: 
\"40011150-b1be-4ddc-8ecf-b70c54c98b9c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-p8rnm" Oct 09 14:32:45 crc kubenswrapper[4902]: I1009 14:32:45.469087 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-p8rnm" Oct 09 14:32:45 crc kubenswrapper[4902]: I1009 14:32:45.989736 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-p8rnm"] Oct 09 14:32:46 crc kubenswrapper[4902]: I1009 14:32:46.043753 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-p8rnm" event={"ID":"40011150-b1be-4ddc-8ecf-b70c54c98b9c","Type":"ContainerStarted","Data":"39fa5eb8e81536e70f1b96477e2d6346bc41b801fd380c2844feb70486e893a2"} Oct 09 14:32:47 crc kubenswrapper[4902]: I1009 14:32:47.052938 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-p8rnm" event={"ID":"40011150-b1be-4ddc-8ecf-b70c54c98b9c","Type":"ContainerStarted","Data":"0b3dcdbf309f80602b50dab6baa625cf227fdeafd16f85b199d9378b24fe4326"} Oct 09 14:32:47 crc kubenswrapper[4902]: I1009 14:32:47.077741 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-p8rnm" podStartSLOduration=1.495814883 podStartE2EDuration="2.077723316s" podCreationTimestamp="2025-10-09 14:32:45 +0000 UTC" firstStartedPulling="2025-10-09 14:32:45.996214284 +0000 UTC m=+2513.194073348" lastFinishedPulling="2025-10-09 14:32:46.578122717 +0000 UTC m=+2513.775981781" observedRunningTime="2025-10-09 14:32:47.07270782 +0000 UTC m=+2514.270566904" watchObservedRunningTime="2025-10-09 14:32:47.077723316 +0000 UTC m=+2514.275582380" Oct 09 14:32:51 crc kubenswrapper[4902]: I1009 14:32:51.513822 4902 scope.go:117] "RemoveContainer" containerID="a801328970550531ee1b5bd6e247b502db3c5fb79b1bd36128afb401a712b319" Oct 09 14:32:52 crc kubenswrapper[4902]: I1009 14:32:52.100244 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" event={"ID":"6cfbac91-e798-4e5e-9f3c-f454ea6f457e","Type":"ContainerStarted","Data":"f68eeb81f586235c2bb7944dfe62f7c5cf0b02ba4bf203b580a9891306593d95"} Oct 09 14:33:16 crc kubenswrapper[4902]: I1009 14:33:16.207882 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-rzfzb"] Oct 09 14:33:16 crc kubenswrapper[4902]: I1009 14:33:16.211387 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-rzfzb" Oct 09 14:33:16 crc kubenswrapper[4902]: I1009 14:33:16.223159 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rzfzb"] Oct 09 14:33:16 crc kubenswrapper[4902]: I1009 14:33:16.253959 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5f20742a-a99e-4616-a086-bd8e0f3a2e6e-catalog-content\") pod \"community-operators-rzfzb\" (UID: \"5f20742a-a99e-4616-a086-bd8e0f3a2e6e\") " pod="openshift-marketplace/community-operators-rzfzb" Oct 09 14:33:16 crc kubenswrapper[4902]: I1009 14:33:16.254383 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-st62r\" (UniqueName: \"kubernetes.io/projected/5f20742a-a99e-4616-a086-bd8e0f3a2e6e-kube-api-access-st62r\") pod \"community-operators-rzfzb\" (UID: \"5f20742a-a99e-4616-a086-bd8e0f3a2e6e\") " pod="openshift-marketplace/community-operators-rzfzb" Oct 09 14:33:16 crc kubenswrapper[4902]: I1009 14:33:16.254444 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5f20742a-a99e-4616-a086-bd8e0f3a2e6e-utilities\") pod \"community-operators-rzfzb\" (UID: \"5f20742a-a99e-4616-a086-bd8e0f3a2e6e\") " pod="openshift-marketplace/community-operators-rzfzb" Oct 09 14:33:16 crc kubenswrapper[4902]: I1009 14:33:16.356677 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5f20742a-a99e-4616-a086-bd8e0f3a2e6e-catalog-content\") pod \"community-operators-rzfzb\" (UID: \"5f20742a-a99e-4616-a086-bd8e0f3a2e6e\") " pod="openshift-marketplace/community-operators-rzfzb" Oct 09 14:33:16 crc kubenswrapper[4902]: I1009 14:33:16.356766 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-st62r\" (UniqueName: \"kubernetes.io/projected/5f20742a-a99e-4616-a086-bd8e0f3a2e6e-kube-api-access-st62r\") pod \"community-operators-rzfzb\" (UID: \"5f20742a-a99e-4616-a086-bd8e0f3a2e6e\") " pod="openshift-marketplace/community-operators-rzfzb" Oct 09 14:33:16 crc kubenswrapper[4902]: I1009 14:33:16.356791 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5f20742a-a99e-4616-a086-bd8e0f3a2e6e-utilities\") pod \"community-operators-rzfzb\" (UID: \"5f20742a-a99e-4616-a086-bd8e0f3a2e6e\") " pod="openshift-marketplace/community-operators-rzfzb" Oct 09 14:33:16 crc kubenswrapper[4902]: I1009 14:33:16.357223 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5f20742a-a99e-4616-a086-bd8e0f3a2e6e-catalog-content\") pod \"community-operators-rzfzb\" (UID: \"5f20742a-a99e-4616-a086-bd8e0f3a2e6e\") " pod="openshift-marketplace/community-operators-rzfzb" Oct 09 14:33:16 crc kubenswrapper[4902]: I1009 14:33:16.357342 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5f20742a-a99e-4616-a086-bd8e0f3a2e6e-utilities\") pod \"community-operators-rzfzb\" (UID: \"5f20742a-a99e-4616-a086-bd8e0f3a2e6e\") " pod="openshift-marketplace/community-operators-rzfzb" Oct 09 14:33:16 crc kubenswrapper[4902]: I1009 14:33:16.383131 4902 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-st62r\" (UniqueName: \"kubernetes.io/projected/5f20742a-a99e-4616-a086-bd8e0f3a2e6e-kube-api-access-st62r\") pod \"community-operators-rzfzb\" (UID: \"5f20742a-a99e-4616-a086-bd8e0f3a2e6e\") " pod="openshift-marketplace/community-operators-rzfzb" Oct 09 14:33:16 crc kubenswrapper[4902]: I1009 14:33:16.548764 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rzfzb" Oct 09 14:33:17 crc kubenswrapper[4902]: I1009 14:33:17.096633 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rzfzb"] Oct 09 14:33:17 crc kubenswrapper[4902]: I1009 14:33:17.317905 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rzfzb" event={"ID":"5f20742a-a99e-4616-a086-bd8e0f3a2e6e","Type":"ContainerStarted","Data":"452efba5873d0993efdebce79c6877fe3399ae9d95cd3886882b1ba368243182"} Oct 09 14:33:17 crc kubenswrapper[4902]: I1009 14:33:17.318187 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rzfzb" event={"ID":"5f20742a-a99e-4616-a086-bd8e0f3a2e6e","Type":"ContainerStarted","Data":"b3dc6d0384523488936e4ab443f6ad3cf370a429241489c858fe55dfe1294f01"} Oct 09 14:33:18 crc kubenswrapper[4902]: I1009 14:33:18.330849 4902 generic.go:334] "Generic (PLEG): container finished" podID="5f20742a-a99e-4616-a086-bd8e0f3a2e6e" containerID="452efba5873d0993efdebce79c6877fe3399ae9d95cd3886882b1ba368243182" exitCode=0 Oct 09 14:33:18 crc kubenswrapper[4902]: I1009 14:33:18.330917 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rzfzb" event={"ID":"5f20742a-a99e-4616-a086-bd8e0f3a2e6e","Type":"ContainerDied","Data":"452efba5873d0993efdebce79c6877fe3399ae9d95cd3886882b1ba368243182"} Oct 09 14:33:18 crc kubenswrapper[4902]: I1009 14:33:18.331179 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rzfzb" event={"ID":"5f20742a-a99e-4616-a086-bd8e0f3a2e6e","Type":"ContainerStarted","Data":"6e64496e90aee7eac64eccdbee0151d8d07985b582ec6d1e0c7ade0ff85d8bed"} Oct 09 14:33:19 crc kubenswrapper[4902]: I1009 14:33:19.342270 4902 generic.go:334] "Generic (PLEG): container finished" podID="5f20742a-a99e-4616-a086-bd8e0f3a2e6e" containerID="6e64496e90aee7eac64eccdbee0151d8d07985b582ec6d1e0c7ade0ff85d8bed" exitCode=0 Oct 09 14:33:19 crc kubenswrapper[4902]: I1009 14:33:19.342353 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rzfzb" event={"ID":"5f20742a-a99e-4616-a086-bd8e0f3a2e6e","Type":"ContainerDied","Data":"6e64496e90aee7eac64eccdbee0151d8d07985b582ec6d1e0c7ade0ff85d8bed"} Oct 09 14:33:20 crc kubenswrapper[4902]: I1009 14:33:20.355774 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rzfzb" event={"ID":"5f20742a-a99e-4616-a086-bd8e0f3a2e6e","Type":"ContainerStarted","Data":"aa783968c4b099e2451e131c1677d85c3ae7a24204382bc09be32f981943f06e"} Oct 09 14:33:20 crc kubenswrapper[4902]: I1009 14:33:20.379898 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-rzfzb" podStartSLOduration=1.916466612 podStartE2EDuration="4.379874988s" podCreationTimestamp="2025-10-09 14:33:16 +0000 UTC" firstStartedPulling="2025-10-09 14:33:17.320265331 +0000 UTC m=+2544.518124395" lastFinishedPulling="2025-10-09 
14:33:19.783673707 +0000 UTC m=+2546.981532771" observedRunningTime="2025-10-09 14:33:20.375168341 +0000 UTC m=+2547.573027415" watchObservedRunningTime="2025-10-09 14:33:20.379874988 +0000 UTC m=+2547.577734062" Oct 09 14:33:26 crc kubenswrapper[4902]: I1009 14:33:26.549184 4902 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-rzfzb" Oct 09 14:33:26 crc kubenswrapper[4902]: I1009 14:33:26.549522 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-rzfzb" Oct 09 14:33:26 crc kubenswrapper[4902]: I1009 14:33:26.593589 4902 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-rzfzb" Oct 09 14:33:27 crc kubenswrapper[4902]: I1009 14:33:27.469812 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-rzfzb" Oct 09 14:33:27 crc kubenswrapper[4902]: I1009 14:33:27.526633 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rzfzb"] Oct 09 14:33:29 crc kubenswrapper[4902]: I1009 14:33:29.432480 4902 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-rzfzb" podUID="5f20742a-a99e-4616-a086-bd8e0f3a2e6e" containerName="registry-server" containerID="cri-o://aa783968c4b099e2451e131c1677d85c3ae7a24204382bc09be32f981943f06e" gracePeriod=2 Oct 09 14:33:30 crc kubenswrapper[4902]: I1009 14:33:30.382572 4902 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rzfzb" Oct 09 14:33:30 crc kubenswrapper[4902]: I1009 14:33:30.422039 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5f20742a-a99e-4616-a086-bd8e0f3a2e6e-catalog-content\") pod \"5f20742a-a99e-4616-a086-bd8e0f3a2e6e\" (UID: \"5f20742a-a99e-4616-a086-bd8e0f3a2e6e\") " Oct 09 14:33:30 crc kubenswrapper[4902]: I1009 14:33:30.422197 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5f20742a-a99e-4616-a086-bd8e0f3a2e6e-utilities\") pod \"5f20742a-a99e-4616-a086-bd8e0f3a2e6e\" (UID: \"5f20742a-a99e-4616-a086-bd8e0f3a2e6e\") " Oct 09 14:33:30 crc kubenswrapper[4902]: I1009 14:33:30.422306 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-st62r\" (UniqueName: \"kubernetes.io/projected/5f20742a-a99e-4616-a086-bd8e0f3a2e6e-kube-api-access-st62r\") pod \"5f20742a-a99e-4616-a086-bd8e0f3a2e6e\" (UID: \"5f20742a-a99e-4616-a086-bd8e0f3a2e6e\") " Oct 09 14:33:30 crc kubenswrapper[4902]: I1009 14:33:30.423676 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5f20742a-a99e-4616-a086-bd8e0f3a2e6e-utilities" (OuterVolumeSpecName: "utilities") pod "5f20742a-a99e-4616-a086-bd8e0f3a2e6e" (UID: "5f20742a-a99e-4616-a086-bd8e0f3a2e6e"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 14:33:30 crc kubenswrapper[4902]: I1009 14:33:30.428269 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f20742a-a99e-4616-a086-bd8e0f3a2e6e-kube-api-access-st62r" (OuterVolumeSpecName: "kube-api-access-st62r") pod "5f20742a-a99e-4616-a086-bd8e0f3a2e6e" (UID: "5f20742a-a99e-4616-a086-bd8e0f3a2e6e"). InnerVolumeSpecName "kube-api-access-st62r". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 14:33:30 crc kubenswrapper[4902]: I1009 14:33:30.444852 4902 generic.go:334] "Generic (PLEG): container finished" podID="5f20742a-a99e-4616-a086-bd8e0f3a2e6e" containerID="aa783968c4b099e2451e131c1677d85c3ae7a24204382bc09be32f981943f06e" exitCode=0 Oct 09 14:33:30 crc kubenswrapper[4902]: I1009 14:33:30.444923 4902 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rzfzb" Oct 09 14:33:30 crc kubenswrapper[4902]: I1009 14:33:30.444946 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rzfzb" event={"ID":"5f20742a-a99e-4616-a086-bd8e0f3a2e6e","Type":"ContainerDied","Data":"aa783968c4b099e2451e131c1677d85c3ae7a24204382bc09be32f981943f06e"} Oct 09 14:33:30 crc kubenswrapper[4902]: I1009 14:33:30.446965 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rzfzb" event={"ID":"5f20742a-a99e-4616-a086-bd8e0f3a2e6e","Type":"ContainerDied","Data":"b3dc6d0384523488936e4ab443f6ad3cf370a429241489c858fe55dfe1294f01"} Oct 09 14:33:30 crc kubenswrapper[4902]: I1009 14:33:30.447078 4902 scope.go:117] "RemoveContainer" containerID="aa783968c4b099e2451e131c1677d85c3ae7a24204382bc09be32f981943f06e" Oct 09 14:33:30 crc kubenswrapper[4902]: I1009 14:33:30.477219 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5f20742a-a99e-4616-a086-bd8e0f3a2e6e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5f20742a-a99e-4616-a086-bd8e0f3a2e6e" (UID: "5f20742a-a99e-4616-a086-bd8e0f3a2e6e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 14:33:30 crc kubenswrapper[4902]: I1009 14:33:30.495316 4902 scope.go:117] "RemoveContainer" containerID="6e64496e90aee7eac64eccdbee0151d8d07985b582ec6d1e0c7ade0ff85d8bed" Oct 09 14:33:30 crc kubenswrapper[4902]: I1009 14:33:30.524916 4902 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5f20742a-a99e-4616-a086-bd8e0f3a2e6e-utilities\") on node \"crc\" DevicePath \"\"" Oct 09 14:33:30 crc kubenswrapper[4902]: I1009 14:33:30.524951 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-st62r\" (UniqueName: \"kubernetes.io/projected/5f20742a-a99e-4616-a086-bd8e0f3a2e6e-kube-api-access-st62r\") on node \"crc\" DevicePath \"\"" Oct 09 14:33:30 crc kubenswrapper[4902]: I1009 14:33:30.524966 4902 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5f20742a-a99e-4616-a086-bd8e0f3a2e6e-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 09 14:33:30 crc kubenswrapper[4902]: I1009 14:33:30.527962 4902 scope.go:117] "RemoveContainer" containerID="452efba5873d0993efdebce79c6877fe3399ae9d95cd3886882b1ba368243182" Oct 09 14:33:30 crc kubenswrapper[4902]: I1009 14:33:30.565448 4902 scope.go:117] "RemoveContainer" containerID="aa783968c4b099e2451e131c1677d85c3ae7a24204382bc09be32f981943f06e" Oct 09 14:33:30 crc kubenswrapper[4902]: E1009 14:33:30.567578 4902 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa783968c4b099e2451e131c1677d85c3ae7a24204382bc09be32f981943f06e\": container with ID starting with aa783968c4b099e2451e131c1677d85c3ae7a24204382bc09be32f981943f06e not found: ID does not exist" containerID="aa783968c4b099e2451e131c1677d85c3ae7a24204382bc09be32f981943f06e" Oct 09 14:33:30 crc kubenswrapper[4902]: I1009 14:33:30.567621 4902 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa783968c4b099e2451e131c1677d85c3ae7a24204382bc09be32f981943f06e"} err="failed to get container status \"aa783968c4b099e2451e131c1677d85c3ae7a24204382bc09be32f981943f06e\": rpc error: code = NotFound desc = could not find container \"aa783968c4b099e2451e131c1677d85c3ae7a24204382bc09be32f981943f06e\": container with ID starting with aa783968c4b099e2451e131c1677d85c3ae7a24204382bc09be32f981943f06e not found: ID does not exist" Oct 09 14:33:30 crc kubenswrapper[4902]: I1009 14:33:30.567650 4902 scope.go:117] "RemoveContainer" containerID="6e64496e90aee7eac64eccdbee0151d8d07985b582ec6d1e0c7ade0ff85d8bed" Oct 09 14:33:30 crc kubenswrapper[4902]: E1009 14:33:30.570144 4902 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6e64496e90aee7eac64eccdbee0151d8d07985b582ec6d1e0c7ade0ff85d8bed\": container with ID starting with 6e64496e90aee7eac64eccdbee0151d8d07985b582ec6d1e0c7ade0ff85d8bed not found: ID does not exist" containerID="6e64496e90aee7eac64eccdbee0151d8d07985b582ec6d1e0c7ade0ff85d8bed" Oct 09 14:33:30 crc kubenswrapper[4902]: I1009 14:33:30.570195 4902 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e64496e90aee7eac64eccdbee0151d8d07985b582ec6d1e0c7ade0ff85d8bed"} err="failed to get container status \"6e64496e90aee7eac64eccdbee0151d8d07985b582ec6d1e0c7ade0ff85d8bed\": rpc error: code = NotFound desc = could not find container 
\"6e64496e90aee7eac64eccdbee0151d8d07985b582ec6d1e0c7ade0ff85d8bed\": container with ID starting with 6e64496e90aee7eac64eccdbee0151d8d07985b582ec6d1e0c7ade0ff85d8bed not found: ID does not exist" Oct 09 14:33:30 crc kubenswrapper[4902]: I1009 14:33:30.570227 4902 scope.go:117] "RemoveContainer" containerID="452efba5873d0993efdebce79c6877fe3399ae9d95cd3886882b1ba368243182" Oct 09 14:33:30 crc kubenswrapper[4902]: E1009 14:33:30.571046 4902 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"452efba5873d0993efdebce79c6877fe3399ae9d95cd3886882b1ba368243182\": container with ID starting with 452efba5873d0993efdebce79c6877fe3399ae9d95cd3886882b1ba368243182 not found: ID does not exist" containerID="452efba5873d0993efdebce79c6877fe3399ae9d95cd3886882b1ba368243182" Oct 09 14:33:30 crc kubenswrapper[4902]: I1009 14:33:30.571135 4902 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"452efba5873d0993efdebce79c6877fe3399ae9d95cd3886882b1ba368243182"} err="failed to get container status \"452efba5873d0993efdebce79c6877fe3399ae9d95cd3886882b1ba368243182\": rpc error: code = NotFound desc = could not find container \"452efba5873d0993efdebce79c6877fe3399ae9d95cd3886882b1ba368243182\": container with ID starting with 452efba5873d0993efdebce79c6877fe3399ae9d95cd3886882b1ba368243182 not found: ID does not exist" Oct 09 14:33:30 crc kubenswrapper[4902]: I1009 14:33:30.779357 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rzfzb"] Oct 09 14:33:30 crc kubenswrapper[4902]: I1009 14:33:30.787505 4902 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-rzfzb"] Oct 09 14:33:31 crc kubenswrapper[4902]: I1009 14:33:31.524010 4902 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5f20742a-a99e-4616-a086-bd8e0f3a2e6e" path="/var/lib/kubelet/pods/5f20742a-a99e-4616-a086-bd8e0f3a2e6e/volumes" Oct 09 14:35:07 crc kubenswrapper[4902]: I1009 14:35:07.271621 4902 generic.go:334] "Generic (PLEG): container finished" podID="40011150-b1be-4ddc-8ecf-b70c54c98b9c" containerID="0b3dcdbf309f80602b50dab6baa625cf227fdeafd16f85b199d9378b24fe4326" exitCode=0 Oct 09 14:35:07 crc kubenswrapper[4902]: I1009 14:35:07.271732 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-p8rnm" event={"ID":"40011150-b1be-4ddc-8ecf-b70c54c98b9c","Type":"ContainerDied","Data":"0b3dcdbf309f80602b50dab6baa625cf227fdeafd16f85b199d9378b24fe4326"} Oct 09 14:35:08 crc kubenswrapper[4902]: I1009 14:35:08.684330 4902 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-p8rnm" Oct 09 14:35:08 crc kubenswrapper[4902]: I1009 14:35:08.800282 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l8bl6\" (UniqueName: \"kubernetes.io/projected/40011150-b1be-4ddc-8ecf-b70c54c98b9c-kube-api-access-l8bl6\") pod \"40011150-b1be-4ddc-8ecf-b70c54c98b9c\" (UID: \"40011150-b1be-4ddc-8ecf-b70c54c98b9c\") " Oct 09 14:35:08 crc kubenswrapper[4902]: I1009 14:35:08.800355 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/40011150-b1be-4ddc-8ecf-b70c54c98b9c-inventory\") pod \"40011150-b1be-4ddc-8ecf-b70c54c98b9c\" (UID: \"40011150-b1be-4ddc-8ecf-b70c54c98b9c\") " Oct 09 14:35:08 crc kubenswrapper[4902]: I1009 14:35:08.800550 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/40011150-b1be-4ddc-8ecf-b70c54c98b9c-ssh-key\") pod \"40011150-b1be-4ddc-8ecf-b70c54c98b9c\" (UID: \"40011150-b1be-4ddc-8ecf-b70c54c98b9c\") " Oct 09 14:35:08 crc kubenswrapper[4902]: I1009 14:35:08.800578 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40011150-b1be-4ddc-8ecf-b70c54c98b9c-telemetry-combined-ca-bundle\") pod \"40011150-b1be-4ddc-8ecf-b70c54c98b9c\" (UID: \"40011150-b1be-4ddc-8ecf-b70c54c98b9c\") " Oct 09 14:35:08 crc kubenswrapper[4902]: I1009 14:35:08.800604 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/40011150-b1be-4ddc-8ecf-b70c54c98b9c-ceilometer-compute-config-data-0\") pod \"40011150-b1be-4ddc-8ecf-b70c54c98b9c\" (UID: \"40011150-b1be-4ddc-8ecf-b70c54c98b9c\") " Oct 09 14:35:08 crc kubenswrapper[4902]: I1009 14:35:08.800704 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/40011150-b1be-4ddc-8ecf-b70c54c98b9c-ceilometer-compute-config-data-2\") pod \"40011150-b1be-4ddc-8ecf-b70c54c98b9c\" (UID: \"40011150-b1be-4ddc-8ecf-b70c54c98b9c\") " Oct 09 14:35:08 crc kubenswrapper[4902]: I1009 14:35:08.800746 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/40011150-b1be-4ddc-8ecf-b70c54c98b9c-ceilometer-compute-config-data-1\") pod \"40011150-b1be-4ddc-8ecf-b70c54c98b9c\" (UID: \"40011150-b1be-4ddc-8ecf-b70c54c98b9c\") " Oct 09 14:35:08 crc kubenswrapper[4902]: I1009 14:35:08.807859 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40011150-b1be-4ddc-8ecf-b70c54c98b9c-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "40011150-b1be-4ddc-8ecf-b70c54c98b9c" (UID: "40011150-b1be-4ddc-8ecf-b70c54c98b9c"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:35:08 crc kubenswrapper[4902]: I1009 14:35:08.807937 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40011150-b1be-4ddc-8ecf-b70c54c98b9c-kube-api-access-l8bl6" (OuterVolumeSpecName: "kube-api-access-l8bl6") pod "40011150-b1be-4ddc-8ecf-b70c54c98b9c" (UID: "40011150-b1be-4ddc-8ecf-b70c54c98b9c"). 
InnerVolumeSpecName "kube-api-access-l8bl6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 14:35:08 crc kubenswrapper[4902]: I1009 14:35:08.832370 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40011150-b1be-4ddc-8ecf-b70c54c98b9c-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "40011150-b1be-4ddc-8ecf-b70c54c98b9c" (UID: "40011150-b1be-4ddc-8ecf-b70c54c98b9c"). InnerVolumeSpecName "ceilometer-compute-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:35:08 crc kubenswrapper[4902]: I1009 14:35:08.834519 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40011150-b1be-4ddc-8ecf-b70c54c98b9c-inventory" (OuterVolumeSpecName: "inventory") pod "40011150-b1be-4ddc-8ecf-b70c54c98b9c" (UID: "40011150-b1be-4ddc-8ecf-b70c54c98b9c"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:35:08 crc kubenswrapper[4902]: I1009 14:35:08.835767 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40011150-b1be-4ddc-8ecf-b70c54c98b9c-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "40011150-b1be-4ddc-8ecf-b70c54c98b9c" (UID: "40011150-b1be-4ddc-8ecf-b70c54c98b9c"). InnerVolumeSpecName "ceilometer-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:35:08 crc kubenswrapper[4902]: I1009 14:35:08.840393 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40011150-b1be-4ddc-8ecf-b70c54c98b9c-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "40011150-b1be-4ddc-8ecf-b70c54c98b9c" (UID: "40011150-b1be-4ddc-8ecf-b70c54c98b9c"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:35:08 crc kubenswrapper[4902]: I1009 14:35:08.845676 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40011150-b1be-4ddc-8ecf-b70c54c98b9c-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "40011150-b1be-4ddc-8ecf-b70c54c98b9c" (UID: "40011150-b1be-4ddc-8ecf-b70c54c98b9c"). InnerVolumeSpecName "ceilometer-compute-config-data-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:35:08 crc kubenswrapper[4902]: I1009 14:35:08.903234 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l8bl6\" (UniqueName: \"kubernetes.io/projected/40011150-b1be-4ddc-8ecf-b70c54c98b9c-kube-api-access-l8bl6\") on node \"crc\" DevicePath \"\"" Oct 09 14:35:08 crc kubenswrapper[4902]: I1009 14:35:08.903621 4902 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/40011150-b1be-4ddc-8ecf-b70c54c98b9c-inventory\") on node \"crc\" DevicePath \"\"" Oct 09 14:35:08 crc kubenswrapper[4902]: I1009 14:35:08.903635 4902 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/40011150-b1be-4ddc-8ecf-b70c54c98b9c-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 09 14:35:08 crc kubenswrapper[4902]: I1009 14:35:08.903646 4902 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40011150-b1be-4ddc-8ecf-b70c54c98b9c-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 14:35:08 crc kubenswrapper[4902]: I1009 14:35:08.903656 4902 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/40011150-b1be-4ddc-8ecf-b70c54c98b9c-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Oct 09 14:35:08 crc kubenswrapper[4902]: I1009 14:35:08.903668 4902 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/40011150-b1be-4ddc-8ecf-b70c54c98b9c-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Oct 09 14:35:08 crc kubenswrapper[4902]: I1009 14:35:08.903680 4902 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/40011150-b1be-4ddc-8ecf-b70c54c98b9c-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Oct 09 14:35:09 crc kubenswrapper[4902]: I1009 14:35:09.290955 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-p8rnm" event={"ID":"40011150-b1be-4ddc-8ecf-b70c54c98b9c","Type":"ContainerDied","Data":"39fa5eb8e81536e70f1b96477e2d6346bc41b801fd380c2844feb70486e893a2"} Oct 09 14:35:09 crc kubenswrapper[4902]: I1009 14:35:09.291004 4902 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="39fa5eb8e81536e70f1b96477e2d6346bc41b801fd380c2844feb70486e893a2" Oct 09 14:35:09 crc kubenswrapper[4902]: I1009 14:35:09.291005 4902 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-p8rnm" Oct 09 14:35:20 crc kubenswrapper[4902]: I1009 14:35:20.078668 4902 patch_prober.go:28] interesting pod/machine-config-daemon-gbt7s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 14:35:20 crc kubenswrapper[4902]: I1009 14:35:20.079168 4902 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" podUID="6cfbac91-e798-4e5e-9f3c-f454ea6f457e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 14:35:50 crc kubenswrapper[4902]: I1009 14:35:50.078588 4902 patch_prober.go:28] interesting pod/machine-config-daemon-gbt7s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 14:35:50 crc kubenswrapper[4902]: I1009 14:35:50.079101 4902 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" podUID="6cfbac91-e798-4e5e-9f3c-f454ea6f457e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 14:35:53 crc kubenswrapper[4902]: I1009 14:35:53.984568 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"] Oct 09 14:35:53 crc kubenswrapper[4902]: E1009 14:35:53.985505 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f20742a-a99e-4616-a086-bd8e0f3a2e6e" containerName="extract-content" Oct 09 14:35:53 crc kubenswrapper[4902]: I1009 14:35:53.985521 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f20742a-a99e-4616-a086-bd8e0f3a2e6e" containerName="extract-content" Oct 09 14:35:53 crc kubenswrapper[4902]: E1009 14:35:53.985566 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f20742a-a99e-4616-a086-bd8e0f3a2e6e" containerName="extract-utilities" Oct 09 14:35:53 crc kubenswrapper[4902]: I1009 14:35:53.985574 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f20742a-a99e-4616-a086-bd8e0f3a2e6e" containerName="extract-utilities" Oct 09 14:35:53 crc kubenswrapper[4902]: E1009 14:35:53.985590 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f20742a-a99e-4616-a086-bd8e0f3a2e6e" containerName="registry-server" Oct 09 14:35:53 crc kubenswrapper[4902]: I1009 14:35:53.985598 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f20742a-a99e-4616-a086-bd8e0f3a2e6e" containerName="registry-server" Oct 09 14:35:53 crc kubenswrapper[4902]: E1009 14:35:53.985625 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40011150-b1be-4ddc-8ecf-b70c54c98b9c" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Oct 09 14:35:53 crc kubenswrapper[4902]: I1009 14:35:53.985634 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="40011150-b1be-4ddc-8ecf-b70c54c98b9c" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Oct 09 14:35:53 crc kubenswrapper[4902]: I1009 14:35:53.985846 4902 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="40011150-b1be-4ddc-8ecf-b70c54c98b9c" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Oct 09 14:35:53 crc kubenswrapper[4902]: I1009 14:35:53.985876 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f20742a-a99e-4616-a086-bd8e0f3a2e6e" containerName="registry-server" Oct 09 14:35:53 crc kubenswrapper[4902]: I1009 14:35:53.986736 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Oct 09 14:35:53 crc kubenswrapper[4902]: I1009 14:35:53.988989 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-xqzs5" Oct 09 14:35:53 crc kubenswrapper[4902]: I1009 14:35:53.989513 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Oct 09 14:35:53 crc kubenswrapper[4902]: I1009 14:35:53.989520 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Oct 09 14:35:53 crc kubenswrapper[4902]: I1009 14:35:53.990232 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Oct 09 14:35:53 crc kubenswrapper[4902]: I1009 14:35:53.994730 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Oct 09 14:35:54 crc kubenswrapper[4902]: I1009 14:35:54.153425 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"tempest-tests-tempest\" (UID: \"ee0ede17-c9e7-40c7-b2da-ac04b4df9010\") " pod="openstack/tempest-tests-tempest" Oct 09 14:35:54 crc kubenswrapper[4902]: I1009 14:35:54.153844 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ee0ede17-c9e7-40c7-b2da-ac04b4df9010-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"ee0ede17-c9e7-40c7-b2da-ac04b4df9010\") " pod="openstack/tempest-tests-tempest" Oct 09 14:35:54 crc kubenswrapper[4902]: I1009 14:35:54.154203 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/ee0ede17-c9e7-40c7-b2da-ac04b4df9010-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"ee0ede17-c9e7-40c7-b2da-ac04b4df9010\") " pod="openstack/tempest-tests-tempest" Oct 09 14:35:54 crc kubenswrapper[4902]: I1009 14:35:54.154315 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/ee0ede17-c9e7-40c7-b2da-ac04b4df9010-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"ee0ede17-c9e7-40c7-b2da-ac04b4df9010\") " pod="openstack/tempest-tests-tempest" Oct 09 14:35:54 crc kubenswrapper[4902]: I1009 14:35:54.154484 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/ee0ede17-c9e7-40c7-b2da-ac04b4df9010-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"ee0ede17-c9e7-40c7-b2da-ac04b4df9010\") " pod="openstack/tempest-tests-tempest" Oct 09 14:35:54 crc kubenswrapper[4902]: I1009 14:35:54.154541 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: 
\"kubernetes.io/empty-dir/ee0ede17-c9e7-40c7-b2da-ac04b4df9010-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"ee0ede17-c9e7-40c7-b2da-ac04b4df9010\") " pod="openstack/tempest-tests-tempest" Oct 09 14:35:54 crc kubenswrapper[4902]: I1009 14:35:54.154570 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-298cw\" (UniqueName: \"kubernetes.io/projected/ee0ede17-c9e7-40c7-b2da-ac04b4df9010-kube-api-access-298cw\") pod \"tempest-tests-tempest\" (UID: \"ee0ede17-c9e7-40c7-b2da-ac04b4df9010\") " pod="openstack/tempest-tests-tempest" Oct 09 14:35:54 crc kubenswrapper[4902]: I1009 14:35:54.154755 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ee0ede17-c9e7-40c7-b2da-ac04b4df9010-config-data\") pod \"tempest-tests-tempest\" (UID: \"ee0ede17-c9e7-40c7-b2da-ac04b4df9010\") " pod="openstack/tempest-tests-tempest" Oct 09 14:35:54 crc kubenswrapper[4902]: I1009 14:35:54.154810 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/ee0ede17-c9e7-40c7-b2da-ac04b4df9010-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"ee0ede17-c9e7-40c7-b2da-ac04b4df9010\") " pod="openstack/tempest-tests-tempest" Oct 09 14:35:54 crc kubenswrapper[4902]: I1009 14:35:54.256861 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/ee0ede17-c9e7-40c7-b2da-ac04b4df9010-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"ee0ede17-c9e7-40c7-b2da-ac04b4df9010\") " pod="openstack/tempest-tests-tempest" Oct 09 14:35:54 crc kubenswrapper[4902]: I1009 14:35:54.256954 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/ee0ede17-c9e7-40c7-b2da-ac04b4df9010-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"ee0ede17-c9e7-40c7-b2da-ac04b4df9010\") " pod="openstack/tempest-tests-tempest" Oct 09 14:35:54 crc kubenswrapper[4902]: I1009 14:35:54.256976 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/ee0ede17-c9e7-40c7-b2da-ac04b4df9010-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"ee0ede17-c9e7-40c7-b2da-ac04b4df9010\") " pod="openstack/tempest-tests-tempest" Oct 09 14:35:54 crc kubenswrapper[4902]: I1009 14:35:54.256993 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-298cw\" (UniqueName: \"kubernetes.io/projected/ee0ede17-c9e7-40c7-b2da-ac04b4df9010-kube-api-access-298cw\") pod \"tempest-tests-tempest\" (UID: \"ee0ede17-c9e7-40c7-b2da-ac04b4df9010\") " pod="openstack/tempest-tests-tempest" Oct 09 14:35:54 crc kubenswrapper[4902]: I1009 14:35:54.257041 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ee0ede17-c9e7-40c7-b2da-ac04b4df9010-config-data\") pod \"tempest-tests-tempest\" (UID: \"ee0ede17-c9e7-40c7-b2da-ac04b4df9010\") " pod="openstack/tempest-tests-tempest" Oct 09 14:35:54 crc kubenswrapper[4902]: I1009 14:35:54.257065 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: 
\"kubernetes.io/configmap/ee0ede17-c9e7-40c7-b2da-ac04b4df9010-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"ee0ede17-c9e7-40c7-b2da-ac04b4df9010\") " pod="openstack/tempest-tests-tempest" Oct 09 14:35:54 crc kubenswrapper[4902]: I1009 14:35:54.257101 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"tempest-tests-tempest\" (UID: \"ee0ede17-c9e7-40c7-b2da-ac04b4df9010\") " pod="openstack/tempest-tests-tempest" Oct 09 14:35:54 crc kubenswrapper[4902]: I1009 14:35:54.257126 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ee0ede17-c9e7-40c7-b2da-ac04b4df9010-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"ee0ede17-c9e7-40c7-b2da-ac04b4df9010\") " pod="openstack/tempest-tests-tempest" Oct 09 14:35:54 crc kubenswrapper[4902]: I1009 14:35:54.257169 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/ee0ede17-c9e7-40c7-b2da-ac04b4df9010-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"ee0ede17-c9e7-40c7-b2da-ac04b4df9010\") " pod="openstack/tempest-tests-tempest" Oct 09 14:35:54 crc kubenswrapper[4902]: I1009 14:35:54.257519 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/ee0ede17-c9e7-40c7-b2da-ac04b4df9010-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"ee0ede17-c9e7-40c7-b2da-ac04b4df9010\") " pod="openstack/tempest-tests-tempest" Oct 09 14:35:54 crc kubenswrapper[4902]: I1009 14:35:54.258249 4902 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"tempest-tests-tempest\" (UID: \"ee0ede17-c9e7-40c7-b2da-ac04b4df9010\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/tempest-tests-tempest" Oct 09 14:35:54 crc kubenswrapper[4902]: I1009 14:35:54.258311 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/ee0ede17-c9e7-40c7-b2da-ac04b4df9010-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"ee0ede17-c9e7-40c7-b2da-ac04b4df9010\") " pod="openstack/tempest-tests-tempest" Oct 09 14:35:54 crc kubenswrapper[4902]: I1009 14:35:54.258632 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ee0ede17-c9e7-40c7-b2da-ac04b4df9010-config-data\") pod \"tempest-tests-tempest\" (UID: \"ee0ede17-c9e7-40c7-b2da-ac04b4df9010\") " pod="openstack/tempest-tests-tempest" Oct 09 14:35:54 crc kubenswrapper[4902]: I1009 14:35:54.259200 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/ee0ede17-c9e7-40c7-b2da-ac04b4df9010-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"ee0ede17-c9e7-40c7-b2da-ac04b4df9010\") " pod="openstack/tempest-tests-tempest" Oct 09 14:35:54 crc kubenswrapper[4902]: I1009 14:35:54.264051 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/ee0ede17-c9e7-40c7-b2da-ac04b4df9010-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: 
\"ee0ede17-c9e7-40c7-b2da-ac04b4df9010\") " pod="openstack/tempest-tests-tempest" Oct 09 14:35:54 crc kubenswrapper[4902]: I1009 14:35:54.264816 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ee0ede17-c9e7-40c7-b2da-ac04b4df9010-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"ee0ede17-c9e7-40c7-b2da-ac04b4df9010\") " pod="openstack/tempest-tests-tempest" Oct 09 14:35:54 crc kubenswrapper[4902]: I1009 14:35:54.265245 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/ee0ede17-c9e7-40c7-b2da-ac04b4df9010-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"ee0ede17-c9e7-40c7-b2da-ac04b4df9010\") " pod="openstack/tempest-tests-tempest" Oct 09 14:35:54 crc kubenswrapper[4902]: I1009 14:35:54.275875 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-298cw\" (UniqueName: \"kubernetes.io/projected/ee0ede17-c9e7-40c7-b2da-ac04b4df9010-kube-api-access-298cw\") pod \"tempest-tests-tempest\" (UID: \"ee0ede17-c9e7-40c7-b2da-ac04b4df9010\") " pod="openstack/tempest-tests-tempest" Oct 09 14:35:54 crc kubenswrapper[4902]: I1009 14:35:54.287091 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"tempest-tests-tempest\" (UID: \"ee0ede17-c9e7-40c7-b2da-ac04b4df9010\") " pod="openstack/tempest-tests-tempest" Oct 09 14:35:54 crc kubenswrapper[4902]: I1009 14:35:54.309972 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Oct 09 14:35:54 crc kubenswrapper[4902]: I1009 14:35:54.734011 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Oct 09 14:35:54 crc kubenswrapper[4902]: I1009 14:35:54.743941 4902 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 09 14:35:55 crc kubenswrapper[4902]: I1009 14:35:55.662830 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"ee0ede17-c9e7-40c7-b2da-ac04b4df9010","Type":"ContainerStarted","Data":"b06ddfc8c82bb0d78607fa316e5be6fb39f6179e746d000f1888c5d4da1ff49e"} Oct 09 14:36:03 crc kubenswrapper[4902]: I1009 14:36:03.617852 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-jspvd"] Oct 09 14:36:03 crc kubenswrapper[4902]: I1009 14:36:03.624105 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-jspvd" Oct 09 14:36:03 crc kubenswrapper[4902]: I1009 14:36:03.635604 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jspvd"] Oct 09 14:36:03 crc kubenswrapper[4902]: I1009 14:36:03.755716 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cec993c2-9894-43f1-9ffe-4f7a97937bed-catalog-content\") pod \"redhat-operators-jspvd\" (UID: \"cec993c2-9894-43f1-9ffe-4f7a97937bed\") " pod="openshift-marketplace/redhat-operators-jspvd" Oct 09 14:36:03 crc kubenswrapper[4902]: I1009 14:36:03.755795 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cec993c2-9894-43f1-9ffe-4f7a97937bed-utilities\") pod \"redhat-operators-jspvd\" (UID: \"cec993c2-9894-43f1-9ffe-4f7a97937bed\") " pod="openshift-marketplace/redhat-operators-jspvd" Oct 09 14:36:03 crc kubenswrapper[4902]: I1009 14:36:03.755990 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k2wkj\" (UniqueName: \"kubernetes.io/projected/cec993c2-9894-43f1-9ffe-4f7a97937bed-kube-api-access-k2wkj\") pod \"redhat-operators-jspvd\" (UID: \"cec993c2-9894-43f1-9ffe-4f7a97937bed\") " pod="openshift-marketplace/redhat-operators-jspvd" Oct 09 14:36:03 crc kubenswrapper[4902]: I1009 14:36:03.858702 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cec993c2-9894-43f1-9ffe-4f7a97937bed-catalog-content\") pod \"redhat-operators-jspvd\" (UID: \"cec993c2-9894-43f1-9ffe-4f7a97937bed\") " pod="openshift-marketplace/redhat-operators-jspvd" Oct 09 14:36:03 crc kubenswrapper[4902]: I1009 14:36:03.858856 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cec993c2-9894-43f1-9ffe-4f7a97937bed-utilities\") pod \"redhat-operators-jspvd\" (UID: \"cec993c2-9894-43f1-9ffe-4f7a97937bed\") " pod="openshift-marketplace/redhat-operators-jspvd" Oct 09 14:36:03 crc kubenswrapper[4902]: I1009 14:36:03.858943 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k2wkj\" (UniqueName: \"kubernetes.io/projected/cec993c2-9894-43f1-9ffe-4f7a97937bed-kube-api-access-k2wkj\") pod \"redhat-operators-jspvd\" (UID: \"cec993c2-9894-43f1-9ffe-4f7a97937bed\") " pod="openshift-marketplace/redhat-operators-jspvd" Oct 09 14:36:03 crc kubenswrapper[4902]: I1009 14:36:03.859522 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cec993c2-9894-43f1-9ffe-4f7a97937bed-utilities\") pod \"redhat-operators-jspvd\" (UID: \"cec993c2-9894-43f1-9ffe-4f7a97937bed\") " pod="openshift-marketplace/redhat-operators-jspvd" Oct 09 14:36:03 crc kubenswrapper[4902]: I1009 14:36:03.860838 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cec993c2-9894-43f1-9ffe-4f7a97937bed-catalog-content\") pod \"redhat-operators-jspvd\" (UID: \"cec993c2-9894-43f1-9ffe-4f7a97937bed\") " pod="openshift-marketplace/redhat-operators-jspvd" Oct 09 14:36:03 crc kubenswrapper[4902]: I1009 14:36:03.880413 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-k2wkj\" (UniqueName: \"kubernetes.io/projected/cec993c2-9894-43f1-9ffe-4f7a97937bed-kube-api-access-k2wkj\") pod \"redhat-operators-jspvd\" (UID: \"cec993c2-9894-43f1-9ffe-4f7a97937bed\") " pod="openshift-marketplace/redhat-operators-jspvd" Oct 09 14:36:03 crc kubenswrapper[4902]: I1009 14:36:03.960162 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jspvd" Oct 09 14:36:18 crc kubenswrapper[4902]: E1009 14:36:18.844654 4902 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified" Oct 09 14:36:18 crc kubenswrapper[4902]: E1009 14:36:18.845364 4902 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:tempest-tests-tempest-tests-runner,Image:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/tempest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/test_operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/tempest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/tempest/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/var/lib/tempest/id_ecdsa,SubPath:ssh_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-298cw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42480,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42480,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-custom-data-s0,},Optional:nil,},SecretRef:nil,},EnvFromSource{Prefix:,Con
figMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-env-vars-s0,},Optional:nil,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod tempest-tests-tempest_openstack(ee0ede17-c9e7-40c7-b2da-ac04b4df9010): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 09 14:36:18 crc kubenswrapper[4902]: E1009 14:36:18.848623 4902 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/tempest-tests-tempest" podUID="ee0ede17-c9e7-40c7-b2da-ac04b4df9010" Oct 09 14:36:18 crc kubenswrapper[4902]: E1009 14:36:18.865316 4902 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified\\\"\"" pod="openstack/tempest-tests-tempest" podUID="ee0ede17-c9e7-40c7-b2da-ac04b4df9010" Oct 09 14:36:19 crc kubenswrapper[4902]: I1009 14:36:19.208264 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jspvd"] Oct 09 14:36:19 crc kubenswrapper[4902]: I1009 14:36:19.872202 4902 generic.go:334] "Generic (PLEG): container finished" podID="cec993c2-9894-43f1-9ffe-4f7a97937bed" containerID="adb104259a151770210b204090e015678667f8a0acdf524f33f53aaa27203c69" exitCode=0 Oct 09 14:36:19 crc kubenswrapper[4902]: I1009 14:36:19.872317 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jspvd" event={"ID":"cec993c2-9894-43f1-9ffe-4f7a97937bed","Type":"ContainerDied","Data":"adb104259a151770210b204090e015678667f8a0acdf524f33f53aaa27203c69"} Oct 09 14:36:19 crc kubenswrapper[4902]: I1009 14:36:19.872618 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jspvd" event={"ID":"cec993c2-9894-43f1-9ffe-4f7a97937bed","Type":"ContainerStarted","Data":"6cd69aff6893f26c1a905289e9753a52ea4d81149f341c9c7826ea6d86abb17d"} Oct 09 14:36:20 crc kubenswrapper[4902]: I1009 14:36:20.078639 4902 patch_prober.go:28] interesting pod/machine-config-daemon-gbt7s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 14:36:20 crc kubenswrapper[4902]: I1009 14:36:20.078698 4902 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" podUID="6cfbac91-e798-4e5e-9f3c-f454ea6f457e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 14:36:20 crc kubenswrapper[4902]: I1009 14:36:20.078750 4902 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" Oct 09 14:36:20 crc kubenswrapper[4902]: I1009 14:36:20.079523 4902 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"f68eeb81f586235c2bb7944dfe62f7c5cf0b02ba4bf203b580a9891306593d95"} pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 09 14:36:20 crc kubenswrapper[4902]: I1009 14:36:20.079583 4902 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" podUID="6cfbac91-e798-4e5e-9f3c-f454ea6f457e" containerName="machine-config-daemon" containerID="cri-o://f68eeb81f586235c2bb7944dfe62f7c5cf0b02ba4bf203b580a9891306593d95" gracePeriod=600 Oct 09 14:36:20 crc kubenswrapper[4902]: I1009 14:36:20.884454 4902 generic.go:334] "Generic (PLEG): container finished" podID="6cfbac91-e798-4e5e-9f3c-f454ea6f457e" containerID="f68eeb81f586235c2bb7944dfe62f7c5cf0b02ba4bf203b580a9891306593d95" exitCode=0 Oct 09 14:36:20 crc kubenswrapper[4902]: I1009 14:36:20.884530 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" event={"ID":"6cfbac91-e798-4e5e-9f3c-f454ea6f457e","Type":"ContainerDied","Data":"f68eeb81f586235c2bb7944dfe62f7c5cf0b02ba4bf203b580a9891306593d95"} Oct 09 14:36:20 crc kubenswrapper[4902]: I1009 14:36:20.885042 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" event={"ID":"6cfbac91-e798-4e5e-9f3c-f454ea6f457e","Type":"ContainerStarted","Data":"abd517e906ef4a75c8e7630d185960f1c3b3110165f7d059940e5efa29855570"} Oct 09 14:36:20 crc kubenswrapper[4902]: I1009 14:36:20.885070 4902 scope.go:117] "RemoveContainer" containerID="a801328970550531ee1b5bd6e247b502db3c5fb79b1bd36128afb401a712b319" Oct 09 14:36:21 crc kubenswrapper[4902]: I1009 14:36:21.896143 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jspvd" event={"ID":"cec993c2-9894-43f1-9ffe-4f7a97937bed","Type":"ContainerStarted","Data":"64f0e3d752b066fb939dc8e10a6a1915f87615c0b5e2e38bfb304c70b25f6dfc"} Oct 09 14:36:23 crc kubenswrapper[4902]: I1009 14:36:23.916660 4902 generic.go:334] "Generic (PLEG): container finished" podID="cec993c2-9894-43f1-9ffe-4f7a97937bed" containerID="64f0e3d752b066fb939dc8e10a6a1915f87615c0b5e2e38bfb304c70b25f6dfc" exitCode=0 Oct 09 14:36:23 crc kubenswrapper[4902]: I1009 14:36:23.916711 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jspvd" event={"ID":"cec993c2-9894-43f1-9ffe-4f7a97937bed","Type":"ContainerDied","Data":"64f0e3d752b066fb939dc8e10a6a1915f87615c0b5e2e38bfb304c70b25f6dfc"} Oct 09 14:36:27 crc kubenswrapper[4902]: I1009 14:36:27.952623 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jspvd" event={"ID":"cec993c2-9894-43f1-9ffe-4f7a97937bed","Type":"ContainerStarted","Data":"677badd94e02cb3d1d6994ff784d5b406ce4a9ece2ffcb1b51f7c3bc7e530566"} Oct 09 14:36:27 crc kubenswrapper[4902]: I1009 14:36:27.974885 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-jspvd" podStartSLOduration=17.216060327 podStartE2EDuration="24.974865459s" podCreationTimestamp="2025-10-09 14:36:03 +0000 UTC" firstStartedPulling="2025-10-09 14:36:19.873955155 +0000 UTC m=+2727.071814219" lastFinishedPulling="2025-10-09 14:36:27.632760287 +0000 UTC m=+2734.830619351" observedRunningTime="2025-10-09 14:36:27.969981077 +0000 UTC m=+2735.167840161" 
watchObservedRunningTime="2025-10-09 14:36:27.974865459 +0000 UTC m=+2735.172724523" Oct 09 14:36:33 crc kubenswrapper[4902]: I1009 14:36:33.961165 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-jspvd" Oct 09 14:36:33 crc kubenswrapper[4902]: I1009 14:36:33.961864 4902 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-jspvd" Oct 09 14:36:34 crc kubenswrapper[4902]: I1009 14:36:34.012986 4902 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-jspvd" Oct 09 14:36:34 crc kubenswrapper[4902]: I1009 14:36:34.064104 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-jspvd" Oct 09 14:36:34 crc kubenswrapper[4902]: I1009 14:36:34.811225 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jspvd"] Oct 09 14:36:35 crc kubenswrapper[4902]: I1009 14:36:35.016419 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"ee0ede17-c9e7-40c7-b2da-ac04b4df9010","Type":"ContainerStarted","Data":"0c95617a0869641f0fc4d0a135d29842ae4868039be79703cd8b5ad72266dbd1"} Oct 09 14:36:35 crc kubenswrapper[4902]: I1009 14:36:35.039479 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=3.644179636 podStartE2EDuration="43.039461366s" podCreationTimestamp="2025-10-09 14:35:52 +0000 UTC" firstStartedPulling="2025-10-09 14:35:54.743692053 +0000 UTC m=+2701.941551117" lastFinishedPulling="2025-10-09 14:36:34.138973783 +0000 UTC m=+2741.336832847" observedRunningTime="2025-10-09 14:36:35.03307178 +0000 UTC m=+2742.230930874" watchObservedRunningTime="2025-10-09 14:36:35.039461366 +0000 UTC m=+2742.237320430" Oct 09 14:36:36 crc kubenswrapper[4902]: I1009 14:36:36.025363 4902 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-jspvd" podUID="cec993c2-9894-43f1-9ffe-4f7a97937bed" containerName="registry-server" containerID="cri-o://677badd94e02cb3d1d6994ff784d5b406ce4a9ece2ffcb1b51f7c3bc7e530566" gracePeriod=2 Oct 09 14:36:36 crc kubenswrapper[4902]: I1009 14:36:36.502616 4902 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-jspvd" Oct 09 14:36:36 crc kubenswrapper[4902]: I1009 14:36:36.627182 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cec993c2-9894-43f1-9ffe-4f7a97937bed-utilities\") pod \"cec993c2-9894-43f1-9ffe-4f7a97937bed\" (UID: \"cec993c2-9894-43f1-9ffe-4f7a97937bed\") " Oct 09 14:36:36 crc kubenswrapper[4902]: I1009 14:36:36.627854 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cec993c2-9894-43f1-9ffe-4f7a97937bed-catalog-content\") pod \"cec993c2-9894-43f1-9ffe-4f7a97937bed\" (UID: \"cec993c2-9894-43f1-9ffe-4f7a97937bed\") " Oct 09 14:36:36 crc kubenswrapper[4902]: I1009 14:36:36.627966 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k2wkj\" (UniqueName: \"kubernetes.io/projected/cec993c2-9894-43f1-9ffe-4f7a97937bed-kube-api-access-k2wkj\") pod \"cec993c2-9894-43f1-9ffe-4f7a97937bed\" (UID: \"cec993c2-9894-43f1-9ffe-4f7a97937bed\") " Oct 09 14:36:36 crc kubenswrapper[4902]: I1009 14:36:36.629120 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cec993c2-9894-43f1-9ffe-4f7a97937bed-utilities" (OuterVolumeSpecName: "utilities") pod "cec993c2-9894-43f1-9ffe-4f7a97937bed" (UID: "cec993c2-9894-43f1-9ffe-4f7a97937bed"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 14:36:36 crc kubenswrapper[4902]: I1009 14:36:36.635679 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cec993c2-9894-43f1-9ffe-4f7a97937bed-kube-api-access-k2wkj" (OuterVolumeSpecName: "kube-api-access-k2wkj") pod "cec993c2-9894-43f1-9ffe-4f7a97937bed" (UID: "cec993c2-9894-43f1-9ffe-4f7a97937bed"). InnerVolumeSpecName "kube-api-access-k2wkj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 14:36:36 crc kubenswrapper[4902]: I1009 14:36:36.709912 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cec993c2-9894-43f1-9ffe-4f7a97937bed-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cec993c2-9894-43f1-9ffe-4f7a97937bed" (UID: "cec993c2-9894-43f1-9ffe-4f7a97937bed"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 14:36:36 crc kubenswrapper[4902]: I1009 14:36:36.731115 4902 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cec993c2-9894-43f1-9ffe-4f7a97937bed-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 09 14:36:36 crc kubenswrapper[4902]: I1009 14:36:36.731151 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k2wkj\" (UniqueName: \"kubernetes.io/projected/cec993c2-9894-43f1-9ffe-4f7a97937bed-kube-api-access-k2wkj\") on node \"crc\" DevicePath \"\"" Oct 09 14:36:36 crc kubenswrapper[4902]: I1009 14:36:36.731164 4902 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cec993c2-9894-43f1-9ffe-4f7a97937bed-utilities\") on node \"crc\" DevicePath \"\"" Oct 09 14:36:37 crc kubenswrapper[4902]: I1009 14:36:37.037737 4902 generic.go:334] "Generic (PLEG): container finished" podID="cec993c2-9894-43f1-9ffe-4f7a97937bed" containerID="677badd94e02cb3d1d6994ff784d5b406ce4a9ece2ffcb1b51f7c3bc7e530566" exitCode=0 Oct 09 14:36:37 crc kubenswrapper[4902]: I1009 14:36:37.037787 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jspvd" event={"ID":"cec993c2-9894-43f1-9ffe-4f7a97937bed","Type":"ContainerDied","Data":"677badd94e02cb3d1d6994ff784d5b406ce4a9ece2ffcb1b51f7c3bc7e530566"} Oct 09 14:36:37 crc kubenswrapper[4902]: I1009 14:36:37.037807 4902 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jspvd" Oct 09 14:36:37 crc kubenswrapper[4902]: I1009 14:36:37.037824 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jspvd" event={"ID":"cec993c2-9894-43f1-9ffe-4f7a97937bed","Type":"ContainerDied","Data":"6cd69aff6893f26c1a905289e9753a52ea4d81149f341c9c7826ea6d86abb17d"} Oct 09 14:36:37 crc kubenswrapper[4902]: I1009 14:36:37.037855 4902 scope.go:117] "RemoveContainer" containerID="677badd94e02cb3d1d6994ff784d5b406ce4a9ece2ffcb1b51f7c3bc7e530566" Oct 09 14:36:37 crc kubenswrapper[4902]: I1009 14:36:37.059372 4902 scope.go:117] "RemoveContainer" containerID="64f0e3d752b066fb939dc8e10a6a1915f87615c0b5e2e38bfb304c70b25f6dfc" Oct 09 14:36:37 crc kubenswrapper[4902]: I1009 14:36:37.073214 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jspvd"] Oct 09 14:36:37 crc kubenswrapper[4902]: I1009 14:36:37.081743 4902 scope.go:117] "RemoveContainer" containerID="adb104259a151770210b204090e015678667f8a0acdf524f33f53aaa27203c69" Oct 09 14:36:37 crc kubenswrapper[4902]: I1009 14:36:37.084252 4902 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-jspvd"] Oct 09 14:36:37 crc kubenswrapper[4902]: I1009 14:36:37.138046 4902 scope.go:117] "RemoveContainer" containerID="677badd94e02cb3d1d6994ff784d5b406ce4a9ece2ffcb1b51f7c3bc7e530566" Oct 09 14:36:37 crc kubenswrapper[4902]: E1009 14:36:37.139747 4902 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"677badd94e02cb3d1d6994ff784d5b406ce4a9ece2ffcb1b51f7c3bc7e530566\": container with ID starting with 677badd94e02cb3d1d6994ff784d5b406ce4a9ece2ffcb1b51f7c3bc7e530566 not found: ID does not exist" containerID="677badd94e02cb3d1d6994ff784d5b406ce4a9ece2ffcb1b51f7c3bc7e530566" Oct 09 14:36:37 crc kubenswrapper[4902]: I1009 14:36:37.139858 4902 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"677badd94e02cb3d1d6994ff784d5b406ce4a9ece2ffcb1b51f7c3bc7e530566"} err="failed to get container status \"677badd94e02cb3d1d6994ff784d5b406ce4a9ece2ffcb1b51f7c3bc7e530566\": rpc error: code = NotFound desc = could not find container \"677badd94e02cb3d1d6994ff784d5b406ce4a9ece2ffcb1b51f7c3bc7e530566\": container with ID starting with 677badd94e02cb3d1d6994ff784d5b406ce4a9ece2ffcb1b51f7c3bc7e530566 not found: ID does not exist" Oct 09 14:36:37 crc kubenswrapper[4902]: I1009 14:36:37.139946 4902 scope.go:117] "RemoveContainer" containerID="64f0e3d752b066fb939dc8e10a6a1915f87615c0b5e2e38bfb304c70b25f6dfc" Oct 09 14:36:37 crc kubenswrapper[4902]: E1009 14:36:37.145905 4902 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"64f0e3d752b066fb939dc8e10a6a1915f87615c0b5e2e38bfb304c70b25f6dfc\": container with ID starting with 64f0e3d752b066fb939dc8e10a6a1915f87615c0b5e2e38bfb304c70b25f6dfc not found: ID does not exist" containerID="64f0e3d752b066fb939dc8e10a6a1915f87615c0b5e2e38bfb304c70b25f6dfc" Oct 09 14:36:37 crc kubenswrapper[4902]: I1009 14:36:37.145935 4902 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"64f0e3d752b066fb939dc8e10a6a1915f87615c0b5e2e38bfb304c70b25f6dfc"} err="failed to get container status \"64f0e3d752b066fb939dc8e10a6a1915f87615c0b5e2e38bfb304c70b25f6dfc\": rpc error: code = NotFound desc = could not find container \"64f0e3d752b066fb939dc8e10a6a1915f87615c0b5e2e38bfb304c70b25f6dfc\": container with ID starting with 64f0e3d752b066fb939dc8e10a6a1915f87615c0b5e2e38bfb304c70b25f6dfc not found: ID does not exist" Oct 09 14:36:37 crc kubenswrapper[4902]: I1009 14:36:37.145960 4902 scope.go:117] "RemoveContainer" containerID="adb104259a151770210b204090e015678667f8a0acdf524f33f53aaa27203c69" Oct 09 14:36:37 crc kubenswrapper[4902]: E1009 14:36:37.146276 4902 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"adb104259a151770210b204090e015678667f8a0acdf524f33f53aaa27203c69\": container with ID starting with adb104259a151770210b204090e015678667f8a0acdf524f33f53aaa27203c69 not found: ID does not exist" containerID="adb104259a151770210b204090e015678667f8a0acdf524f33f53aaa27203c69" Oct 09 14:36:37 crc kubenswrapper[4902]: I1009 14:36:37.146295 4902 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"adb104259a151770210b204090e015678667f8a0acdf524f33f53aaa27203c69"} err="failed to get container status \"adb104259a151770210b204090e015678667f8a0acdf524f33f53aaa27203c69\": rpc error: code = NotFound desc = could not find container \"adb104259a151770210b204090e015678667f8a0acdf524f33f53aaa27203c69\": container with ID starting with adb104259a151770210b204090e015678667f8a0acdf524f33f53aaa27203c69 not found: ID does not exist" Oct 09 14:36:37 crc kubenswrapper[4902]: I1009 14:36:37.525502 4902 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cec993c2-9894-43f1-9ffe-4f7a97937bed" path="/var/lib/kubelet/pods/cec993c2-9894-43f1-9ffe-4f7a97937bed/volumes" Oct 09 14:38:20 crc kubenswrapper[4902]: I1009 14:38:20.078291 4902 patch_prober.go:28] interesting pod/machine-config-daemon-gbt7s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 14:38:20 crc kubenswrapper[4902]: I1009 14:38:20.079546 4902 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" podUID="6cfbac91-e798-4e5e-9f3c-f454ea6f457e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 14:38:38 crc kubenswrapper[4902]: I1009 14:38:38.015772 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-hfgx2"] Oct 09 14:38:38 crc kubenswrapper[4902]: E1009 14:38:38.016536 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cec993c2-9894-43f1-9ffe-4f7a97937bed" containerName="registry-server" Oct 09 14:38:38 crc kubenswrapper[4902]: I1009 14:38:38.016547 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="cec993c2-9894-43f1-9ffe-4f7a97937bed" containerName="registry-server" Oct 09 14:38:38 crc kubenswrapper[4902]: E1009 14:38:38.016573 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cec993c2-9894-43f1-9ffe-4f7a97937bed" containerName="extract-utilities" Oct 09 14:38:38 crc kubenswrapper[4902]: I1009 14:38:38.016580 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="cec993c2-9894-43f1-9ffe-4f7a97937bed" containerName="extract-utilities" Oct 09 14:38:38 crc kubenswrapper[4902]: E1009 14:38:38.016598 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cec993c2-9894-43f1-9ffe-4f7a97937bed" containerName="extract-content" Oct 09 14:38:38 crc kubenswrapper[4902]: I1009 14:38:38.016604 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="cec993c2-9894-43f1-9ffe-4f7a97937bed" containerName="extract-content" Oct 09 14:38:38 crc kubenswrapper[4902]: I1009 14:38:38.016781 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="cec993c2-9894-43f1-9ffe-4f7a97937bed" containerName="registry-server" Oct 09 14:38:38 crc kubenswrapper[4902]: I1009 14:38:38.018362 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hfgx2" Oct 09 14:38:38 crc kubenswrapper[4902]: I1009 14:38:38.032319 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hfgx2"] Oct 09 14:38:38 crc kubenswrapper[4902]: I1009 14:38:38.200304 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/36b7b3c1-27a1-4fa2-ad98-05d2904d292c-catalog-content\") pod \"redhat-marketplace-hfgx2\" (UID: \"36b7b3c1-27a1-4fa2-ad98-05d2904d292c\") " pod="openshift-marketplace/redhat-marketplace-hfgx2" Oct 09 14:38:38 crc kubenswrapper[4902]: I1009 14:38:38.200605 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qxjj6\" (UniqueName: \"kubernetes.io/projected/36b7b3c1-27a1-4fa2-ad98-05d2904d292c-kube-api-access-qxjj6\") pod \"redhat-marketplace-hfgx2\" (UID: \"36b7b3c1-27a1-4fa2-ad98-05d2904d292c\") " pod="openshift-marketplace/redhat-marketplace-hfgx2" Oct 09 14:38:38 crc kubenswrapper[4902]: I1009 14:38:38.200919 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36b7b3c1-27a1-4fa2-ad98-05d2904d292c-utilities\") pod \"redhat-marketplace-hfgx2\" (UID: \"36b7b3c1-27a1-4fa2-ad98-05d2904d292c\") " pod="openshift-marketplace/redhat-marketplace-hfgx2" Oct 09 14:38:38 crc kubenswrapper[4902]: I1009 14:38:38.302389 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/36b7b3c1-27a1-4fa2-ad98-05d2904d292c-catalog-content\") pod \"redhat-marketplace-hfgx2\" (UID: \"36b7b3c1-27a1-4fa2-ad98-05d2904d292c\") " pod="openshift-marketplace/redhat-marketplace-hfgx2" Oct 09 14:38:38 crc kubenswrapper[4902]: I1009 14:38:38.302493 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qxjj6\" (UniqueName: \"kubernetes.io/projected/36b7b3c1-27a1-4fa2-ad98-05d2904d292c-kube-api-access-qxjj6\") pod \"redhat-marketplace-hfgx2\" (UID: \"36b7b3c1-27a1-4fa2-ad98-05d2904d292c\") " pod="openshift-marketplace/redhat-marketplace-hfgx2" Oct 09 14:38:38 crc kubenswrapper[4902]: I1009 14:38:38.302577 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36b7b3c1-27a1-4fa2-ad98-05d2904d292c-utilities\") pod \"redhat-marketplace-hfgx2\" (UID: \"36b7b3c1-27a1-4fa2-ad98-05d2904d292c\") " pod="openshift-marketplace/redhat-marketplace-hfgx2" Oct 09 14:38:38 crc kubenswrapper[4902]: I1009 14:38:38.302877 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/36b7b3c1-27a1-4fa2-ad98-05d2904d292c-catalog-content\") pod \"redhat-marketplace-hfgx2\" (UID: \"36b7b3c1-27a1-4fa2-ad98-05d2904d292c\") " pod="openshift-marketplace/redhat-marketplace-hfgx2" Oct 09 14:38:38 crc kubenswrapper[4902]: I1009 14:38:38.302947 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36b7b3c1-27a1-4fa2-ad98-05d2904d292c-utilities\") pod \"redhat-marketplace-hfgx2\" (UID: \"36b7b3c1-27a1-4fa2-ad98-05d2904d292c\") " pod="openshift-marketplace/redhat-marketplace-hfgx2" Oct 09 14:38:38 crc kubenswrapper[4902]: I1009 14:38:38.324185 4902 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-qxjj6\" (UniqueName: \"kubernetes.io/projected/36b7b3c1-27a1-4fa2-ad98-05d2904d292c-kube-api-access-qxjj6\") pod \"redhat-marketplace-hfgx2\" (UID: \"36b7b3c1-27a1-4fa2-ad98-05d2904d292c\") " pod="openshift-marketplace/redhat-marketplace-hfgx2" Oct 09 14:38:38 crc kubenswrapper[4902]: I1009 14:38:38.337601 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hfgx2" Oct 09 14:38:38 crc kubenswrapper[4902]: I1009 14:38:38.796427 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hfgx2"] Oct 09 14:38:38 crc kubenswrapper[4902]: W1009 14:38:38.800976 4902 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod36b7b3c1_27a1_4fa2_ad98_05d2904d292c.slice/crio-8a44650224326bfffd9ea8746fea86f9cca4c81b211e5d8094282be63a18e8d3 WatchSource:0}: Error finding container 8a44650224326bfffd9ea8746fea86f9cca4c81b211e5d8094282be63a18e8d3: Status 404 returned error can't find the container with id 8a44650224326bfffd9ea8746fea86f9cca4c81b211e5d8094282be63a18e8d3 Oct 09 14:38:39 crc kubenswrapper[4902]: I1009 14:38:39.155919 4902 generic.go:334] "Generic (PLEG): container finished" podID="36b7b3c1-27a1-4fa2-ad98-05d2904d292c" containerID="f5d8ee361a0afda8b9809ecd633992117eca9d9baeb386a78ebe9b2ee26446e8" exitCode=0 Oct 09 14:38:39 crc kubenswrapper[4902]: I1009 14:38:39.156123 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hfgx2" event={"ID":"36b7b3c1-27a1-4fa2-ad98-05d2904d292c","Type":"ContainerDied","Data":"f5d8ee361a0afda8b9809ecd633992117eca9d9baeb386a78ebe9b2ee26446e8"} Oct 09 14:38:39 crc kubenswrapper[4902]: I1009 14:38:39.156257 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hfgx2" event={"ID":"36b7b3c1-27a1-4fa2-ad98-05d2904d292c","Type":"ContainerStarted","Data":"8a44650224326bfffd9ea8746fea86f9cca4c81b211e5d8094282be63a18e8d3"} Oct 09 14:38:41 crc kubenswrapper[4902]: I1009 14:38:41.174486 4902 generic.go:334] "Generic (PLEG): container finished" podID="36b7b3c1-27a1-4fa2-ad98-05d2904d292c" containerID="d29068f9489e5c84087ee30dd09b7d74cd051354fed4abfdaa2ebb3941cc89d2" exitCode=0 Oct 09 14:38:41 crc kubenswrapper[4902]: I1009 14:38:41.174544 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hfgx2" event={"ID":"36b7b3c1-27a1-4fa2-ad98-05d2904d292c","Type":"ContainerDied","Data":"d29068f9489e5c84087ee30dd09b7d74cd051354fed4abfdaa2ebb3941cc89d2"} Oct 09 14:38:42 crc kubenswrapper[4902]: I1009 14:38:42.187285 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hfgx2" event={"ID":"36b7b3c1-27a1-4fa2-ad98-05d2904d292c","Type":"ContainerStarted","Data":"96df25ea075cb3516404ecaeb1033b902f41fcc913308badf4e595c3caf9320f"} Oct 09 14:38:42 crc kubenswrapper[4902]: I1009 14:38:42.210962 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-hfgx2" podStartSLOduration=2.571752113 podStartE2EDuration="5.210944126s" podCreationTimestamp="2025-10-09 14:38:37 +0000 UTC" firstStartedPulling="2025-10-09 14:38:39.158603175 +0000 UTC m=+2866.356462229" lastFinishedPulling="2025-10-09 14:38:41.797795148 +0000 UTC m=+2868.995654242" observedRunningTime="2025-10-09 14:38:42.208262739 +0000 UTC m=+2869.406121803" 
watchObservedRunningTime="2025-10-09 14:38:42.210944126 +0000 UTC m=+2869.408803190" Oct 09 14:38:48 crc kubenswrapper[4902]: I1009 14:38:48.338279 4902 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-hfgx2" Oct 09 14:38:48 crc kubenswrapper[4902]: I1009 14:38:48.338918 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-hfgx2" Oct 09 14:38:48 crc kubenswrapper[4902]: I1009 14:38:48.390046 4902 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-hfgx2" Oct 09 14:38:49 crc kubenswrapper[4902]: I1009 14:38:49.295208 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-hfgx2" Oct 09 14:38:49 crc kubenswrapper[4902]: I1009 14:38:49.342193 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hfgx2"] Oct 09 14:38:50 crc kubenswrapper[4902]: I1009 14:38:50.078472 4902 patch_prober.go:28] interesting pod/machine-config-daemon-gbt7s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 14:38:50 crc kubenswrapper[4902]: I1009 14:38:50.078854 4902 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" podUID="6cfbac91-e798-4e5e-9f3c-f454ea6f457e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 14:38:51 crc kubenswrapper[4902]: I1009 14:38:51.264223 4902 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-hfgx2" podUID="36b7b3c1-27a1-4fa2-ad98-05d2904d292c" containerName="registry-server" containerID="cri-o://96df25ea075cb3516404ecaeb1033b902f41fcc913308badf4e595c3caf9320f" gracePeriod=2 Oct 09 14:38:51 crc kubenswrapper[4902]: I1009 14:38:51.748480 4902 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hfgx2" Oct 09 14:38:51 crc kubenswrapper[4902]: I1009 14:38:51.849243 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qxjj6\" (UniqueName: \"kubernetes.io/projected/36b7b3c1-27a1-4fa2-ad98-05d2904d292c-kube-api-access-qxjj6\") pod \"36b7b3c1-27a1-4fa2-ad98-05d2904d292c\" (UID: \"36b7b3c1-27a1-4fa2-ad98-05d2904d292c\") " Oct 09 14:38:51 crc kubenswrapper[4902]: I1009 14:38:51.849390 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/36b7b3c1-27a1-4fa2-ad98-05d2904d292c-catalog-content\") pod \"36b7b3c1-27a1-4fa2-ad98-05d2904d292c\" (UID: \"36b7b3c1-27a1-4fa2-ad98-05d2904d292c\") " Oct 09 14:38:51 crc kubenswrapper[4902]: I1009 14:38:51.849539 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36b7b3c1-27a1-4fa2-ad98-05d2904d292c-utilities\") pod \"36b7b3c1-27a1-4fa2-ad98-05d2904d292c\" (UID: \"36b7b3c1-27a1-4fa2-ad98-05d2904d292c\") " Oct 09 14:38:51 crc kubenswrapper[4902]: I1009 14:38:51.850461 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/36b7b3c1-27a1-4fa2-ad98-05d2904d292c-utilities" (OuterVolumeSpecName: "utilities") pod "36b7b3c1-27a1-4fa2-ad98-05d2904d292c" (UID: "36b7b3c1-27a1-4fa2-ad98-05d2904d292c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 14:38:51 crc kubenswrapper[4902]: I1009 14:38:51.854657 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36b7b3c1-27a1-4fa2-ad98-05d2904d292c-kube-api-access-qxjj6" (OuterVolumeSpecName: "kube-api-access-qxjj6") pod "36b7b3c1-27a1-4fa2-ad98-05d2904d292c" (UID: "36b7b3c1-27a1-4fa2-ad98-05d2904d292c"). InnerVolumeSpecName "kube-api-access-qxjj6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 14:38:51 crc kubenswrapper[4902]: I1009 14:38:51.863870 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/36b7b3c1-27a1-4fa2-ad98-05d2904d292c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "36b7b3c1-27a1-4fa2-ad98-05d2904d292c" (UID: "36b7b3c1-27a1-4fa2-ad98-05d2904d292c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 14:38:51 crc kubenswrapper[4902]: I1009 14:38:51.951464 4902 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/36b7b3c1-27a1-4fa2-ad98-05d2904d292c-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 09 14:38:51 crc kubenswrapper[4902]: I1009 14:38:51.951495 4902 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36b7b3c1-27a1-4fa2-ad98-05d2904d292c-utilities\") on node \"crc\" DevicePath \"\"" Oct 09 14:38:51 crc kubenswrapper[4902]: I1009 14:38:51.951505 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qxjj6\" (UniqueName: \"kubernetes.io/projected/36b7b3c1-27a1-4fa2-ad98-05d2904d292c-kube-api-access-qxjj6\") on node \"crc\" DevicePath \"\"" Oct 09 14:38:52 crc kubenswrapper[4902]: I1009 14:38:52.275358 4902 generic.go:334] "Generic (PLEG): container finished" podID="36b7b3c1-27a1-4fa2-ad98-05d2904d292c" containerID="96df25ea075cb3516404ecaeb1033b902f41fcc913308badf4e595c3caf9320f" exitCode=0 Oct 09 14:38:52 crc kubenswrapper[4902]: I1009 14:38:52.275450 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hfgx2" event={"ID":"36b7b3c1-27a1-4fa2-ad98-05d2904d292c","Type":"ContainerDied","Data":"96df25ea075cb3516404ecaeb1033b902f41fcc913308badf4e595c3caf9320f"} Oct 09 14:38:52 crc kubenswrapper[4902]: I1009 14:38:52.275685 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hfgx2" event={"ID":"36b7b3c1-27a1-4fa2-ad98-05d2904d292c","Type":"ContainerDied","Data":"8a44650224326bfffd9ea8746fea86f9cca4c81b211e5d8094282be63a18e8d3"} Oct 09 14:38:52 crc kubenswrapper[4902]: I1009 14:38:52.275708 4902 scope.go:117] "RemoveContainer" containerID="96df25ea075cb3516404ecaeb1033b902f41fcc913308badf4e595c3caf9320f" Oct 09 14:38:52 crc kubenswrapper[4902]: I1009 14:38:52.275510 4902 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hfgx2" Oct 09 14:38:52 crc kubenswrapper[4902]: I1009 14:38:52.307700 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hfgx2"] Oct 09 14:38:52 crc kubenswrapper[4902]: I1009 14:38:52.314959 4902 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-hfgx2"] Oct 09 14:38:52 crc kubenswrapper[4902]: I1009 14:38:52.316464 4902 scope.go:117] "RemoveContainer" containerID="d29068f9489e5c84087ee30dd09b7d74cd051354fed4abfdaa2ebb3941cc89d2" Oct 09 14:38:52 crc kubenswrapper[4902]: I1009 14:38:52.342105 4902 scope.go:117] "RemoveContainer" containerID="f5d8ee361a0afda8b9809ecd633992117eca9d9baeb386a78ebe9b2ee26446e8" Oct 09 14:38:52 crc kubenswrapper[4902]: I1009 14:38:52.394193 4902 scope.go:117] "RemoveContainer" containerID="96df25ea075cb3516404ecaeb1033b902f41fcc913308badf4e595c3caf9320f" Oct 09 14:38:52 crc kubenswrapper[4902]: E1009 14:38:52.394687 4902 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"96df25ea075cb3516404ecaeb1033b902f41fcc913308badf4e595c3caf9320f\": container with ID starting with 96df25ea075cb3516404ecaeb1033b902f41fcc913308badf4e595c3caf9320f not found: ID does not exist" containerID="96df25ea075cb3516404ecaeb1033b902f41fcc913308badf4e595c3caf9320f" Oct 09 14:38:52 crc kubenswrapper[4902]: I1009 14:38:52.394731 4902 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"96df25ea075cb3516404ecaeb1033b902f41fcc913308badf4e595c3caf9320f"} err="failed to get container status \"96df25ea075cb3516404ecaeb1033b902f41fcc913308badf4e595c3caf9320f\": rpc error: code = NotFound desc = could not find container \"96df25ea075cb3516404ecaeb1033b902f41fcc913308badf4e595c3caf9320f\": container with ID starting with 96df25ea075cb3516404ecaeb1033b902f41fcc913308badf4e595c3caf9320f not found: ID does not exist" Oct 09 14:38:52 crc kubenswrapper[4902]: I1009 14:38:52.394781 4902 scope.go:117] "RemoveContainer" containerID="d29068f9489e5c84087ee30dd09b7d74cd051354fed4abfdaa2ebb3941cc89d2" Oct 09 14:38:52 crc kubenswrapper[4902]: E1009 14:38:52.395102 4902 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d29068f9489e5c84087ee30dd09b7d74cd051354fed4abfdaa2ebb3941cc89d2\": container with ID starting with d29068f9489e5c84087ee30dd09b7d74cd051354fed4abfdaa2ebb3941cc89d2 not found: ID does not exist" containerID="d29068f9489e5c84087ee30dd09b7d74cd051354fed4abfdaa2ebb3941cc89d2" Oct 09 14:38:52 crc kubenswrapper[4902]: I1009 14:38:52.395153 4902 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d29068f9489e5c84087ee30dd09b7d74cd051354fed4abfdaa2ebb3941cc89d2"} err="failed to get container status \"d29068f9489e5c84087ee30dd09b7d74cd051354fed4abfdaa2ebb3941cc89d2\": rpc error: code = NotFound desc = could not find container \"d29068f9489e5c84087ee30dd09b7d74cd051354fed4abfdaa2ebb3941cc89d2\": container with ID starting with d29068f9489e5c84087ee30dd09b7d74cd051354fed4abfdaa2ebb3941cc89d2 not found: ID does not exist" Oct 09 14:38:52 crc kubenswrapper[4902]: I1009 14:38:52.395197 4902 scope.go:117] "RemoveContainer" containerID="f5d8ee361a0afda8b9809ecd633992117eca9d9baeb386a78ebe9b2ee26446e8" Oct 09 14:38:52 crc kubenswrapper[4902]: E1009 14:38:52.395537 4902 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"f5d8ee361a0afda8b9809ecd633992117eca9d9baeb386a78ebe9b2ee26446e8\": container with ID starting with f5d8ee361a0afda8b9809ecd633992117eca9d9baeb386a78ebe9b2ee26446e8 not found: ID does not exist" containerID="f5d8ee361a0afda8b9809ecd633992117eca9d9baeb386a78ebe9b2ee26446e8" Oct 09 14:38:52 crc kubenswrapper[4902]: I1009 14:38:52.395593 4902 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f5d8ee361a0afda8b9809ecd633992117eca9d9baeb386a78ebe9b2ee26446e8"} err="failed to get container status \"f5d8ee361a0afda8b9809ecd633992117eca9d9baeb386a78ebe9b2ee26446e8\": rpc error: code = NotFound desc = could not find container \"f5d8ee361a0afda8b9809ecd633992117eca9d9baeb386a78ebe9b2ee26446e8\": container with ID starting with f5d8ee361a0afda8b9809ecd633992117eca9d9baeb386a78ebe9b2ee26446e8 not found: ID does not exist" Oct 09 14:38:53 crc kubenswrapper[4902]: I1009 14:38:53.526839 4902 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="36b7b3c1-27a1-4fa2-ad98-05d2904d292c" path="/var/lib/kubelet/pods/36b7b3c1-27a1-4fa2-ad98-05d2904d292c/volumes" Oct 09 14:39:20 crc kubenswrapper[4902]: I1009 14:39:20.078145 4902 patch_prober.go:28] interesting pod/machine-config-daemon-gbt7s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 14:39:20 crc kubenswrapper[4902]: I1009 14:39:20.078623 4902 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" podUID="6cfbac91-e798-4e5e-9f3c-f454ea6f457e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 14:39:20 crc kubenswrapper[4902]: I1009 14:39:20.078670 4902 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" Oct 09 14:39:20 crc kubenswrapper[4902]: I1009 14:39:20.079419 4902 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"abd517e906ef4a75c8e7630d185960f1c3b3110165f7d059940e5efa29855570"} pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 09 14:39:20 crc kubenswrapper[4902]: I1009 14:39:20.079545 4902 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" podUID="6cfbac91-e798-4e5e-9f3c-f454ea6f457e" containerName="machine-config-daemon" containerID="cri-o://abd517e906ef4a75c8e7630d185960f1c3b3110165f7d059940e5efa29855570" gracePeriod=600 Oct 09 14:39:20 crc kubenswrapper[4902]: E1009 14:39:20.201556 4902 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gbt7s_openshift-machine-config-operator(6cfbac91-e798-4e5e-9f3c-f454ea6f457e)\"" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" podUID="6cfbac91-e798-4e5e-9f3c-f454ea6f457e" Oct 09 14:39:20 crc kubenswrapper[4902]: I1009 14:39:20.501432 4902 generic.go:334] 
"Generic (PLEG): container finished" podID="6cfbac91-e798-4e5e-9f3c-f454ea6f457e" containerID="abd517e906ef4a75c8e7630d185960f1c3b3110165f7d059940e5efa29855570" exitCode=0 Oct 09 14:39:20 crc kubenswrapper[4902]: I1009 14:39:20.501473 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" event={"ID":"6cfbac91-e798-4e5e-9f3c-f454ea6f457e","Type":"ContainerDied","Data":"abd517e906ef4a75c8e7630d185960f1c3b3110165f7d059940e5efa29855570"} Oct 09 14:39:20 crc kubenswrapper[4902]: I1009 14:39:20.501538 4902 scope.go:117] "RemoveContainer" containerID="f68eeb81f586235c2bb7944dfe62f7c5cf0b02ba4bf203b580a9891306593d95" Oct 09 14:39:20 crc kubenswrapper[4902]: I1009 14:39:20.502183 4902 scope.go:117] "RemoveContainer" containerID="abd517e906ef4a75c8e7630d185960f1c3b3110165f7d059940e5efa29855570" Oct 09 14:39:20 crc kubenswrapper[4902]: E1009 14:39:20.502501 4902 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gbt7s_openshift-machine-config-operator(6cfbac91-e798-4e5e-9f3c-f454ea6f457e)\"" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" podUID="6cfbac91-e798-4e5e-9f3c-f454ea6f457e" Oct 09 14:39:33 crc kubenswrapper[4902]: I1009 14:39:33.521959 4902 scope.go:117] "RemoveContainer" containerID="abd517e906ef4a75c8e7630d185960f1c3b3110165f7d059940e5efa29855570" Oct 09 14:39:33 crc kubenswrapper[4902]: E1009 14:39:33.522752 4902 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gbt7s_openshift-machine-config-operator(6cfbac91-e798-4e5e-9f3c-f454ea6f457e)\"" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" podUID="6cfbac91-e798-4e5e-9f3c-f454ea6f457e" Oct 09 14:39:47 crc kubenswrapper[4902]: I1009 14:39:47.513872 4902 scope.go:117] "RemoveContainer" containerID="abd517e906ef4a75c8e7630d185960f1c3b3110165f7d059940e5efa29855570" Oct 09 14:39:47 crc kubenswrapper[4902]: E1009 14:39:47.514863 4902 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gbt7s_openshift-machine-config-operator(6cfbac91-e798-4e5e-9f3c-f454ea6f457e)\"" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" podUID="6cfbac91-e798-4e5e-9f3c-f454ea6f457e" Oct 09 14:40:02 crc kubenswrapper[4902]: I1009 14:40:02.513001 4902 scope.go:117] "RemoveContainer" containerID="abd517e906ef4a75c8e7630d185960f1c3b3110165f7d059940e5efa29855570" Oct 09 14:40:02 crc kubenswrapper[4902]: E1009 14:40:02.513830 4902 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gbt7s_openshift-machine-config-operator(6cfbac91-e798-4e5e-9f3c-f454ea6f457e)\"" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" podUID="6cfbac91-e798-4e5e-9f3c-f454ea6f457e" Oct 09 14:40:17 crc kubenswrapper[4902]: I1009 14:40:17.513629 4902 scope.go:117] "RemoveContainer" containerID="abd517e906ef4a75c8e7630d185960f1c3b3110165f7d059940e5efa29855570" 
Oct 09 14:40:17 crc kubenswrapper[4902]: E1009 14:40:17.514217 4902 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gbt7s_openshift-machine-config-operator(6cfbac91-e798-4e5e-9f3c-f454ea6f457e)\"" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" podUID="6cfbac91-e798-4e5e-9f3c-f454ea6f457e" Oct 09 14:40:32 crc kubenswrapper[4902]: I1009 14:40:32.513885 4902 scope.go:117] "RemoveContainer" containerID="abd517e906ef4a75c8e7630d185960f1c3b3110165f7d059940e5efa29855570" Oct 09 14:40:32 crc kubenswrapper[4902]: E1009 14:40:32.514628 4902 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gbt7s_openshift-machine-config-operator(6cfbac91-e798-4e5e-9f3c-f454ea6f457e)\"" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" podUID="6cfbac91-e798-4e5e-9f3c-f454ea6f457e" Oct 09 14:40:46 crc kubenswrapper[4902]: I1009 14:40:46.513393 4902 scope.go:117] "RemoveContainer" containerID="abd517e906ef4a75c8e7630d185960f1c3b3110165f7d059940e5efa29855570" Oct 09 14:40:46 crc kubenswrapper[4902]: E1009 14:40:46.514361 4902 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gbt7s_openshift-machine-config-operator(6cfbac91-e798-4e5e-9f3c-f454ea6f457e)\"" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" podUID="6cfbac91-e798-4e5e-9f3c-f454ea6f457e" Oct 09 14:40:59 crc kubenswrapper[4902]: I1009 14:40:59.513370 4902 scope.go:117] "RemoveContainer" containerID="abd517e906ef4a75c8e7630d185960f1c3b3110165f7d059940e5efa29855570" Oct 09 14:40:59 crc kubenswrapper[4902]: E1009 14:40:59.514091 4902 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gbt7s_openshift-machine-config-operator(6cfbac91-e798-4e5e-9f3c-f454ea6f457e)\"" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" podUID="6cfbac91-e798-4e5e-9f3c-f454ea6f457e" Oct 09 14:41:12 crc kubenswrapper[4902]: I1009 14:41:12.513397 4902 scope.go:117] "RemoveContainer" containerID="abd517e906ef4a75c8e7630d185960f1c3b3110165f7d059940e5efa29855570" Oct 09 14:41:12 crc kubenswrapper[4902]: E1009 14:41:12.514244 4902 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gbt7s_openshift-machine-config-operator(6cfbac91-e798-4e5e-9f3c-f454ea6f457e)\"" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" podUID="6cfbac91-e798-4e5e-9f3c-f454ea6f457e" Oct 09 14:41:27 crc kubenswrapper[4902]: I1009 14:41:27.542256 4902 scope.go:117] "RemoveContainer" containerID="abd517e906ef4a75c8e7630d185960f1c3b3110165f7d059940e5efa29855570" Oct 09 14:41:27 crc kubenswrapper[4902]: E1009 14:41:27.543068 4902 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gbt7s_openshift-machine-config-operator(6cfbac91-e798-4e5e-9f3c-f454ea6f457e)\"" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" podUID="6cfbac91-e798-4e5e-9f3c-f454ea6f457e" Oct 09 14:41:41 crc kubenswrapper[4902]: I1009 14:41:41.514297 4902 scope.go:117] "RemoveContainer" containerID="abd517e906ef4a75c8e7630d185960f1c3b3110165f7d059940e5efa29855570" Oct 09 14:41:41 crc kubenswrapper[4902]: E1009 14:41:41.515172 4902 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gbt7s_openshift-machine-config-operator(6cfbac91-e798-4e5e-9f3c-f454ea6f457e)\"" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" podUID="6cfbac91-e798-4e5e-9f3c-f454ea6f457e" Oct 09 14:41:52 crc kubenswrapper[4902]: I1009 14:41:52.513212 4902 scope.go:117] "RemoveContainer" containerID="abd517e906ef4a75c8e7630d185960f1c3b3110165f7d059940e5efa29855570" Oct 09 14:41:52 crc kubenswrapper[4902]: E1009 14:41:52.514075 4902 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gbt7s_openshift-machine-config-operator(6cfbac91-e798-4e5e-9f3c-f454ea6f457e)\"" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" podUID="6cfbac91-e798-4e5e-9f3c-f454ea6f457e" Oct 09 14:42:07 crc kubenswrapper[4902]: I1009 14:42:07.514823 4902 scope.go:117] "RemoveContainer" containerID="abd517e906ef4a75c8e7630d185960f1c3b3110165f7d059940e5efa29855570" Oct 09 14:42:07 crc kubenswrapper[4902]: E1009 14:42:07.515647 4902 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gbt7s_openshift-machine-config-operator(6cfbac91-e798-4e5e-9f3c-f454ea6f457e)\"" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" podUID="6cfbac91-e798-4e5e-9f3c-f454ea6f457e" Oct 09 14:42:22 crc kubenswrapper[4902]: I1009 14:42:22.512949 4902 scope.go:117] "RemoveContainer" containerID="abd517e906ef4a75c8e7630d185960f1c3b3110165f7d059940e5efa29855570" Oct 09 14:42:22 crc kubenswrapper[4902]: E1009 14:42:22.513593 4902 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gbt7s_openshift-machine-config-operator(6cfbac91-e798-4e5e-9f3c-f454ea6f457e)\"" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" podUID="6cfbac91-e798-4e5e-9f3c-f454ea6f457e" Oct 09 14:42:37 crc kubenswrapper[4902]: I1009 14:42:37.513278 4902 scope.go:117] "RemoveContainer" containerID="abd517e906ef4a75c8e7630d185960f1c3b3110165f7d059940e5efa29855570" Oct 09 14:42:37 crc kubenswrapper[4902]: E1009 14:42:37.514023 4902 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gbt7s_openshift-machine-config-operator(6cfbac91-e798-4e5e-9f3c-f454ea6f457e)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" podUID="6cfbac91-e798-4e5e-9f3c-f454ea6f457e" Oct 09 14:42:48 crc kubenswrapper[4902]: I1009 14:42:48.512802 4902 scope.go:117] "RemoveContainer" containerID="abd517e906ef4a75c8e7630d185960f1c3b3110165f7d059940e5efa29855570" Oct 09 14:42:48 crc kubenswrapper[4902]: E1009 14:42:48.513536 4902 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gbt7s_openshift-machine-config-operator(6cfbac91-e798-4e5e-9f3c-f454ea6f457e)\"" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" podUID="6cfbac91-e798-4e5e-9f3c-f454ea6f457e" Oct 09 14:43:01 crc kubenswrapper[4902]: I1009 14:43:01.514311 4902 scope.go:117] "RemoveContainer" containerID="abd517e906ef4a75c8e7630d185960f1c3b3110165f7d059940e5efa29855570" Oct 09 14:43:01 crc kubenswrapper[4902]: E1009 14:43:01.515017 4902 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gbt7s_openshift-machine-config-operator(6cfbac91-e798-4e5e-9f3c-f454ea6f457e)\"" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" podUID="6cfbac91-e798-4e5e-9f3c-f454ea6f457e" Oct 09 14:43:10 crc kubenswrapper[4902]: I1009 14:43:10.003594 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-j7khm"] Oct 09 14:43:10 crc kubenswrapper[4902]: E1009 14:43:10.004438 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36b7b3c1-27a1-4fa2-ad98-05d2904d292c" containerName="registry-server" Oct 09 14:43:10 crc kubenswrapper[4902]: I1009 14:43:10.004454 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="36b7b3c1-27a1-4fa2-ad98-05d2904d292c" containerName="registry-server" Oct 09 14:43:10 crc kubenswrapper[4902]: E1009 14:43:10.004480 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36b7b3c1-27a1-4fa2-ad98-05d2904d292c" containerName="extract-utilities" Oct 09 14:43:10 crc kubenswrapper[4902]: I1009 14:43:10.004488 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="36b7b3c1-27a1-4fa2-ad98-05d2904d292c" containerName="extract-utilities" Oct 09 14:43:10 crc kubenswrapper[4902]: E1009 14:43:10.004507 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36b7b3c1-27a1-4fa2-ad98-05d2904d292c" containerName="extract-content" Oct 09 14:43:10 crc kubenswrapper[4902]: I1009 14:43:10.004515 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="36b7b3c1-27a1-4fa2-ad98-05d2904d292c" containerName="extract-content" Oct 09 14:43:10 crc kubenswrapper[4902]: I1009 14:43:10.004763 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="36b7b3c1-27a1-4fa2-ad98-05d2904d292c" containerName="registry-server" Oct 09 14:43:10 crc kubenswrapper[4902]: I1009 14:43:10.006083 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-j7khm" Oct 09 14:43:10 crc kubenswrapper[4902]: I1009 14:43:10.014777 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-j7khm"] Oct 09 14:43:10 crc kubenswrapper[4902]: I1009 14:43:10.084978 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kj9t6\" (UniqueName: \"kubernetes.io/projected/e4646ab1-25d3-4523-a237-e7593820e775-kube-api-access-kj9t6\") pod \"certified-operators-j7khm\" (UID: \"e4646ab1-25d3-4523-a237-e7593820e775\") " pod="openshift-marketplace/certified-operators-j7khm" Oct 09 14:43:10 crc kubenswrapper[4902]: I1009 14:43:10.085086 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e4646ab1-25d3-4523-a237-e7593820e775-utilities\") pod \"certified-operators-j7khm\" (UID: \"e4646ab1-25d3-4523-a237-e7593820e775\") " pod="openshift-marketplace/certified-operators-j7khm" Oct 09 14:43:10 crc kubenswrapper[4902]: I1009 14:43:10.085255 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e4646ab1-25d3-4523-a237-e7593820e775-catalog-content\") pod \"certified-operators-j7khm\" (UID: \"e4646ab1-25d3-4523-a237-e7593820e775\") " pod="openshift-marketplace/certified-operators-j7khm" Oct 09 14:43:10 crc kubenswrapper[4902]: I1009 14:43:10.187636 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e4646ab1-25d3-4523-a237-e7593820e775-catalog-content\") pod \"certified-operators-j7khm\" (UID: \"e4646ab1-25d3-4523-a237-e7593820e775\") " pod="openshift-marketplace/certified-operators-j7khm" Oct 09 14:43:10 crc kubenswrapper[4902]: I1009 14:43:10.187721 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kj9t6\" (UniqueName: \"kubernetes.io/projected/e4646ab1-25d3-4523-a237-e7593820e775-kube-api-access-kj9t6\") pod \"certified-operators-j7khm\" (UID: \"e4646ab1-25d3-4523-a237-e7593820e775\") " pod="openshift-marketplace/certified-operators-j7khm" Oct 09 14:43:10 crc kubenswrapper[4902]: I1009 14:43:10.187785 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e4646ab1-25d3-4523-a237-e7593820e775-utilities\") pod \"certified-operators-j7khm\" (UID: \"e4646ab1-25d3-4523-a237-e7593820e775\") " pod="openshift-marketplace/certified-operators-j7khm" Oct 09 14:43:10 crc kubenswrapper[4902]: I1009 14:43:10.188346 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e4646ab1-25d3-4523-a237-e7593820e775-utilities\") pod \"certified-operators-j7khm\" (UID: \"e4646ab1-25d3-4523-a237-e7593820e775\") " pod="openshift-marketplace/certified-operators-j7khm" Oct 09 14:43:10 crc kubenswrapper[4902]: I1009 14:43:10.188359 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e4646ab1-25d3-4523-a237-e7593820e775-catalog-content\") pod \"certified-operators-j7khm\" (UID: \"e4646ab1-25d3-4523-a237-e7593820e775\") " pod="openshift-marketplace/certified-operators-j7khm" Oct 09 14:43:10 crc kubenswrapper[4902]: I1009 14:43:10.206941 4902 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-kj9t6\" (UniqueName: \"kubernetes.io/projected/e4646ab1-25d3-4523-a237-e7593820e775-kube-api-access-kj9t6\") pod \"certified-operators-j7khm\" (UID: \"e4646ab1-25d3-4523-a237-e7593820e775\") " pod="openshift-marketplace/certified-operators-j7khm" Oct 09 14:43:10 crc kubenswrapper[4902]: I1009 14:43:10.327799 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-j7khm" Oct 09 14:43:10 crc kubenswrapper[4902]: I1009 14:43:10.799329 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-j7khm"] Oct 09 14:43:11 crc kubenswrapper[4902]: I1009 14:43:11.471480 4902 generic.go:334] "Generic (PLEG): container finished" podID="e4646ab1-25d3-4523-a237-e7593820e775" containerID="196d7555f83cafe4a9f2a17b821589a91160e42f7ef90f3ecb557710aa965482" exitCode=0 Oct 09 14:43:11 crc kubenswrapper[4902]: I1009 14:43:11.471748 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j7khm" event={"ID":"e4646ab1-25d3-4523-a237-e7593820e775","Type":"ContainerDied","Data":"196d7555f83cafe4a9f2a17b821589a91160e42f7ef90f3ecb557710aa965482"} Oct 09 14:43:11 crc kubenswrapper[4902]: I1009 14:43:11.471774 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j7khm" event={"ID":"e4646ab1-25d3-4523-a237-e7593820e775","Type":"ContainerStarted","Data":"bef6d87a47e225fb4c5f78e35babb514f83b33550d0f360dce900c34ae8774c0"} Oct 09 14:43:11 crc kubenswrapper[4902]: I1009 14:43:11.473324 4902 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 09 14:43:12 crc kubenswrapper[4902]: I1009 14:43:12.483217 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j7khm" event={"ID":"e4646ab1-25d3-4523-a237-e7593820e775","Type":"ContainerStarted","Data":"ba4a73cf4c7d782a9b0b085ea71cab4c6530a8841a48c31d54cecec0832608db"} Oct 09 14:43:13 crc kubenswrapper[4902]: I1009 14:43:13.493591 4902 generic.go:334] "Generic (PLEG): container finished" podID="e4646ab1-25d3-4523-a237-e7593820e775" containerID="ba4a73cf4c7d782a9b0b085ea71cab4c6530a8841a48c31d54cecec0832608db" exitCode=0 Oct 09 14:43:13 crc kubenswrapper[4902]: I1009 14:43:13.493672 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j7khm" event={"ID":"e4646ab1-25d3-4523-a237-e7593820e775","Type":"ContainerDied","Data":"ba4a73cf4c7d782a9b0b085ea71cab4c6530a8841a48c31d54cecec0832608db"} Oct 09 14:43:14 crc kubenswrapper[4902]: I1009 14:43:14.504346 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j7khm" event={"ID":"e4646ab1-25d3-4523-a237-e7593820e775","Type":"ContainerStarted","Data":"7ba8ba6fd182173cc905c27d383454dbb03e928b212d81e5dcd076d43287c1fe"} Oct 09 14:43:14 crc kubenswrapper[4902]: I1009 14:43:14.529990 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-j7khm" podStartSLOduration=2.936771544 podStartE2EDuration="5.529968797s" podCreationTimestamp="2025-10-09 14:43:09 +0000 UTC" firstStartedPulling="2025-10-09 14:43:11.473141835 +0000 UTC m=+3138.671000899" lastFinishedPulling="2025-10-09 14:43:14.066339088 +0000 UTC m=+3141.264198152" observedRunningTime="2025-10-09 14:43:14.521770802 +0000 UTC m=+3141.719629876" watchObservedRunningTime="2025-10-09 
14:43:14.529968797 +0000 UTC m=+3141.727827861" Oct 09 14:43:15 crc kubenswrapper[4902]: I1009 14:43:15.512821 4902 scope.go:117] "RemoveContainer" containerID="abd517e906ef4a75c8e7630d185960f1c3b3110165f7d059940e5efa29855570" Oct 09 14:43:15 crc kubenswrapper[4902]: E1009 14:43:15.513618 4902 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gbt7s_openshift-machine-config-operator(6cfbac91-e798-4e5e-9f3c-f454ea6f457e)\"" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" podUID="6cfbac91-e798-4e5e-9f3c-f454ea6f457e" Oct 09 14:43:20 crc kubenswrapper[4902]: I1009 14:43:20.328581 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-j7khm" Oct 09 14:43:20 crc kubenswrapper[4902]: I1009 14:43:20.329203 4902 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-j7khm" Oct 09 14:43:20 crc kubenswrapper[4902]: I1009 14:43:20.383028 4902 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-j7khm" Oct 09 14:43:20 crc kubenswrapper[4902]: I1009 14:43:20.629034 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-j7khm" Oct 09 14:43:20 crc kubenswrapper[4902]: I1009 14:43:20.686023 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-j7khm"] Oct 09 14:43:22 crc kubenswrapper[4902]: I1009 14:43:22.597774 4902 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-j7khm" podUID="e4646ab1-25d3-4523-a237-e7593820e775" containerName="registry-server" containerID="cri-o://7ba8ba6fd182173cc905c27d383454dbb03e928b212d81e5dcd076d43287c1fe" gracePeriod=2 Oct 09 14:43:23 crc kubenswrapper[4902]: I1009 14:43:23.085883 4902 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-j7khm" Oct 09 14:43:23 crc kubenswrapper[4902]: I1009 14:43:23.237600 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e4646ab1-25d3-4523-a237-e7593820e775-utilities\") pod \"e4646ab1-25d3-4523-a237-e7593820e775\" (UID: \"e4646ab1-25d3-4523-a237-e7593820e775\") " Oct 09 14:43:23 crc kubenswrapper[4902]: I1009 14:43:23.237800 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e4646ab1-25d3-4523-a237-e7593820e775-catalog-content\") pod \"e4646ab1-25d3-4523-a237-e7593820e775\" (UID: \"e4646ab1-25d3-4523-a237-e7593820e775\") " Oct 09 14:43:23 crc kubenswrapper[4902]: I1009 14:43:23.237862 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kj9t6\" (UniqueName: \"kubernetes.io/projected/e4646ab1-25d3-4523-a237-e7593820e775-kube-api-access-kj9t6\") pod \"e4646ab1-25d3-4523-a237-e7593820e775\" (UID: \"e4646ab1-25d3-4523-a237-e7593820e775\") " Oct 09 14:43:23 crc kubenswrapper[4902]: I1009 14:43:23.238802 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e4646ab1-25d3-4523-a237-e7593820e775-utilities" (OuterVolumeSpecName: "utilities") pod "e4646ab1-25d3-4523-a237-e7593820e775" (UID: "e4646ab1-25d3-4523-a237-e7593820e775"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 14:43:23 crc kubenswrapper[4902]: I1009 14:43:23.245019 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e4646ab1-25d3-4523-a237-e7593820e775-kube-api-access-kj9t6" (OuterVolumeSpecName: "kube-api-access-kj9t6") pod "e4646ab1-25d3-4523-a237-e7593820e775" (UID: "e4646ab1-25d3-4523-a237-e7593820e775"). InnerVolumeSpecName "kube-api-access-kj9t6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 14:43:23 crc kubenswrapper[4902]: I1009 14:43:23.286466 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e4646ab1-25d3-4523-a237-e7593820e775-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e4646ab1-25d3-4523-a237-e7593820e775" (UID: "e4646ab1-25d3-4523-a237-e7593820e775"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 14:43:23 crc kubenswrapper[4902]: I1009 14:43:23.340473 4902 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e4646ab1-25d3-4523-a237-e7593820e775-utilities\") on node \"crc\" DevicePath \"\"" Oct 09 14:43:23 crc kubenswrapper[4902]: I1009 14:43:23.340509 4902 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e4646ab1-25d3-4523-a237-e7593820e775-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 09 14:43:23 crc kubenswrapper[4902]: I1009 14:43:23.340521 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kj9t6\" (UniqueName: \"kubernetes.io/projected/e4646ab1-25d3-4523-a237-e7593820e775-kube-api-access-kj9t6\") on node \"crc\" DevicePath \"\"" Oct 09 14:43:23 crc kubenswrapper[4902]: I1009 14:43:23.609861 4902 generic.go:334] "Generic (PLEG): container finished" podID="e4646ab1-25d3-4523-a237-e7593820e775" containerID="7ba8ba6fd182173cc905c27d383454dbb03e928b212d81e5dcd076d43287c1fe" exitCode=0 Oct 09 14:43:23 crc kubenswrapper[4902]: I1009 14:43:23.609919 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j7khm" event={"ID":"e4646ab1-25d3-4523-a237-e7593820e775","Type":"ContainerDied","Data":"7ba8ba6fd182173cc905c27d383454dbb03e928b212d81e5dcd076d43287c1fe"} Oct 09 14:43:23 crc kubenswrapper[4902]: I1009 14:43:23.609951 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j7khm" event={"ID":"e4646ab1-25d3-4523-a237-e7593820e775","Type":"ContainerDied","Data":"bef6d87a47e225fb4c5f78e35babb514f83b33550d0f360dce900c34ae8774c0"} Oct 09 14:43:23 crc kubenswrapper[4902]: I1009 14:43:23.609971 4902 scope.go:117] "RemoveContainer" containerID="7ba8ba6fd182173cc905c27d383454dbb03e928b212d81e5dcd076d43287c1fe" Oct 09 14:43:23 crc kubenswrapper[4902]: I1009 14:43:23.610142 4902 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-j7khm" Oct 09 14:43:23 crc kubenswrapper[4902]: I1009 14:43:23.632312 4902 scope.go:117] "RemoveContainer" containerID="ba4a73cf4c7d782a9b0b085ea71cab4c6530a8841a48c31d54cecec0832608db" Oct 09 14:43:23 crc kubenswrapper[4902]: I1009 14:43:23.632802 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-j7khm"] Oct 09 14:43:23 crc kubenswrapper[4902]: I1009 14:43:23.640868 4902 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-j7khm"] Oct 09 14:43:23 crc kubenswrapper[4902]: I1009 14:43:23.655592 4902 scope.go:117] "RemoveContainer" containerID="196d7555f83cafe4a9f2a17b821589a91160e42f7ef90f3ecb557710aa965482" Oct 09 14:43:23 crc kubenswrapper[4902]: I1009 14:43:23.699273 4902 scope.go:117] "RemoveContainer" containerID="7ba8ba6fd182173cc905c27d383454dbb03e928b212d81e5dcd076d43287c1fe" Oct 09 14:43:23 crc kubenswrapper[4902]: E1009 14:43:23.699826 4902 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7ba8ba6fd182173cc905c27d383454dbb03e928b212d81e5dcd076d43287c1fe\": container with ID starting with 7ba8ba6fd182173cc905c27d383454dbb03e928b212d81e5dcd076d43287c1fe not found: ID does not exist" containerID="7ba8ba6fd182173cc905c27d383454dbb03e928b212d81e5dcd076d43287c1fe" Oct 09 14:43:23 crc kubenswrapper[4902]: I1009 14:43:23.699870 4902 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ba8ba6fd182173cc905c27d383454dbb03e928b212d81e5dcd076d43287c1fe"} err="failed to get container status \"7ba8ba6fd182173cc905c27d383454dbb03e928b212d81e5dcd076d43287c1fe\": rpc error: code = NotFound desc = could not find container \"7ba8ba6fd182173cc905c27d383454dbb03e928b212d81e5dcd076d43287c1fe\": container with ID starting with 7ba8ba6fd182173cc905c27d383454dbb03e928b212d81e5dcd076d43287c1fe not found: ID does not exist" Oct 09 14:43:23 crc kubenswrapper[4902]: I1009 14:43:23.699899 4902 scope.go:117] "RemoveContainer" containerID="ba4a73cf4c7d782a9b0b085ea71cab4c6530a8841a48c31d54cecec0832608db" Oct 09 14:43:23 crc kubenswrapper[4902]: E1009 14:43:23.700257 4902 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ba4a73cf4c7d782a9b0b085ea71cab4c6530a8841a48c31d54cecec0832608db\": container with ID starting with ba4a73cf4c7d782a9b0b085ea71cab4c6530a8841a48c31d54cecec0832608db not found: ID does not exist" containerID="ba4a73cf4c7d782a9b0b085ea71cab4c6530a8841a48c31d54cecec0832608db" Oct 09 14:43:23 crc kubenswrapper[4902]: I1009 14:43:23.700286 4902 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba4a73cf4c7d782a9b0b085ea71cab4c6530a8841a48c31d54cecec0832608db"} err="failed to get container status \"ba4a73cf4c7d782a9b0b085ea71cab4c6530a8841a48c31d54cecec0832608db\": rpc error: code = NotFound desc = could not find container \"ba4a73cf4c7d782a9b0b085ea71cab4c6530a8841a48c31d54cecec0832608db\": container with ID starting with ba4a73cf4c7d782a9b0b085ea71cab4c6530a8841a48c31d54cecec0832608db not found: ID does not exist" Oct 09 14:43:23 crc kubenswrapper[4902]: I1009 14:43:23.700310 4902 scope.go:117] "RemoveContainer" containerID="196d7555f83cafe4a9f2a17b821589a91160e42f7ef90f3ecb557710aa965482" Oct 09 14:43:23 crc kubenswrapper[4902]: E1009 14:43:23.700624 4902 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"196d7555f83cafe4a9f2a17b821589a91160e42f7ef90f3ecb557710aa965482\": container with ID starting with 196d7555f83cafe4a9f2a17b821589a91160e42f7ef90f3ecb557710aa965482 not found: ID does not exist" containerID="196d7555f83cafe4a9f2a17b821589a91160e42f7ef90f3ecb557710aa965482" Oct 09 14:43:23 crc kubenswrapper[4902]: I1009 14:43:23.700655 4902 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"196d7555f83cafe4a9f2a17b821589a91160e42f7ef90f3ecb557710aa965482"} err="failed to get container status \"196d7555f83cafe4a9f2a17b821589a91160e42f7ef90f3ecb557710aa965482\": rpc error: code = NotFound desc = could not find container \"196d7555f83cafe4a9f2a17b821589a91160e42f7ef90f3ecb557710aa965482\": container with ID starting with 196d7555f83cafe4a9f2a17b821589a91160e42f7ef90f3ecb557710aa965482 not found: ID does not exist" Oct 09 14:43:25 crc kubenswrapper[4902]: I1009 14:43:25.551280 4902 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e4646ab1-25d3-4523-a237-e7593820e775" path="/var/lib/kubelet/pods/e4646ab1-25d3-4523-a237-e7593820e775/volumes" Oct 09 14:43:28 crc kubenswrapper[4902]: I1009 14:43:28.513284 4902 scope.go:117] "RemoveContainer" containerID="abd517e906ef4a75c8e7630d185960f1c3b3110165f7d059940e5efa29855570" Oct 09 14:43:28 crc kubenswrapper[4902]: E1009 14:43:28.513921 4902 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gbt7s_openshift-machine-config-operator(6cfbac91-e798-4e5e-9f3c-f454ea6f457e)\"" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" podUID="6cfbac91-e798-4e5e-9f3c-f454ea6f457e" Oct 09 14:43:39 crc kubenswrapper[4902]: I1009 14:43:39.513567 4902 scope.go:117] "RemoveContainer" containerID="abd517e906ef4a75c8e7630d185960f1c3b3110165f7d059940e5efa29855570" Oct 09 14:43:39 crc kubenswrapper[4902]: E1009 14:43:39.514544 4902 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gbt7s_openshift-machine-config-operator(6cfbac91-e798-4e5e-9f3c-f454ea6f457e)\"" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" podUID="6cfbac91-e798-4e5e-9f3c-f454ea6f457e" Oct 09 14:43:50 crc kubenswrapper[4902]: I1009 14:43:50.513571 4902 scope.go:117] "RemoveContainer" containerID="abd517e906ef4a75c8e7630d185960f1c3b3110165f7d059940e5efa29855570" Oct 09 14:43:50 crc kubenswrapper[4902]: E1009 14:43:50.514324 4902 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gbt7s_openshift-machine-config-operator(6cfbac91-e798-4e5e-9f3c-f454ea6f457e)\"" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" podUID="6cfbac91-e798-4e5e-9f3c-f454ea6f457e" Oct 09 14:44:02 crc kubenswrapper[4902]: I1009 14:44:02.513038 4902 scope.go:117] "RemoveContainer" containerID="abd517e906ef4a75c8e7630d185960f1c3b3110165f7d059940e5efa29855570" Oct 09 14:44:02 crc kubenswrapper[4902]: E1009 14:44:02.514191 4902 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gbt7s_openshift-machine-config-operator(6cfbac91-e798-4e5e-9f3c-f454ea6f457e)\"" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" podUID="6cfbac91-e798-4e5e-9f3c-f454ea6f457e" Oct 09 14:44:17 crc kubenswrapper[4902]: I1009 14:44:17.513219 4902 scope.go:117] "RemoveContainer" containerID="abd517e906ef4a75c8e7630d185960f1c3b3110165f7d059940e5efa29855570" Oct 09 14:44:17 crc kubenswrapper[4902]: E1009 14:44:17.514066 4902 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gbt7s_openshift-machine-config-operator(6cfbac91-e798-4e5e-9f3c-f454ea6f457e)\"" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" podUID="6cfbac91-e798-4e5e-9f3c-f454ea6f457e" Oct 09 14:44:27 crc kubenswrapper[4902]: I1009 14:44:27.512299 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-vkpqd"] Oct 09 14:44:27 crc kubenswrapper[4902]: E1009 14:44:27.513483 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4646ab1-25d3-4523-a237-e7593820e775" containerName="extract-content" Oct 09 14:44:27 crc kubenswrapper[4902]: I1009 14:44:27.513502 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4646ab1-25d3-4523-a237-e7593820e775" containerName="extract-content" Oct 09 14:44:27 crc kubenswrapper[4902]: E1009 14:44:27.513519 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4646ab1-25d3-4523-a237-e7593820e775" containerName="registry-server" Oct 09 14:44:27 crc kubenswrapper[4902]: I1009 14:44:27.513525 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4646ab1-25d3-4523-a237-e7593820e775" containerName="registry-server" Oct 09 14:44:27 crc kubenswrapper[4902]: E1009 14:44:27.513557 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4646ab1-25d3-4523-a237-e7593820e775" containerName="extract-utilities" Oct 09 14:44:27 crc kubenswrapper[4902]: I1009 14:44:27.513564 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4646ab1-25d3-4523-a237-e7593820e775" containerName="extract-utilities" Oct 09 14:44:27 crc kubenswrapper[4902]: I1009 14:44:27.513803 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4646ab1-25d3-4523-a237-e7593820e775" containerName="registry-server" Oct 09 14:44:27 crc kubenswrapper[4902]: I1009 14:44:27.515456 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vkpqd" Oct 09 14:44:27 crc kubenswrapper[4902]: I1009 14:44:27.557118 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vkpqd"] Oct 09 14:44:27 crc kubenswrapper[4902]: I1009 14:44:27.662160 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca688e78-1c95-475a-a4ab-f966811712a3-utilities\") pod \"community-operators-vkpqd\" (UID: \"ca688e78-1c95-475a-a4ab-f966811712a3\") " pod="openshift-marketplace/community-operators-vkpqd" Oct 09 14:44:27 crc kubenswrapper[4902]: I1009 14:44:27.662275 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca688e78-1c95-475a-a4ab-f966811712a3-catalog-content\") pod \"community-operators-vkpqd\" (UID: \"ca688e78-1c95-475a-a4ab-f966811712a3\") " pod="openshift-marketplace/community-operators-vkpqd" Oct 09 14:44:27 crc kubenswrapper[4902]: I1009 14:44:27.662477 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xzbv5\" (UniqueName: \"kubernetes.io/projected/ca688e78-1c95-475a-a4ab-f966811712a3-kube-api-access-xzbv5\") pod \"community-operators-vkpqd\" (UID: \"ca688e78-1c95-475a-a4ab-f966811712a3\") " pod="openshift-marketplace/community-operators-vkpqd" Oct 09 14:44:27 crc kubenswrapper[4902]: I1009 14:44:27.765365 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xzbv5\" (UniqueName: \"kubernetes.io/projected/ca688e78-1c95-475a-a4ab-f966811712a3-kube-api-access-xzbv5\") pod \"community-operators-vkpqd\" (UID: \"ca688e78-1c95-475a-a4ab-f966811712a3\") " pod="openshift-marketplace/community-operators-vkpqd" Oct 09 14:44:27 crc kubenswrapper[4902]: I1009 14:44:27.765547 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca688e78-1c95-475a-a4ab-f966811712a3-utilities\") pod \"community-operators-vkpqd\" (UID: \"ca688e78-1c95-475a-a4ab-f966811712a3\") " pod="openshift-marketplace/community-operators-vkpqd" Oct 09 14:44:27 crc kubenswrapper[4902]: I1009 14:44:27.765627 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca688e78-1c95-475a-a4ab-f966811712a3-catalog-content\") pod \"community-operators-vkpqd\" (UID: \"ca688e78-1c95-475a-a4ab-f966811712a3\") " pod="openshift-marketplace/community-operators-vkpqd" Oct 09 14:44:27 crc kubenswrapper[4902]: I1009 14:44:27.766342 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca688e78-1c95-475a-a4ab-f966811712a3-catalog-content\") pod \"community-operators-vkpqd\" (UID: \"ca688e78-1c95-475a-a4ab-f966811712a3\") " pod="openshift-marketplace/community-operators-vkpqd" Oct 09 14:44:27 crc kubenswrapper[4902]: I1009 14:44:27.766364 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca688e78-1c95-475a-a4ab-f966811712a3-utilities\") pod \"community-operators-vkpqd\" (UID: \"ca688e78-1c95-475a-a4ab-f966811712a3\") " pod="openshift-marketplace/community-operators-vkpqd" Oct 09 14:44:27 crc kubenswrapper[4902]: I1009 14:44:27.789006 4902 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-xzbv5\" (UniqueName: \"kubernetes.io/projected/ca688e78-1c95-475a-a4ab-f966811712a3-kube-api-access-xzbv5\") pod \"community-operators-vkpqd\" (UID: \"ca688e78-1c95-475a-a4ab-f966811712a3\") " pod="openshift-marketplace/community-operators-vkpqd" Oct 09 14:44:27 crc kubenswrapper[4902]: I1009 14:44:27.855581 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vkpqd" Oct 09 14:44:28 crc kubenswrapper[4902]: I1009 14:44:28.426866 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vkpqd"] Oct 09 14:44:29 crc kubenswrapper[4902]: I1009 14:44:29.212777 4902 generic.go:334] "Generic (PLEG): container finished" podID="ca688e78-1c95-475a-a4ab-f966811712a3" containerID="8c2c01b5e3d17e06e7d3f89cf87ba6a170cb9c3ccd5a6bddfe6fe835373808d7" exitCode=0 Oct 09 14:44:29 crc kubenswrapper[4902]: I1009 14:44:29.212874 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vkpqd" event={"ID":"ca688e78-1c95-475a-a4ab-f966811712a3","Type":"ContainerDied","Data":"8c2c01b5e3d17e06e7d3f89cf87ba6a170cb9c3ccd5a6bddfe6fe835373808d7"} Oct 09 14:44:29 crc kubenswrapper[4902]: I1009 14:44:29.213105 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vkpqd" event={"ID":"ca688e78-1c95-475a-a4ab-f966811712a3","Type":"ContainerStarted","Data":"e5e08bbbd4edf78a84eb197e663aa70d5b6a016874adc3732dd5106a1f00e2a8"} Oct 09 14:44:29 crc kubenswrapper[4902]: I1009 14:44:29.514335 4902 scope.go:117] "RemoveContainer" containerID="abd517e906ef4a75c8e7630d185960f1c3b3110165f7d059940e5efa29855570" Oct 09 14:44:30 crc kubenswrapper[4902]: I1009 14:44:30.224074 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" event={"ID":"6cfbac91-e798-4e5e-9f3c-f454ea6f457e","Type":"ContainerStarted","Data":"836b83499a7687597763e91231b51d3d804da0b87b415eafc611189612815f6e"} Oct 09 14:44:30 crc kubenswrapper[4902]: I1009 14:44:30.235458 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vkpqd" event={"ID":"ca688e78-1c95-475a-a4ab-f966811712a3","Type":"ContainerStarted","Data":"a76e8c4ae02344f206a72dfc2f89c632209c4500b0c4d98b94297253605f72a1"} Oct 09 14:44:31 crc kubenswrapper[4902]: I1009 14:44:31.247807 4902 generic.go:334] "Generic (PLEG): container finished" podID="ca688e78-1c95-475a-a4ab-f966811712a3" containerID="a76e8c4ae02344f206a72dfc2f89c632209c4500b0c4d98b94297253605f72a1" exitCode=0 Oct 09 14:44:31 crc kubenswrapper[4902]: I1009 14:44:31.247963 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vkpqd" event={"ID":"ca688e78-1c95-475a-a4ab-f966811712a3","Type":"ContainerDied","Data":"a76e8c4ae02344f206a72dfc2f89c632209c4500b0c4d98b94297253605f72a1"} Oct 09 14:44:32 crc kubenswrapper[4902]: I1009 14:44:32.260098 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vkpqd" event={"ID":"ca688e78-1c95-475a-a4ab-f966811712a3","Type":"ContainerStarted","Data":"68f34acdc4b39a9f87025864d24ff671d881bef129f51f5f491f1c7c64d2b0d1"} Oct 09 14:44:32 crc kubenswrapper[4902]: I1009 14:44:32.278938 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-vkpqd" podStartSLOduration=2.788709442 
podStartE2EDuration="5.278918212s" podCreationTimestamp="2025-10-09 14:44:27 +0000 UTC" firstStartedPulling="2025-10-09 14:44:29.215100109 +0000 UTC m=+3216.412959193" lastFinishedPulling="2025-10-09 14:44:31.705308899 +0000 UTC m=+3218.903167963" observedRunningTime="2025-10-09 14:44:32.276611376 +0000 UTC m=+3219.474470460" watchObservedRunningTime="2025-10-09 14:44:32.278918212 +0000 UTC m=+3219.476777276" Oct 09 14:44:37 crc kubenswrapper[4902]: I1009 14:44:37.856615 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-vkpqd" Oct 09 14:44:37 crc kubenswrapper[4902]: I1009 14:44:37.857364 4902 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-vkpqd" Oct 09 14:44:37 crc kubenswrapper[4902]: I1009 14:44:37.910256 4902 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-vkpqd" Oct 09 14:44:38 crc kubenswrapper[4902]: I1009 14:44:38.370803 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-vkpqd" Oct 09 14:44:38 crc kubenswrapper[4902]: I1009 14:44:38.432687 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vkpqd"] Oct 09 14:44:40 crc kubenswrapper[4902]: I1009 14:44:40.338771 4902 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-vkpqd" podUID="ca688e78-1c95-475a-a4ab-f966811712a3" containerName="registry-server" containerID="cri-o://68f34acdc4b39a9f87025864d24ff671d881bef129f51f5f491f1c7c64d2b0d1" gracePeriod=2 Oct 09 14:44:40 crc kubenswrapper[4902]: I1009 14:44:40.827979 4902 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vkpqd" Oct 09 14:44:40 crc kubenswrapper[4902]: I1009 14:44:40.926527 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca688e78-1c95-475a-a4ab-f966811712a3-utilities\") pod \"ca688e78-1c95-475a-a4ab-f966811712a3\" (UID: \"ca688e78-1c95-475a-a4ab-f966811712a3\") " Oct 09 14:44:40 crc kubenswrapper[4902]: I1009 14:44:40.926625 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xzbv5\" (UniqueName: \"kubernetes.io/projected/ca688e78-1c95-475a-a4ab-f966811712a3-kube-api-access-xzbv5\") pod \"ca688e78-1c95-475a-a4ab-f966811712a3\" (UID: \"ca688e78-1c95-475a-a4ab-f966811712a3\") " Oct 09 14:44:40 crc kubenswrapper[4902]: I1009 14:44:40.926708 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca688e78-1c95-475a-a4ab-f966811712a3-catalog-content\") pod \"ca688e78-1c95-475a-a4ab-f966811712a3\" (UID: \"ca688e78-1c95-475a-a4ab-f966811712a3\") " Oct 09 14:44:40 crc kubenswrapper[4902]: I1009 14:44:40.927552 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ca688e78-1c95-475a-a4ab-f966811712a3-utilities" (OuterVolumeSpecName: "utilities") pod "ca688e78-1c95-475a-a4ab-f966811712a3" (UID: "ca688e78-1c95-475a-a4ab-f966811712a3"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 14:44:40 crc kubenswrapper[4902]: I1009 14:44:40.932970 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca688e78-1c95-475a-a4ab-f966811712a3-kube-api-access-xzbv5" (OuterVolumeSpecName: "kube-api-access-xzbv5") pod "ca688e78-1c95-475a-a4ab-f966811712a3" (UID: "ca688e78-1c95-475a-a4ab-f966811712a3"). InnerVolumeSpecName "kube-api-access-xzbv5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 14:44:40 crc kubenswrapper[4902]: I1009 14:44:40.977881 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ca688e78-1c95-475a-a4ab-f966811712a3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ca688e78-1c95-475a-a4ab-f966811712a3" (UID: "ca688e78-1c95-475a-a4ab-f966811712a3"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 14:44:41 crc kubenswrapper[4902]: I1009 14:44:41.029741 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xzbv5\" (UniqueName: \"kubernetes.io/projected/ca688e78-1c95-475a-a4ab-f966811712a3-kube-api-access-xzbv5\") on node \"crc\" DevicePath \"\"" Oct 09 14:44:41 crc kubenswrapper[4902]: I1009 14:44:41.029796 4902 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca688e78-1c95-475a-a4ab-f966811712a3-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 09 14:44:41 crc kubenswrapper[4902]: I1009 14:44:41.029810 4902 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca688e78-1c95-475a-a4ab-f966811712a3-utilities\") on node \"crc\" DevicePath \"\"" Oct 09 14:44:41 crc kubenswrapper[4902]: I1009 14:44:41.352318 4902 generic.go:334] "Generic (PLEG): container finished" podID="ca688e78-1c95-475a-a4ab-f966811712a3" containerID="68f34acdc4b39a9f87025864d24ff671d881bef129f51f5f491f1c7c64d2b0d1" exitCode=0 Oct 09 14:44:41 crc kubenswrapper[4902]: I1009 14:44:41.352359 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vkpqd" event={"ID":"ca688e78-1c95-475a-a4ab-f966811712a3","Type":"ContainerDied","Data":"68f34acdc4b39a9f87025864d24ff671d881bef129f51f5f491f1c7c64d2b0d1"} Oct 09 14:44:41 crc kubenswrapper[4902]: I1009 14:44:41.352389 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vkpqd" event={"ID":"ca688e78-1c95-475a-a4ab-f966811712a3","Type":"ContainerDied","Data":"e5e08bbbd4edf78a84eb197e663aa70d5b6a016874adc3732dd5106a1f00e2a8"} Oct 09 14:44:41 crc kubenswrapper[4902]: I1009 14:44:41.352405 4902 scope.go:117] "RemoveContainer" containerID="68f34acdc4b39a9f87025864d24ff671d881bef129f51f5f491f1c7c64d2b0d1" Oct 09 14:44:41 crc kubenswrapper[4902]: I1009 14:44:41.352509 4902 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vkpqd" Oct 09 14:44:41 crc kubenswrapper[4902]: I1009 14:44:41.375509 4902 scope.go:117] "RemoveContainer" containerID="a76e8c4ae02344f206a72dfc2f89c632209c4500b0c4d98b94297253605f72a1" Oct 09 14:44:41 crc kubenswrapper[4902]: I1009 14:44:41.392091 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vkpqd"] Oct 09 14:44:41 crc kubenswrapper[4902]: I1009 14:44:41.400645 4902 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-vkpqd"] Oct 09 14:44:41 crc kubenswrapper[4902]: I1009 14:44:41.405289 4902 scope.go:117] "RemoveContainer" containerID="8c2c01b5e3d17e06e7d3f89cf87ba6a170cb9c3ccd5a6bddfe6fe835373808d7" Oct 09 14:44:41 crc kubenswrapper[4902]: I1009 14:44:41.443719 4902 scope.go:117] "RemoveContainer" containerID="68f34acdc4b39a9f87025864d24ff671d881bef129f51f5f491f1c7c64d2b0d1" Oct 09 14:44:41 crc kubenswrapper[4902]: E1009 14:44:41.444163 4902 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"68f34acdc4b39a9f87025864d24ff671d881bef129f51f5f491f1c7c64d2b0d1\": container with ID starting with 68f34acdc4b39a9f87025864d24ff671d881bef129f51f5f491f1c7c64d2b0d1 not found: ID does not exist" containerID="68f34acdc4b39a9f87025864d24ff671d881bef129f51f5f491f1c7c64d2b0d1" Oct 09 14:44:41 crc kubenswrapper[4902]: I1009 14:44:41.444225 4902 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"68f34acdc4b39a9f87025864d24ff671d881bef129f51f5f491f1c7c64d2b0d1"} err="failed to get container status \"68f34acdc4b39a9f87025864d24ff671d881bef129f51f5f491f1c7c64d2b0d1\": rpc error: code = NotFound desc = could not find container \"68f34acdc4b39a9f87025864d24ff671d881bef129f51f5f491f1c7c64d2b0d1\": container with ID starting with 68f34acdc4b39a9f87025864d24ff671d881bef129f51f5f491f1c7c64d2b0d1 not found: ID does not exist" Oct 09 14:44:41 crc kubenswrapper[4902]: I1009 14:44:41.444259 4902 scope.go:117] "RemoveContainer" containerID="a76e8c4ae02344f206a72dfc2f89c632209c4500b0c4d98b94297253605f72a1" Oct 09 14:44:41 crc kubenswrapper[4902]: E1009 14:44:41.444991 4902 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a76e8c4ae02344f206a72dfc2f89c632209c4500b0c4d98b94297253605f72a1\": container with ID starting with a76e8c4ae02344f206a72dfc2f89c632209c4500b0c4d98b94297253605f72a1 not found: ID does not exist" containerID="a76e8c4ae02344f206a72dfc2f89c632209c4500b0c4d98b94297253605f72a1" Oct 09 14:44:41 crc kubenswrapper[4902]: I1009 14:44:41.445027 4902 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a76e8c4ae02344f206a72dfc2f89c632209c4500b0c4d98b94297253605f72a1"} err="failed to get container status \"a76e8c4ae02344f206a72dfc2f89c632209c4500b0c4d98b94297253605f72a1\": rpc error: code = NotFound desc = could not find container \"a76e8c4ae02344f206a72dfc2f89c632209c4500b0c4d98b94297253605f72a1\": container with ID starting with a76e8c4ae02344f206a72dfc2f89c632209c4500b0c4d98b94297253605f72a1 not found: ID does not exist" Oct 09 14:44:41 crc kubenswrapper[4902]: I1009 14:44:41.445049 4902 scope.go:117] "RemoveContainer" containerID="8c2c01b5e3d17e06e7d3f89cf87ba6a170cb9c3ccd5a6bddfe6fe835373808d7" Oct 09 14:44:41 crc kubenswrapper[4902]: E1009 14:44:41.445494 4902 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"8c2c01b5e3d17e06e7d3f89cf87ba6a170cb9c3ccd5a6bddfe6fe835373808d7\": container with ID starting with 8c2c01b5e3d17e06e7d3f89cf87ba6a170cb9c3ccd5a6bddfe6fe835373808d7 not found: ID does not exist" containerID="8c2c01b5e3d17e06e7d3f89cf87ba6a170cb9c3ccd5a6bddfe6fe835373808d7" Oct 09 14:44:41 crc kubenswrapper[4902]: I1009 14:44:41.445527 4902 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c2c01b5e3d17e06e7d3f89cf87ba6a170cb9c3ccd5a6bddfe6fe835373808d7"} err="failed to get container status \"8c2c01b5e3d17e06e7d3f89cf87ba6a170cb9c3ccd5a6bddfe6fe835373808d7\": rpc error: code = NotFound desc = could not find container \"8c2c01b5e3d17e06e7d3f89cf87ba6a170cb9c3ccd5a6bddfe6fe835373808d7\": container with ID starting with 8c2c01b5e3d17e06e7d3f89cf87ba6a170cb9c3ccd5a6bddfe6fe835373808d7 not found: ID does not exist" Oct 09 14:44:41 crc kubenswrapper[4902]: I1009 14:44:41.524479 4902 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca688e78-1c95-475a-a4ab-f966811712a3" path="/var/lib/kubelet/pods/ca688e78-1c95-475a-a4ab-f966811712a3/volumes" Oct 09 14:45:00 crc kubenswrapper[4902]: I1009 14:45:00.196799 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29333685-t4rw7"] Oct 09 14:45:00 crc kubenswrapper[4902]: E1009 14:45:00.197841 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca688e78-1c95-475a-a4ab-f966811712a3" containerName="extract-content" Oct 09 14:45:00 crc kubenswrapper[4902]: I1009 14:45:00.197859 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca688e78-1c95-475a-a4ab-f966811712a3" containerName="extract-content" Oct 09 14:45:00 crc kubenswrapper[4902]: E1009 14:45:00.197876 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca688e78-1c95-475a-a4ab-f966811712a3" containerName="registry-server" Oct 09 14:45:00 crc kubenswrapper[4902]: I1009 14:45:00.197883 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca688e78-1c95-475a-a4ab-f966811712a3" containerName="registry-server" Oct 09 14:45:00 crc kubenswrapper[4902]: E1009 14:45:00.197896 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca688e78-1c95-475a-a4ab-f966811712a3" containerName="extract-utilities" Oct 09 14:45:00 crc kubenswrapper[4902]: I1009 14:45:00.197904 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca688e78-1c95-475a-a4ab-f966811712a3" containerName="extract-utilities" Oct 09 14:45:00 crc kubenswrapper[4902]: I1009 14:45:00.198143 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca688e78-1c95-475a-a4ab-f966811712a3" containerName="registry-server" Oct 09 14:45:00 crc kubenswrapper[4902]: I1009 14:45:00.199130 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29333685-t4rw7" Oct 09 14:45:00 crc kubenswrapper[4902]: I1009 14:45:00.201641 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 09 14:45:00 crc kubenswrapper[4902]: I1009 14:45:00.203095 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 09 14:45:00 crc kubenswrapper[4902]: I1009 14:45:00.215764 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29333685-t4rw7"] Oct 09 14:45:00 crc kubenswrapper[4902]: I1009 14:45:00.306975 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a4877705-296f-4392-8ac7-c6b43e3b8529-config-volume\") pod \"collect-profiles-29333685-t4rw7\" (UID: \"a4877705-296f-4392-8ac7-c6b43e3b8529\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333685-t4rw7" Oct 09 14:45:00 crc kubenswrapper[4902]: I1009 14:45:00.307436 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ck7k5\" (UniqueName: \"kubernetes.io/projected/a4877705-296f-4392-8ac7-c6b43e3b8529-kube-api-access-ck7k5\") pod \"collect-profiles-29333685-t4rw7\" (UID: \"a4877705-296f-4392-8ac7-c6b43e3b8529\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333685-t4rw7" Oct 09 14:45:00 crc kubenswrapper[4902]: I1009 14:45:00.307651 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a4877705-296f-4392-8ac7-c6b43e3b8529-secret-volume\") pod \"collect-profiles-29333685-t4rw7\" (UID: \"a4877705-296f-4392-8ac7-c6b43e3b8529\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333685-t4rw7" Oct 09 14:45:00 crc kubenswrapper[4902]: I1009 14:45:00.409736 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ck7k5\" (UniqueName: \"kubernetes.io/projected/a4877705-296f-4392-8ac7-c6b43e3b8529-kube-api-access-ck7k5\") pod \"collect-profiles-29333685-t4rw7\" (UID: \"a4877705-296f-4392-8ac7-c6b43e3b8529\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333685-t4rw7" Oct 09 14:45:00 crc kubenswrapper[4902]: I1009 14:45:00.411062 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a4877705-296f-4392-8ac7-c6b43e3b8529-secret-volume\") pod \"collect-profiles-29333685-t4rw7\" (UID: \"a4877705-296f-4392-8ac7-c6b43e3b8529\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333685-t4rw7" Oct 09 14:45:00 crc kubenswrapper[4902]: I1009 14:45:00.411328 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a4877705-296f-4392-8ac7-c6b43e3b8529-config-volume\") pod \"collect-profiles-29333685-t4rw7\" (UID: \"a4877705-296f-4392-8ac7-c6b43e3b8529\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333685-t4rw7" Oct 09 14:45:00 crc kubenswrapper[4902]: I1009 14:45:00.412467 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a4877705-296f-4392-8ac7-c6b43e3b8529-config-volume\") pod 
\"collect-profiles-29333685-t4rw7\" (UID: \"a4877705-296f-4392-8ac7-c6b43e3b8529\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333685-t4rw7" Oct 09 14:45:00 crc kubenswrapper[4902]: I1009 14:45:00.420388 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a4877705-296f-4392-8ac7-c6b43e3b8529-secret-volume\") pod \"collect-profiles-29333685-t4rw7\" (UID: \"a4877705-296f-4392-8ac7-c6b43e3b8529\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333685-t4rw7" Oct 09 14:45:00 crc kubenswrapper[4902]: I1009 14:45:00.431363 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ck7k5\" (UniqueName: \"kubernetes.io/projected/a4877705-296f-4392-8ac7-c6b43e3b8529-kube-api-access-ck7k5\") pod \"collect-profiles-29333685-t4rw7\" (UID: \"a4877705-296f-4392-8ac7-c6b43e3b8529\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333685-t4rw7" Oct 09 14:45:00 crc kubenswrapper[4902]: I1009 14:45:00.531275 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29333685-t4rw7" Oct 09 14:45:00 crc kubenswrapper[4902]: I1009 14:45:00.989276 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29333685-t4rw7"] Oct 09 14:45:01 crc kubenswrapper[4902]: I1009 14:45:01.542398 4902 generic.go:334] "Generic (PLEG): container finished" podID="a4877705-296f-4392-8ac7-c6b43e3b8529" containerID="cd2a2413a95d94c739b607a70773d59a0ef93c26a0442537bd3cea43ba272ede" exitCode=0 Oct 09 14:45:01 crc kubenswrapper[4902]: I1009 14:45:01.542459 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29333685-t4rw7" event={"ID":"a4877705-296f-4392-8ac7-c6b43e3b8529","Type":"ContainerDied","Data":"cd2a2413a95d94c739b607a70773d59a0ef93c26a0442537bd3cea43ba272ede"} Oct 09 14:45:01 crc kubenswrapper[4902]: I1009 14:45:01.543555 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29333685-t4rw7" event={"ID":"a4877705-296f-4392-8ac7-c6b43e3b8529","Type":"ContainerStarted","Data":"9684625122eb2c162368e7a5716f984cfe684bc91ff68f1a2538b1b6aaebf476"} Oct 09 14:45:02 crc kubenswrapper[4902]: I1009 14:45:02.902435 4902 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29333685-t4rw7" Oct 09 14:45:02 crc kubenswrapper[4902]: I1009 14:45:02.962456 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a4877705-296f-4392-8ac7-c6b43e3b8529-secret-volume\") pod \"a4877705-296f-4392-8ac7-c6b43e3b8529\" (UID: \"a4877705-296f-4392-8ac7-c6b43e3b8529\") " Oct 09 14:45:02 crc kubenswrapper[4902]: I1009 14:45:02.962541 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ck7k5\" (UniqueName: \"kubernetes.io/projected/a4877705-296f-4392-8ac7-c6b43e3b8529-kube-api-access-ck7k5\") pod \"a4877705-296f-4392-8ac7-c6b43e3b8529\" (UID: \"a4877705-296f-4392-8ac7-c6b43e3b8529\") " Oct 09 14:45:02 crc kubenswrapper[4902]: I1009 14:45:02.962645 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a4877705-296f-4392-8ac7-c6b43e3b8529-config-volume\") pod \"a4877705-296f-4392-8ac7-c6b43e3b8529\" (UID: \"a4877705-296f-4392-8ac7-c6b43e3b8529\") " Oct 09 14:45:02 crc kubenswrapper[4902]: I1009 14:45:02.963345 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a4877705-296f-4392-8ac7-c6b43e3b8529-config-volume" (OuterVolumeSpecName: "config-volume") pod "a4877705-296f-4392-8ac7-c6b43e3b8529" (UID: "a4877705-296f-4392-8ac7-c6b43e3b8529"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 14:45:02 crc kubenswrapper[4902]: I1009 14:45:02.968506 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4877705-296f-4392-8ac7-c6b43e3b8529-kube-api-access-ck7k5" (OuterVolumeSpecName: "kube-api-access-ck7k5") pod "a4877705-296f-4392-8ac7-c6b43e3b8529" (UID: "a4877705-296f-4392-8ac7-c6b43e3b8529"). InnerVolumeSpecName "kube-api-access-ck7k5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 14:45:02 crc kubenswrapper[4902]: I1009 14:45:02.968670 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4877705-296f-4392-8ac7-c6b43e3b8529-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "a4877705-296f-4392-8ac7-c6b43e3b8529" (UID: "a4877705-296f-4392-8ac7-c6b43e3b8529"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:45:03 crc kubenswrapper[4902]: I1009 14:45:03.064528 4902 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a4877705-296f-4392-8ac7-c6b43e3b8529-config-volume\") on node \"crc\" DevicePath \"\"" Oct 09 14:45:03 crc kubenswrapper[4902]: I1009 14:45:03.064569 4902 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a4877705-296f-4392-8ac7-c6b43e3b8529-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 09 14:45:03 crc kubenswrapper[4902]: I1009 14:45:03.064581 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ck7k5\" (UniqueName: \"kubernetes.io/projected/a4877705-296f-4392-8ac7-c6b43e3b8529-kube-api-access-ck7k5\") on node \"crc\" DevicePath \"\"" Oct 09 14:45:03 crc kubenswrapper[4902]: I1009 14:45:03.572719 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29333685-t4rw7" event={"ID":"a4877705-296f-4392-8ac7-c6b43e3b8529","Type":"ContainerDied","Data":"9684625122eb2c162368e7a5716f984cfe684bc91ff68f1a2538b1b6aaebf476"} Oct 09 14:45:03 crc kubenswrapper[4902]: I1009 14:45:03.572756 4902 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9684625122eb2c162368e7a5716f984cfe684bc91ff68f1a2538b1b6aaebf476" Oct 09 14:45:03 crc kubenswrapper[4902]: I1009 14:45:03.572771 4902 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29333685-t4rw7" Oct 09 14:45:03 crc kubenswrapper[4902]: I1009 14:45:03.983903 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29333640-dlvs7"] Oct 09 14:45:03 crc kubenswrapper[4902]: I1009 14:45:03.994567 4902 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29333640-dlvs7"] Oct 09 14:45:05 crc kubenswrapper[4902]: I1009 14:45:05.532025 4902 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="10411e0c-6c14-4ede-9c44-e252a84a39cb" path="/var/lib/kubelet/pods/10411e0c-6c14-4ede-9c44-e252a84a39cb/volumes" Oct 09 14:46:05 crc kubenswrapper[4902]: I1009 14:46:05.160347 4902 scope.go:117] "RemoveContainer" containerID="5e16d5daf85e63114be5f50fdb22be47d97d269e7269eff41d9878902d78f21f" Oct 09 14:46:50 crc kubenswrapper[4902]: I1009 14:46:50.078303 4902 patch_prober.go:28] interesting pod/machine-config-daemon-gbt7s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 14:46:50 crc kubenswrapper[4902]: I1009 14:46:50.078884 4902 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" podUID="6cfbac91-e798-4e5e-9f3c-f454ea6f457e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 14:47:18 crc kubenswrapper[4902]: I1009 14:47:18.272570 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-wsqfj"] Oct 09 14:47:18 crc kubenswrapper[4902]: E1009 14:47:18.273668 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4877705-296f-4392-8ac7-c6b43e3b8529" 
containerName="collect-profiles" Oct 09 14:47:18 crc kubenswrapper[4902]: I1009 14:47:18.273687 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4877705-296f-4392-8ac7-c6b43e3b8529" containerName="collect-profiles" Oct 09 14:47:18 crc kubenswrapper[4902]: I1009 14:47:18.273971 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4877705-296f-4392-8ac7-c6b43e3b8529" containerName="collect-profiles" Oct 09 14:47:18 crc kubenswrapper[4902]: I1009 14:47:18.275778 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wsqfj" Oct 09 14:47:18 crc kubenswrapper[4902]: I1009 14:47:18.288032 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wsqfj"] Oct 09 14:47:18 crc kubenswrapper[4902]: I1009 14:47:18.434904 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/78eb6f26-575c-4421-938c-60c9a65da811-utilities\") pod \"redhat-operators-wsqfj\" (UID: \"78eb6f26-575c-4421-938c-60c9a65da811\") " pod="openshift-marketplace/redhat-operators-wsqfj" Oct 09 14:47:18 crc kubenswrapper[4902]: I1009 14:47:18.435089 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/78eb6f26-575c-4421-938c-60c9a65da811-catalog-content\") pod \"redhat-operators-wsqfj\" (UID: \"78eb6f26-575c-4421-938c-60c9a65da811\") " pod="openshift-marketplace/redhat-operators-wsqfj" Oct 09 14:47:18 crc kubenswrapper[4902]: I1009 14:47:18.435243 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8kbvh\" (UniqueName: \"kubernetes.io/projected/78eb6f26-575c-4421-938c-60c9a65da811-kube-api-access-8kbvh\") pod \"redhat-operators-wsqfj\" (UID: \"78eb6f26-575c-4421-938c-60c9a65da811\") " pod="openshift-marketplace/redhat-operators-wsqfj" Oct 09 14:47:18 crc kubenswrapper[4902]: I1009 14:47:18.537683 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/78eb6f26-575c-4421-938c-60c9a65da811-utilities\") pod \"redhat-operators-wsqfj\" (UID: \"78eb6f26-575c-4421-938c-60c9a65da811\") " pod="openshift-marketplace/redhat-operators-wsqfj" Oct 09 14:47:18 crc kubenswrapper[4902]: I1009 14:47:18.537782 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/78eb6f26-575c-4421-938c-60c9a65da811-catalog-content\") pod \"redhat-operators-wsqfj\" (UID: \"78eb6f26-575c-4421-938c-60c9a65da811\") " pod="openshift-marketplace/redhat-operators-wsqfj" Oct 09 14:47:18 crc kubenswrapper[4902]: I1009 14:47:18.537870 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8kbvh\" (UniqueName: \"kubernetes.io/projected/78eb6f26-575c-4421-938c-60c9a65da811-kube-api-access-8kbvh\") pod \"redhat-operators-wsqfj\" (UID: \"78eb6f26-575c-4421-938c-60c9a65da811\") " pod="openshift-marketplace/redhat-operators-wsqfj" Oct 09 14:47:18 crc kubenswrapper[4902]: I1009 14:47:18.538282 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/78eb6f26-575c-4421-938c-60c9a65da811-catalog-content\") pod \"redhat-operators-wsqfj\" (UID: \"78eb6f26-575c-4421-938c-60c9a65da811\") " 
pod="openshift-marketplace/redhat-operators-wsqfj" Oct 09 14:47:18 crc kubenswrapper[4902]: I1009 14:47:18.538282 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/78eb6f26-575c-4421-938c-60c9a65da811-utilities\") pod \"redhat-operators-wsqfj\" (UID: \"78eb6f26-575c-4421-938c-60c9a65da811\") " pod="openshift-marketplace/redhat-operators-wsqfj" Oct 09 14:47:18 crc kubenswrapper[4902]: I1009 14:47:18.560461 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8kbvh\" (UniqueName: \"kubernetes.io/projected/78eb6f26-575c-4421-938c-60c9a65da811-kube-api-access-8kbvh\") pod \"redhat-operators-wsqfj\" (UID: \"78eb6f26-575c-4421-938c-60c9a65da811\") " pod="openshift-marketplace/redhat-operators-wsqfj" Oct 09 14:47:18 crc kubenswrapper[4902]: I1009 14:47:18.602396 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wsqfj" Oct 09 14:47:19 crc kubenswrapper[4902]: I1009 14:47:19.103753 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wsqfj"] Oct 09 14:47:19 crc kubenswrapper[4902]: I1009 14:47:19.804380 4902 generic.go:334] "Generic (PLEG): container finished" podID="78eb6f26-575c-4421-938c-60c9a65da811" containerID="a5dbc39c9b390ef6f8e424be72ab65447065d0225d1c7b8f7e65f0203353e17c" exitCode=0 Oct 09 14:47:19 crc kubenswrapper[4902]: I1009 14:47:19.804463 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wsqfj" event={"ID":"78eb6f26-575c-4421-938c-60c9a65da811","Type":"ContainerDied","Data":"a5dbc39c9b390ef6f8e424be72ab65447065d0225d1c7b8f7e65f0203353e17c"} Oct 09 14:47:19 crc kubenswrapper[4902]: I1009 14:47:19.804897 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wsqfj" event={"ID":"78eb6f26-575c-4421-938c-60c9a65da811","Type":"ContainerStarted","Data":"621e22c0a9710b262bcd01e7e9250bc289cac6870b8cbca20cae526326b9d369"} Oct 09 14:47:20 crc kubenswrapper[4902]: I1009 14:47:20.078142 4902 patch_prober.go:28] interesting pod/machine-config-daemon-gbt7s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 14:47:20 crc kubenswrapper[4902]: I1009 14:47:20.078210 4902 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" podUID="6cfbac91-e798-4e5e-9f3c-f454ea6f457e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 14:47:20 crc kubenswrapper[4902]: I1009 14:47:20.817366 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wsqfj" event={"ID":"78eb6f26-575c-4421-938c-60c9a65da811","Type":"ContainerStarted","Data":"fd4ace2219304f769845b4f66ec8ca4d7a1e4da9c24521af253c2f6ea9312a9a"} Oct 09 14:47:21 crc kubenswrapper[4902]: I1009 14:47:21.828536 4902 generic.go:334] "Generic (PLEG): container finished" podID="78eb6f26-575c-4421-938c-60c9a65da811" containerID="fd4ace2219304f769845b4f66ec8ca4d7a1e4da9c24521af253c2f6ea9312a9a" exitCode=0 Oct 09 14:47:21 crc kubenswrapper[4902]: I1009 14:47:21.828595 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-wsqfj" event={"ID":"78eb6f26-575c-4421-938c-60c9a65da811","Type":"ContainerDied","Data":"fd4ace2219304f769845b4f66ec8ca4d7a1e4da9c24521af253c2f6ea9312a9a"} Oct 09 14:47:22 crc kubenswrapper[4902]: I1009 14:47:22.839680 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wsqfj" event={"ID":"78eb6f26-575c-4421-938c-60c9a65da811","Type":"ContainerStarted","Data":"c1c512fd9eb1b084d9c71561cbd4d9b263e585dd2b4dc1fda07b2c9d57c8f4dc"} Oct 09 14:47:22 crc kubenswrapper[4902]: I1009 14:47:22.862868 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-wsqfj" podStartSLOduration=2.345975778 podStartE2EDuration="4.86284813s" podCreationTimestamp="2025-10-09 14:47:18 +0000 UTC" firstStartedPulling="2025-10-09 14:47:19.806249953 +0000 UTC m=+3387.004109017" lastFinishedPulling="2025-10-09 14:47:22.323122305 +0000 UTC m=+3389.520981369" observedRunningTime="2025-10-09 14:47:22.855973422 +0000 UTC m=+3390.053832506" watchObservedRunningTime="2025-10-09 14:47:22.86284813 +0000 UTC m=+3390.060707194" Oct 09 14:47:28 crc kubenswrapper[4902]: I1009 14:47:28.603016 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-wsqfj" Oct 09 14:47:28 crc kubenswrapper[4902]: I1009 14:47:28.603525 4902 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-wsqfj" Oct 09 14:47:28 crc kubenswrapper[4902]: I1009 14:47:28.668165 4902 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-wsqfj" Oct 09 14:47:28 crc kubenswrapper[4902]: I1009 14:47:28.947188 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-wsqfj" Oct 09 14:47:29 crc kubenswrapper[4902]: I1009 14:47:29.003306 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-wsqfj"] Oct 09 14:47:30 crc kubenswrapper[4902]: I1009 14:47:30.917360 4902 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-wsqfj" podUID="78eb6f26-575c-4421-938c-60c9a65da811" containerName="registry-server" containerID="cri-o://c1c512fd9eb1b084d9c71561cbd4d9b263e585dd2b4dc1fda07b2c9d57c8f4dc" gracePeriod=2 Oct 09 14:47:31 crc kubenswrapper[4902]: I1009 14:47:31.393984 4902 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-wsqfj" Oct 09 14:47:31 crc kubenswrapper[4902]: I1009 14:47:31.495313 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/78eb6f26-575c-4421-938c-60c9a65da811-utilities\") pod \"78eb6f26-575c-4421-938c-60c9a65da811\" (UID: \"78eb6f26-575c-4421-938c-60c9a65da811\") " Oct 09 14:47:31 crc kubenswrapper[4902]: I1009 14:47:31.495493 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8kbvh\" (UniqueName: \"kubernetes.io/projected/78eb6f26-575c-4421-938c-60c9a65da811-kube-api-access-8kbvh\") pod \"78eb6f26-575c-4421-938c-60c9a65da811\" (UID: \"78eb6f26-575c-4421-938c-60c9a65da811\") " Oct 09 14:47:31 crc kubenswrapper[4902]: I1009 14:47:31.495578 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/78eb6f26-575c-4421-938c-60c9a65da811-catalog-content\") pod \"78eb6f26-575c-4421-938c-60c9a65da811\" (UID: \"78eb6f26-575c-4421-938c-60c9a65da811\") " Oct 09 14:47:31 crc kubenswrapper[4902]: I1009 14:47:31.496571 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/78eb6f26-575c-4421-938c-60c9a65da811-utilities" (OuterVolumeSpecName: "utilities") pod "78eb6f26-575c-4421-938c-60c9a65da811" (UID: "78eb6f26-575c-4421-938c-60c9a65da811"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 14:47:31 crc kubenswrapper[4902]: I1009 14:47:31.503694 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/78eb6f26-575c-4421-938c-60c9a65da811-kube-api-access-8kbvh" (OuterVolumeSpecName: "kube-api-access-8kbvh") pod "78eb6f26-575c-4421-938c-60c9a65da811" (UID: "78eb6f26-575c-4421-938c-60c9a65da811"). InnerVolumeSpecName "kube-api-access-8kbvh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 14:47:31 crc kubenswrapper[4902]: I1009 14:47:31.599233 4902 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/78eb6f26-575c-4421-938c-60c9a65da811-utilities\") on node \"crc\" DevicePath \"\"" Oct 09 14:47:31 crc kubenswrapper[4902]: I1009 14:47:31.599275 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8kbvh\" (UniqueName: \"kubernetes.io/projected/78eb6f26-575c-4421-938c-60c9a65da811-kube-api-access-8kbvh\") on node \"crc\" DevicePath \"\"" Oct 09 14:47:31 crc kubenswrapper[4902]: I1009 14:47:31.936958 4902 generic.go:334] "Generic (PLEG): container finished" podID="78eb6f26-575c-4421-938c-60c9a65da811" containerID="c1c512fd9eb1b084d9c71561cbd4d9b263e585dd2b4dc1fda07b2c9d57c8f4dc" exitCode=0 Oct 09 14:47:31 crc kubenswrapper[4902]: I1009 14:47:31.937047 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wsqfj" event={"ID":"78eb6f26-575c-4421-938c-60c9a65da811","Type":"ContainerDied","Data":"c1c512fd9eb1b084d9c71561cbd4d9b263e585dd2b4dc1fda07b2c9d57c8f4dc"} Oct 09 14:47:31 crc kubenswrapper[4902]: I1009 14:47:31.937081 4902 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-wsqfj" Oct 09 14:47:31 crc kubenswrapper[4902]: I1009 14:47:31.937120 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wsqfj" event={"ID":"78eb6f26-575c-4421-938c-60c9a65da811","Type":"ContainerDied","Data":"621e22c0a9710b262bcd01e7e9250bc289cac6870b8cbca20cae526326b9d369"} Oct 09 14:47:31 crc kubenswrapper[4902]: I1009 14:47:31.937163 4902 scope.go:117] "RemoveContainer" containerID="c1c512fd9eb1b084d9c71561cbd4d9b263e585dd2b4dc1fda07b2c9d57c8f4dc" Oct 09 14:47:31 crc kubenswrapper[4902]: I1009 14:47:31.966949 4902 scope.go:117] "RemoveContainer" containerID="fd4ace2219304f769845b4f66ec8ca4d7a1e4da9c24521af253c2f6ea9312a9a" Oct 09 14:47:32 crc kubenswrapper[4902]: I1009 14:47:32.003663 4902 scope.go:117] "RemoveContainer" containerID="a5dbc39c9b390ef6f8e424be72ab65447065d0225d1c7b8f7e65f0203353e17c" Oct 09 14:47:32 crc kubenswrapper[4902]: I1009 14:47:32.044961 4902 scope.go:117] "RemoveContainer" containerID="c1c512fd9eb1b084d9c71561cbd4d9b263e585dd2b4dc1fda07b2c9d57c8f4dc" Oct 09 14:47:32 crc kubenswrapper[4902]: E1009 14:47:32.045597 4902 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c1c512fd9eb1b084d9c71561cbd4d9b263e585dd2b4dc1fda07b2c9d57c8f4dc\": container with ID starting with c1c512fd9eb1b084d9c71561cbd4d9b263e585dd2b4dc1fda07b2c9d57c8f4dc not found: ID does not exist" containerID="c1c512fd9eb1b084d9c71561cbd4d9b263e585dd2b4dc1fda07b2c9d57c8f4dc" Oct 09 14:47:32 crc kubenswrapper[4902]: I1009 14:47:32.045656 4902 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c1c512fd9eb1b084d9c71561cbd4d9b263e585dd2b4dc1fda07b2c9d57c8f4dc"} err="failed to get container status \"c1c512fd9eb1b084d9c71561cbd4d9b263e585dd2b4dc1fda07b2c9d57c8f4dc\": rpc error: code = NotFound desc = could not find container \"c1c512fd9eb1b084d9c71561cbd4d9b263e585dd2b4dc1fda07b2c9d57c8f4dc\": container with ID starting with c1c512fd9eb1b084d9c71561cbd4d9b263e585dd2b4dc1fda07b2c9d57c8f4dc not found: ID does not exist" Oct 09 14:47:32 crc kubenswrapper[4902]: I1009 14:47:32.045695 4902 scope.go:117] "RemoveContainer" containerID="fd4ace2219304f769845b4f66ec8ca4d7a1e4da9c24521af253c2f6ea9312a9a" Oct 09 14:47:32 crc kubenswrapper[4902]: E1009 14:47:32.046253 4902 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fd4ace2219304f769845b4f66ec8ca4d7a1e4da9c24521af253c2f6ea9312a9a\": container with ID starting with fd4ace2219304f769845b4f66ec8ca4d7a1e4da9c24521af253c2f6ea9312a9a not found: ID does not exist" containerID="fd4ace2219304f769845b4f66ec8ca4d7a1e4da9c24521af253c2f6ea9312a9a" Oct 09 14:47:32 crc kubenswrapper[4902]: I1009 14:47:32.046309 4902 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd4ace2219304f769845b4f66ec8ca4d7a1e4da9c24521af253c2f6ea9312a9a"} err="failed to get container status \"fd4ace2219304f769845b4f66ec8ca4d7a1e4da9c24521af253c2f6ea9312a9a\": rpc error: code = NotFound desc = could not find container \"fd4ace2219304f769845b4f66ec8ca4d7a1e4da9c24521af253c2f6ea9312a9a\": container with ID starting with fd4ace2219304f769845b4f66ec8ca4d7a1e4da9c24521af253c2f6ea9312a9a not found: ID does not exist" Oct 09 14:47:32 crc kubenswrapper[4902]: I1009 14:47:32.046347 4902 scope.go:117] "RemoveContainer" 
containerID="a5dbc39c9b390ef6f8e424be72ab65447065d0225d1c7b8f7e65f0203353e17c" Oct 09 14:47:32 crc kubenswrapper[4902]: E1009 14:47:32.046768 4902 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a5dbc39c9b390ef6f8e424be72ab65447065d0225d1c7b8f7e65f0203353e17c\": container with ID starting with a5dbc39c9b390ef6f8e424be72ab65447065d0225d1c7b8f7e65f0203353e17c not found: ID does not exist" containerID="a5dbc39c9b390ef6f8e424be72ab65447065d0225d1c7b8f7e65f0203353e17c" Oct 09 14:47:32 crc kubenswrapper[4902]: I1009 14:47:32.046828 4902 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a5dbc39c9b390ef6f8e424be72ab65447065d0225d1c7b8f7e65f0203353e17c"} err="failed to get container status \"a5dbc39c9b390ef6f8e424be72ab65447065d0225d1c7b8f7e65f0203353e17c\": rpc error: code = NotFound desc = could not find container \"a5dbc39c9b390ef6f8e424be72ab65447065d0225d1c7b8f7e65f0203353e17c\": container with ID starting with a5dbc39c9b390ef6f8e424be72ab65447065d0225d1c7b8f7e65f0203353e17c not found: ID does not exist" Oct 09 14:47:32 crc kubenswrapper[4902]: I1009 14:47:32.325037 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/78eb6f26-575c-4421-938c-60c9a65da811-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "78eb6f26-575c-4421-938c-60c9a65da811" (UID: "78eb6f26-575c-4421-938c-60c9a65da811"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 14:47:32 crc kubenswrapper[4902]: I1009 14:47:32.416923 4902 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/78eb6f26-575c-4421-938c-60c9a65da811-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 09 14:47:32 crc kubenswrapper[4902]: I1009 14:47:32.569532 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-wsqfj"] Oct 09 14:47:32 crc kubenswrapper[4902]: I1009 14:47:32.578183 4902 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-wsqfj"] Oct 09 14:47:33 crc kubenswrapper[4902]: I1009 14:47:33.546272 4902 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="78eb6f26-575c-4421-938c-60c9a65da811" path="/var/lib/kubelet/pods/78eb6f26-575c-4421-938c-60c9a65da811/volumes" Oct 09 14:47:50 crc kubenswrapper[4902]: I1009 14:47:50.079011 4902 patch_prober.go:28] interesting pod/machine-config-daemon-gbt7s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 14:47:50 crc kubenswrapper[4902]: I1009 14:47:50.079799 4902 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" podUID="6cfbac91-e798-4e5e-9f3c-f454ea6f457e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 14:47:50 crc kubenswrapper[4902]: I1009 14:47:50.079857 4902 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" Oct 09 14:47:50 crc kubenswrapper[4902]: I1009 14:47:50.080760 4902 kuberuntime_manager.go:1027] "Message for Container of pod" 
containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"836b83499a7687597763e91231b51d3d804da0b87b415eafc611189612815f6e"} pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 09 14:47:50 crc kubenswrapper[4902]: I1009 14:47:50.080826 4902 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" podUID="6cfbac91-e798-4e5e-9f3c-f454ea6f457e" containerName="machine-config-daemon" containerID="cri-o://836b83499a7687597763e91231b51d3d804da0b87b415eafc611189612815f6e" gracePeriod=600 Oct 09 14:47:50 crc kubenswrapper[4902]: I1009 14:47:50.121820 4902 generic.go:334] "Generic (PLEG): container finished" podID="ee0ede17-c9e7-40c7-b2da-ac04b4df9010" containerID="0c95617a0869641f0fc4d0a135d29842ae4868039be79703cd8b5ad72266dbd1" exitCode=0 Oct 09 14:47:50 crc kubenswrapper[4902]: I1009 14:47:50.121866 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"ee0ede17-c9e7-40c7-b2da-ac04b4df9010","Type":"ContainerDied","Data":"0c95617a0869641f0fc4d0a135d29842ae4868039be79703cd8b5ad72266dbd1"} Oct 09 14:47:51 crc kubenswrapper[4902]: I1009 14:47:51.133984 4902 generic.go:334] "Generic (PLEG): container finished" podID="6cfbac91-e798-4e5e-9f3c-f454ea6f457e" containerID="836b83499a7687597763e91231b51d3d804da0b87b415eafc611189612815f6e" exitCode=0 Oct 09 14:47:51 crc kubenswrapper[4902]: I1009 14:47:51.134114 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" event={"ID":"6cfbac91-e798-4e5e-9f3c-f454ea6f457e","Type":"ContainerDied","Data":"836b83499a7687597763e91231b51d3d804da0b87b415eafc611189612815f6e"} Oct 09 14:47:51 crc kubenswrapper[4902]: I1009 14:47:51.135659 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" event={"ID":"6cfbac91-e798-4e5e-9f3c-f454ea6f457e","Type":"ContainerStarted","Data":"3c92344b7d9241fd1bd747690a712747aa6c548ba1f4b8118607a636810a1726"} Oct 09 14:47:51 crc kubenswrapper[4902]: I1009 14:47:51.135692 4902 scope.go:117] "RemoveContainer" containerID="abd517e906ef4a75c8e7630d185960f1c3b3110165f7d059940e5efa29855570" Oct 09 14:47:51 crc kubenswrapper[4902]: I1009 14:47:51.505989 4902 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Oct 09 14:47:51 crc kubenswrapper[4902]: I1009 14:47:51.608832 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-298cw\" (UniqueName: \"kubernetes.io/projected/ee0ede17-c9e7-40c7-b2da-ac04b4df9010-kube-api-access-298cw\") pod \"ee0ede17-c9e7-40c7-b2da-ac04b4df9010\" (UID: \"ee0ede17-c9e7-40c7-b2da-ac04b4df9010\") " Oct 09 14:47:51 crc kubenswrapper[4902]: I1009 14:47:51.608877 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/ee0ede17-c9e7-40c7-b2da-ac04b4df9010-ca-certs\") pod \"ee0ede17-c9e7-40c7-b2da-ac04b4df9010\" (UID: \"ee0ede17-c9e7-40c7-b2da-ac04b4df9010\") " Oct 09 14:47:51 crc kubenswrapper[4902]: I1009 14:47:51.608937 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/ee0ede17-c9e7-40c7-b2da-ac04b4df9010-test-operator-ephemeral-temporary\") pod \"ee0ede17-c9e7-40c7-b2da-ac04b4df9010\" (UID: \"ee0ede17-c9e7-40c7-b2da-ac04b4df9010\") " Oct 09 14:47:51 crc kubenswrapper[4902]: I1009 14:47:51.609024 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ee0ede17-c9e7-40c7-b2da-ac04b4df9010-ssh-key\") pod \"ee0ede17-c9e7-40c7-b2da-ac04b4df9010\" (UID: \"ee0ede17-c9e7-40c7-b2da-ac04b4df9010\") " Oct 09 14:47:51 crc kubenswrapper[4902]: I1009 14:47:51.609041 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/ee0ede17-c9e7-40c7-b2da-ac04b4df9010-test-operator-ephemeral-workdir\") pod \"ee0ede17-c9e7-40c7-b2da-ac04b4df9010\" (UID: \"ee0ede17-c9e7-40c7-b2da-ac04b4df9010\") " Oct 09 14:47:51 crc kubenswrapper[4902]: I1009 14:47:51.609062 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ee0ede17-c9e7-40c7-b2da-ac04b4df9010-config-data\") pod \"ee0ede17-c9e7-40c7-b2da-ac04b4df9010\" (UID: \"ee0ede17-c9e7-40c7-b2da-ac04b4df9010\") " Oct 09 14:47:51 crc kubenswrapper[4902]: I1009 14:47:51.609101 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/ee0ede17-c9e7-40c7-b2da-ac04b4df9010-openstack-config\") pod \"ee0ede17-c9e7-40c7-b2da-ac04b4df9010\" (UID: \"ee0ede17-c9e7-40c7-b2da-ac04b4df9010\") " Oct 09 14:47:51 crc kubenswrapper[4902]: I1009 14:47:51.609124 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/ee0ede17-c9e7-40c7-b2da-ac04b4df9010-openstack-config-secret\") pod \"ee0ede17-c9e7-40c7-b2da-ac04b4df9010\" (UID: \"ee0ede17-c9e7-40c7-b2da-ac04b4df9010\") " Oct 09 14:47:51 crc kubenswrapper[4902]: I1009 14:47:51.609200 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ee0ede17-c9e7-40c7-b2da-ac04b4df9010\" (UID: \"ee0ede17-c9e7-40c7-b2da-ac04b4df9010\") " Oct 09 14:47:51 crc kubenswrapper[4902]: I1009 14:47:51.609670 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ee0ede17-c9e7-40c7-b2da-ac04b4df9010-test-operator-ephemeral-temporary" (OuterVolumeSpecName: 
"test-operator-ephemeral-temporary") pod "ee0ede17-c9e7-40c7-b2da-ac04b4df9010" (UID: "ee0ede17-c9e7-40c7-b2da-ac04b4df9010"). InnerVolumeSpecName "test-operator-ephemeral-temporary". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 14:47:51 crc kubenswrapper[4902]: I1009 14:47:51.609945 4902 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/ee0ede17-c9e7-40c7-b2da-ac04b4df9010-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Oct 09 14:47:51 crc kubenswrapper[4902]: I1009 14:47:51.609960 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ee0ede17-c9e7-40c7-b2da-ac04b4df9010-config-data" (OuterVolumeSpecName: "config-data") pod "ee0ede17-c9e7-40c7-b2da-ac04b4df9010" (UID: "ee0ede17-c9e7-40c7-b2da-ac04b4df9010"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 14:47:51 crc kubenswrapper[4902]: I1009 14:47:51.615582 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "test-operator-logs") pod "ee0ede17-c9e7-40c7-b2da-ac04b4df9010" (UID: "ee0ede17-c9e7-40c7-b2da-ac04b4df9010"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 09 14:47:51 crc kubenswrapper[4902]: I1009 14:47:51.616668 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee0ede17-c9e7-40c7-b2da-ac04b4df9010-kube-api-access-298cw" (OuterVolumeSpecName: "kube-api-access-298cw") pod "ee0ede17-c9e7-40c7-b2da-ac04b4df9010" (UID: "ee0ede17-c9e7-40c7-b2da-ac04b4df9010"). InnerVolumeSpecName "kube-api-access-298cw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 14:47:51 crc kubenswrapper[4902]: I1009 14:47:51.617557 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ee0ede17-c9e7-40c7-b2da-ac04b4df9010-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "ee0ede17-c9e7-40c7-b2da-ac04b4df9010" (UID: "ee0ede17-c9e7-40c7-b2da-ac04b4df9010"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 14:47:51 crc kubenswrapper[4902]: I1009 14:47:51.639898 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee0ede17-c9e7-40c7-b2da-ac04b4df9010-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "ee0ede17-c9e7-40c7-b2da-ac04b4df9010" (UID: "ee0ede17-c9e7-40c7-b2da-ac04b4df9010"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:47:51 crc kubenswrapper[4902]: I1009 14:47:51.640400 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee0ede17-c9e7-40c7-b2da-ac04b4df9010-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "ee0ede17-c9e7-40c7-b2da-ac04b4df9010" (UID: "ee0ede17-c9e7-40c7-b2da-ac04b4df9010"). InnerVolumeSpecName "openstack-config-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:47:51 crc kubenswrapper[4902]: I1009 14:47:51.640781 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee0ede17-c9e7-40c7-b2da-ac04b4df9010-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "ee0ede17-c9e7-40c7-b2da-ac04b4df9010" (UID: "ee0ede17-c9e7-40c7-b2da-ac04b4df9010"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 14:47:51 crc kubenswrapper[4902]: I1009 14:47:51.659314 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ee0ede17-c9e7-40c7-b2da-ac04b4df9010-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "ee0ede17-c9e7-40c7-b2da-ac04b4df9010" (UID: "ee0ede17-c9e7-40c7-b2da-ac04b4df9010"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 14:47:51 crc kubenswrapper[4902]: I1009 14:47:51.712652 4902 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Oct 09 14:47:51 crc kubenswrapper[4902]: I1009 14:47:51.712697 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-298cw\" (UniqueName: \"kubernetes.io/projected/ee0ede17-c9e7-40c7-b2da-ac04b4df9010-kube-api-access-298cw\") on node \"crc\" DevicePath \"\"" Oct 09 14:47:51 crc kubenswrapper[4902]: I1009 14:47:51.712711 4902 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/ee0ede17-c9e7-40c7-b2da-ac04b4df9010-ca-certs\") on node \"crc\" DevicePath \"\"" Oct 09 14:47:51 crc kubenswrapper[4902]: I1009 14:47:51.712722 4902 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ee0ede17-c9e7-40c7-b2da-ac04b4df9010-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 09 14:47:51 crc kubenswrapper[4902]: I1009 14:47:51.712734 4902 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/ee0ede17-c9e7-40c7-b2da-ac04b4df9010-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Oct 09 14:47:51 crc kubenswrapper[4902]: I1009 14:47:51.712749 4902 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ee0ede17-c9e7-40c7-b2da-ac04b4df9010-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 14:47:51 crc kubenswrapper[4902]: I1009 14:47:51.712761 4902 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/ee0ede17-c9e7-40c7-b2da-ac04b4df9010-openstack-config\") on node \"crc\" DevicePath \"\"" Oct 09 14:47:51 crc kubenswrapper[4902]: I1009 14:47:51.712773 4902 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/ee0ede17-c9e7-40c7-b2da-ac04b4df9010-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Oct 09 14:47:51 crc kubenswrapper[4902]: I1009 14:47:51.735998 4902 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Oct 09 14:47:51 crc kubenswrapper[4902]: I1009 14:47:51.814374 4902 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Oct 09 
14:47:52 crc kubenswrapper[4902]: I1009 14:47:52.157182 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"ee0ede17-c9e7-40c7-b2da-ac04b4df9010","Type":"ContainerDied","Data":"b06ddfc8c82bb0d78607fa316e5be6fb39f6179e746d000f1888c5d4da1ff49e"} Oct 09 14:47:52 crc kubenswrapper[4902]: I1009 14:47:52.157232 4902 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b06ddfc8c82bb0d78607fa316e5be6fb39f6179e746d000f1888c5d4da1ff49e" Oct 09 14:47:52 crc kubenswrapper[4902]: I1009 14:47:52.157330 4902 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Oct 09 14:48:00 crc kubenswrapper[4902]: I1009 14:48:00.373102 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Oct 09 14:48:00 crc kubenswrapper[4902]: E1009 14:48:00.374726 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78eb6f26-575c-4421-938c-60c9a65da811" containerName="extract-utilities" Oct 09 14:48:00 crc kubenswrapper[4902]: I1009 14:48:00.374750 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="78eb6f26-575c-4421-938c-60c9a65da811" containerName="extract-utilities" Oct 09 14:48:00 crc kubenswrapper[4902]: E1009 14:48:00.374794 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78eb6f26-575c-4421-938c-60c9a65da811" containerName="registry-server" Oct 09 14:48:00 crc kubenswrapper[4902]: I1009 14:48:00.374803 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="78eb6f26-575c-4421-938c-60c9a65da811" containerName="registry-server" Oct 09 14:48:00 crc kubenswrapper[4902]: E1009 14:48:00.374827 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78eb6f26-575c-4421-938c-60c9a65da811" containerName="extract-content" Oct 09 14:48:00 crc kubenswrapper[4902]: I1009 14:48:00.374835 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="78eb6f26-575c-4421-938c-60c9a65da811" containerName="extract-content" Oct 09 14:48:00 crc kubenswrapper[4902]: E1009 14:48:00.374857 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee0ede17-c9e7-40c7-b2da-ac04b4df9010" containerName="tempest-tests-tempest-tests-runner" Oct 09 14:48:00 crc kubenswrapper[4902]: I1009 14:48:00.374865 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee0ede17-c9e7-40c7-b2da-ac04b4df9010" containerName="tempest-tests-tempest-tests-runner" Oct 09 14:48:00 crc kubenswrapper[4902]: I1009 14:48:00.375159 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="78eb6f26-575c-4421-938c-60c9a65da811" containerName="registry-server" Oct 09 14:48:00 crc kubenswrapper[4902]: I1009 14:48:00.375188 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee0ede17-c9e7-40c7-b2da-ac04b4df9010" containerName="tempest-tests-tempest-tests-runner" Oct 09 14:48:00 crc kubenswrapper[4902]: I1009 14:48:00.376332 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 09 14:48:00 crc kubenswrapper[4902]: I1009 14:48:00.378757 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-xqzs5" Oct 09 14:48:00 crc kubenswrapper[4902]: I1009 14:48:00.386692 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Oct 09 14:48:00 crc kubenswrapper[4902]: I1009 14:48:00.490727 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dqbfj\" (UniqueName: \"kubernetes.io/projected/b2bc8eb5-06bc-4813-86fd-e96c9f53fd94-kube-api-access-dqbfj\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"b2bc8eb5-06bc-4813-86fd-e96c9f53fd94\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 09 14:48:00 crc kubenswrapper[4902]: I1009 14:48:00.490960 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"b2bc8eb5-06bc-4813-86fd-e96c9f53fd94\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 09 14:48:00 crc kubenswrapper[4902]: I1009 14:48:00.593155 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"b2bc8eb5-06bc-4813-86fd-e96c9f53fd94\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 09 14:48:00 crc kubenswrapper[4902]: I1009 14:48:00.593478 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dqbfj\" (UniqueName: \"kubernetes.io/projected/b2bc8eb5-06bc-4813-86fd-e96c9f53fd94-kube-api-access-dqbfj\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"b2bc8eb5-06bc-4813-86fd-e96c9f53fd94\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 09 14:48:00 crc kubenswrapper[4902]: I1009 14:48:00.595222 4902 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"b2bc8eb5-06bc-4813-86fd-e96c9f53fd94\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 09 14:48:00 crc kubenswrapper[4902]: I1009 14:48:00.639483 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dqbfj\" (UniqueName: \"kubernetes.io/projected/b2bc8eb5-06bc-4813-86fd-e96c9f53fd94-kube-api-access-dqbfj\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"b2bc8eb5-06bc-4813-86fd-e96c9f53fd94\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 09 14:48:00 crc kubenswrapper[4902]: I1009 14:48:00.647564 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"b2bc8eb5-06bc-4813-86fd-e96c9f53fd94\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 09 14:48:00 crc 
kubenswrapper[4902]: I1009 14:48:00.715687 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 09 14:48:01 crc kubenswrapper[4902]: I1009 14:48:01.187312 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Oct 09 14:48:01 crc kubenswrapper[4902]: W1009 14:48:01.192059 4902 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb2bc8eb5_06bc_4813_86fd_e96c9f53fd94.slice/crio-105a4c931ab07bbec24ad43b033b26b428c5f1f0a2209ad12514bfdf504879aa WatchSource:0}: Error finding container 105a4c931ab07bbec24ad43b033b26b428c5f1f0a2209ad12514bfdf504879aa: Status 404 returned error can't find the container with id 105a4c931ab07bbec24ad43b033b26b428c5f1f0a2209ad12514bfdf504879aa Oct 09 14:48:01 crc kubenswrapper[4902]: I1009 14:48:01.246767 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"b2bc8eb5-06bc-4813-86fd-e96c9f53fd94","Type":"ContainerStarted","Data":"105a4c931ab07bbec24ad43b033b26b428c5f1f0a2209ad12514bfdf504879aa"} Oct 09 14:48:03 crc kubenswrapper[4902]: I1009 14:48:03.265373 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"b2bc8eb5-06bc-4813-86fd-e96c9f53fd94","Type":"ContainerStarted","Data":"08f8af06cfff4b67da94866169a8209354b1d4156cd1b1828639ecbf019a3061"} Oct 09 14:48:03 crc kubenswrapper[4902]: I1009 14:48:03.287014 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=2.142918602 podStartE2EDuration="3.286993828s" podCreationTimestamp="2025-10-09 14:48:00 +0000 UTC" firstStartedPulling="2025-10-09 14:48:01.19407704 +0000 UTC m=+3428.391936104" lastFinishedPulling="2025-10-09 14:48:02.338152266 +0000 UTC m=+3429.536011330" observedRunningTime="2025-10-09 14:48:03.282602732 +0000 UTC m=+3430.480461816" watchObservedRunningTime="2025-10-09 14:48:03.286993828 +0000 UTC m=+3430.484852912" Oct 09 14:48:21 crc kubenswrapper[4902]: I1009 14:48:21.715099 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-c6hql/must-gather-8r5mb"] Oct 09 14:48:21 crc kubenswrapper[4902]: I1009 14:48:21.721038 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-c6hql/must-gather-8r5mb" Oct 09 14:48:21 crc kubenswrapper[4902]: I1009 14:48:21.725628 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-c6hql"/"kube-root-ca.crt" Oct 09 14:48:21 crc kubenswrapper[4902]: I1009 14:48:21.726702 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-c6hql"/"default-dockercfg-qkzr7" Oct 09 14:48:21 crc kubenswrapper[4902]: I1009 14:48:21.726801 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-c6hql"/"openshift-service-ca.crt" Oct 09 14:48:21 crc kubenswrapper[4902]: I1009 14:48:21.730555 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-c6hql/must-gather-8r5mb"] Oct 09 14:48:21 crc kubenswrapper[4902]: I1009 14:48:21.815652 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rgx2l\" (UniqueName: \"kubernetes.io/projected/d3672176-5fc4-4ea0-a396-1085fc7bba24-kube-api-access-rgx2l\") pod \"must-gather-8r5mb\" (UID: \"d3672176-5fc4-4ea0-a396-1085fc7bba24\") " pod="openshift-must-gather-c6hql/must-gather-8r5mb" Oct 09 14:48:21 crc kubenswrapper[4902]: I1009 14:48:21.815738 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/d3672176-5fc4-4ea0-a396-1085fc7bba24-must-gather-output\") pod \"must-gather-8r5mb\" (UID: \"d3672176-5fc4-4ea0-a396-1085fc7bba24\") " pod="openshift-must-gather-c6hql/must-gather-8r5mb" Oct 09 14:48:21 crc kubenswrapper[4902]: I1009 14:48:21.917657 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rgx2l\" (UniqueName: \"kubernetes.io/projected/d3672176-5fc4-4ea0-a396-1085fc7bba24-kube-api-access-rgx2l\") pod \"must-gather-8r5mb\" (UID: \"d3672176-5fc4-4ea0-a396-1085fc7bba24\") " pod="openshift-must-gather-c6hql/must-gather-8r5mb" Oct 09 14:48:21 crc kubenswrapper[4902]: I1009 14:48:21.917742 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/d3672176-5fc4-4ea0-a396-1085fc7bba24-must-gather-output\") pod \"must-gather-8r5mb\" (UID: \"d3672176-5fc4-4ea0-a396-1085fc7bba24\") " pod="openshift-must-gather-c6hql/must-gather-8r5mb" Oct 09 14:48:21 crc kubenswrapper[4902]: I1009 14:48:21.918351 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/d3672176-5fc4-4ea0-a396-1085fc7bba24-must-gather-output\") pod \"must-gather-8r5mb\" (UID: \"d3672176-5fc4-4ea0-a396-1085fc7bba24\") " pod="openshift-must-gather-c6hql/must-gather-8r5mb" Oct 09 14:48:21 crc kubenswrapper[4902]: I1009 14:48:21.937385 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rgx2l\" (UniqueName: \"kubernetes.io/projected/d3672176-5fc4-4ea0-a396-1085fc7bba24-kube-api-access-rgx2l\") pod \"must-gather-8r5mb\" (UID: \"d3672176-5fc4-4ea0-a396-1085fc7bba24\") " pod="openshift-must-gather-c6hql/must-gather-8r5mb" Oct 09 14:48:22 crc kubenswrapper[4902]: I1009 14:48:22.047878 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-c6hql/must-gather-8r5mb" Oct 09 14:48:22 crc kubenswrapper[4902]: I1009 14:48:22.496974 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-c6hql/must-gather-8r5mb"] Oct 09 14:48:22 crc kubenswrapper[4902]: I1009 14:48:22.512866 4902 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 09 14:48:23 crc kubenswrapper[4902]: I1009 14:48:23.477154 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-c6hql/must-gather-8r5mb" event={"ID":"d3672176-5fc4-4ea0-a396-1085fc7bba24","Type":"ContainerStarted","Data":"ee8ae97bab85d59f79c7104b244aea931f093a784c854ec2e0d59a6699865565"} Oct 09 14:48:27 crc kubenswrapper[4902]: I1009 14:48:27.581164 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-c6hql/must-gather-8r5mb" event={"ID":"d3672176-5fc4-4ea0-a396-1085fc7bba24","Type":"ContainerStarted","Data":"11f21813cee8040c66935837033b381ec9086a574d51b8aef2dcee49900f945a"} Oct 09 14:48:27 crc kubenswrapper[4902]: I1009 14:48:27.581730 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-c6hql/must-gather-8r5mb" event={"ID":"d3672176-5fc4-4ea0-a396-1085fc7bba24","Type":"ContainerStarted","Data":"9a2e452e97ea3a3ada247e6646556cc8b52f24c04d9becac91644c41d5d5732e"} Oct 09 14:48:30 crc kubenswrapper[4902]: I1009 14:48:30.630849 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-c6hql/must-gather-8r5mb" podStartSLOduration=5.438354313 podStartE2EDuration="9.630825961s" podCreationTimestamp="2025-10-09 14:48:21 +0000 UTC" firstStartedPulling="2025-10-09 14:48:22.512676826 +0000 UTC m=+3449.710535890" lastFinishedPulling="2025-10-09 14:48:26.705148474 +0000 UTC m=+3453.903007538" observedRunningTime="2025-10-09 14:48:27.601266664 +0000 UTC m=+3454.799125738" watchObservedRunningTime="2025-10-09 14:48:30.630825961 +0000 UTC m=+3457.828685015" Oct 09 14:48:30 crc kubenswrapper[4902]: I1009 14:48:30.640857 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-c6hql/crc-debug-5lgt8"] Oct 09 14:48:30 crc kubenswrapper[4902]: I1009 14:48:30.642144 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-c6hql/crc-debug-5lgt8" Oct 09 14:48:30 crc kubenswrapper[4902]: I1009 14:48:30.704868 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zf9n8\" (UniqueName: \"kubernetes.io/projected/b61c5fe6-5a9b-4a2c-a16a-110894f4480c-kube-api-access-zf9n8\") pod \"crc-debug-5lgt8\" (UID: \"b61c5fe6-5a9b-4a2c-a16a-110894f4480c\") " pod="openshift-must-gather-c6hql/crc-debug-5lgt8" Oct 09 14:48:30 crc kubenswrapper[4902]: I1009 14:48:30.707495 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b61c5fe6-5a9b-4a2c-a16a-110894f4480c-host\") pod \"crc-debug-5lgt8\" (UID: \"b61c5fe6-5a9b-4a2c-a16a-110894f4480c\") " pod="openshift-must-gather-c6hql/crc-debug-5lgt8" Oct 09 14:48:30 crc kubenswrapper[4902]: I1009 14:48:30.810050 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b61c5fe6-5a9b-4a2c-a16a-110894f4480c-host\") pod \"crc-debug-5lgt8\" (UID: \"b61c5fe6-5a9b-4a2c-a16a-110894f4480c\") " pod="openshift-must-gather-c6hql/crc-debug-5lgt8" Oct 09 14:48:30 crc kubenswrapper[4902]: I1009 14:48:30.810166 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zf9n8\" (UniqueName: \"kubernetes.io/projected/b61c5fe6-5a9b-4a2c-a16a-110894f4480c-kube-api-access-zf9n8\") pod \"crc-debug-5lgt8\" (UID: \"b61c5fe6-5a9b-4a2c-a16a-110894f4480c\") " pod="openshift-must-gather-c6hql/crc-debug-5lgt8" Oct 09 14:48:30 crc kubenswrapper[4902]: I1009 14:48:30.810241 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b61c5fe6-5a9b-4a2c-a16a-110894f4480c-host\") pod \"crc-debug-5lgt8\" (UID: \"b61c5fe6-5a9b-4a2c-a16a-110894f4480c\") " pod="openshift-must-gather-c6hql/crc-debug-5lgt8" Oct 09 14:48:30 crc kubenswrapper[4902]: I1009 14:48:30.837165 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zf9n8\" (UniqueName: \"kubernetes.io/projected/b61c5fe6-5a9b-4a2c-a16a-110894f4480c-kube-api-access-zf9n8\") pod \"crc-debug-5lgt8\" (UID: \"b61c5fe6-5a9b-4a2c-a16a-110894f4480c\") " pod="openshift-must-gather-c6hql/crc-debug-5lgt8" Oct 09 14:48:30 crc kubenswrapper[4902]: I1009 14:48:30.960620 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-c6hql/crc-debug-5lgt8" Oct 09 14:48:30 crc kubenswrapper[4902]: W1009 14:48:30.999667 4902 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb61c5fe6_5a9b_4a2c_a16a_110894f4480c.slice/crio-bfb68a4458b54e4108bf8669c2ead9ca3a3131f0ade432f4a099680dd9db8fdb WatchSource:0}: Error finding container bfb68a4458b54e4108bf8669c2ead9ca3a3131f0ade432f4a099680dd9db8fdb: Status 404 returned error can't find the container with id bfb68a4458b54e4108bf8669c2ead9ca3a3131f0ade432f4a099680dd9db8fdb Oct 09 14:48:31 crc kubenswrapper[4902]: I1009 14:48:31.616302 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-c6hql/crc-debug-5lgt8" event={"ID":"b61c5fe6-5a9b-4a2c-a16a-110894f4480c","Type":"ContainerStarted","Data":"bfb68a4458b54e4108bf8669c2ead9ca3a3131f0ade432f4a099680dd9db8fdb"} Oct 09 14:48:43 crc kubenswrapper[4902]: I1009 14:48:43.772741 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-c6hql/crc-debug-5lgt8" event={"ID":"b61c5fe6-5a9b-4a2c-a16a-110894f4480c","Type":"ContainerStarted","Data":"ef619fa93af693ecf506e8c47d6188b080647d6ce4f0e93b121b8c375c96c4dd"} Oct 09 14:48:43 crc kubenswrapper[4902]: I1009 14:48:43.785360 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-c6hql/crc-debug-5lgt8" podStartSLOduration=2.128532161 podStartE2EDuration="13.785343796s" podCreationTimestamp="2025-10-09 14:48:30 +0000 UTC" firstStartedPulling="2025-10-09 14:48:31.002021583 +0000 UTC m=+3458.199880647" lastFinishedPulling="2025-10-09 14:48:42.658833218 +0000 UTC m=+3469.856692282" observedRunningTime="2025-10-09 14:48:43.785226033 +0000 UTC m=+3470.983085117" watchObservedRunningTime="2025-10-09 14:48:43.785343796 +0000 UTC m=+3470.983202860" Oct 09 14:49:14 crc kubenswrapper[4902]: I1009 14:49:14.781789 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-kqlxz"] Oct 09 14:49:14 crc kubenswrapper[4902]: I1009 14:49:14.784692 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kqlxz" Oct 09 14:49:14 crc kubenswrapper[4902]: I1009 14:49:14.814912 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-kqlxz"] Oct 09 14:49:14 crc kubenswrapper[4902]: I1009 14:49:14.839003 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tjzgc\" (UniqueName: \"kubernetes.io/projected/8775826a-ed71-4110-8622-5e00cd05356b-kube-api-access-tjzgc\") pod \"redhat-marketplace-kqlxz\" (UID: \"8775826a-ed71-4110-8622-5e00cd05356b\") " pod="openshift-marketplace/redhat-marketplace-kqlxz" Oct 09 14:49:14 crc kubenswrapper[4902]: I1009 14:49:14.839063 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8775826a-ed71-4110-8622-5e00cd05356b-utilities\") pod \"redhat-marketplace-kqlxz\" (UID: \"8775826a-ed71-4110-8622-5e00cd05356b\") " pod="openshift-marketplace/redhat-marketplace-kqlxz" Oct 09 14:49:14 crc kubenswrapper[4902]: I1009 14:49:14.839299 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8775826a-ed71-4110-8622-5e00cd05356b-catalog-content\") pod \"redhat-marketplace-kqlxz\" (UID: \"8775826a-ed71-4110-8622-5e00cd05356b\") " pod="openshift-marketplace/redhat-marketplace-kqlxz" Oct 09 14:49:14 crc kubenswrapper[4902]: I1009 14:49:14.941306 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8775826a-ed71-4110-8622-5e00cd05356b-catalog-content\") pod \"redhat-marketplace-kqlxz\" (UID: \"8775826a-ed71-4110-8622-5e00cd05356b\") " pod="openshift-marketplace/redhat-marketplace-kqlxz" Oct 09 14:49:14 crc kubenswrapper[4902]: I1009 14:49:14.941835 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tjzgc\" (UniqueName: \"kubernetes.io/projected/8775826a-ed71-4110-8622-5e00cd05356b-kube-api-access-tjzgc\") pod \"redhat-marketplace-kqlxz\" (UID: \"8775826a-ed71-4110-8622-5e00cd05356b\") " pod="openshift-marketplace/redhat-marketplace-kqlxz" Oct 09 14:49:14 crc kubenswrapper[4902]: I1009 14:49:14.941868 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8775826a-ed71-4110-8622-5e00cd05356b-utilities\") pod \"redhat-marketplace-kqlxz\" (UID: \"8775826a-ed71-4110-8622-5e00cd05356b\") " pod="openshift-marketplace/redhat-marketplace-kqlxz" Oct 09 14:49:14 crc kubenswrapper[4902]: I1009 14:49:14.941913 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8775826a-ed71-4110-8622-5e00cd05356b-catalog-content\") pod \"redhat-marketplace-kqlxz\" (UID: \"8775826a-ed71-4110-8622-5e00cd05356b\") " pod="openshift-marketplace/redhat-marketplace-kqlxz" Oct 09 14:49:14 crc kubenswrapper[4902]: I1009 14:49:14.942204 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8775826a-ed71-4110-8622-5e00cd05356b-utilities\") pod \"redhat-marketplace-kqlxz\" (UID: \"8775826a-ed71-4110-8622-5e00cd05356b\") " pod="openshift-marketplace/redhat-marketplace-kqlxz" Oct 09 14:49:14 crc kubenswrapper[4902]: I1009 14:49:14.987987 4902 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-tjzgc\" (UniqueName: \"kubernetes.io/projected/8775826a-ed71-4110-8622-5e00cd05356b-kube-api-access-tjzgc\") pod \"redhat-marketplace-kqlxz\" (UID: \"8775826a-ed71-4110-8622-5e00cd05356b\") " pod="openshift-marketplace/redhat-marketplace-kqlxz" Oct 09 14:49:15 crc kubenswrapper[4902]: I1009 14:49:15.106988 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kqlxz" Oct 09 14:49:15 crc kubenswrapper[4902]: I1009 14:49:15.713851 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-kqlxz"] Oct 09 14:49:16 crc kubenswrapper[4902]: I1009 14:49:16.073891 4902 generic.go:334] "Generic (PLEG): container finished" podID="8775826a-ed71-4110-8622-5e00cd05356b" containerID="687800323cf38819754175eb4985361a667b8cb45a4b3b86caf9623615226dc3" exitCode=0 Oct 09 14:49:16 crc kubenswrapper[4902]: I1009 14:49:16.074025 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kqlxz" event={"ID":"8775826a-ed71-4110-8622-5e00cd05356b","Type":"ContainerDied","Data":"687800323cf38819754175eb4985361a667b8cb45a4b3b86caf9623615226dc3"} Oct 09 14:49:16 crc kubenswrapper[4902]: I1009 14:49:16.074932 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kqlxz" event={"ID":"8775826a-ed71-4110-8622-5e00cd05356b","Type":"ContainerStarted","Data":"afa7c9b7dfc9418e937363bfb024f3fcf1ffc501cb26043472fb77043a782308"} Oct 09 14:49:18 crc kubenswrapper[4902]: I1009 14:49:18.102723 4902 generic.go:334] "Generic (PLEG): container finished" podID="8775826a-ed71-4110-8622-5e00cd05356b" containerID="9cd3af49ad1375034b35aa340147c063d6662dabbc2aa68e0e5b27d2ed432709" exitCode=0 Oct 09 14:49:18 crc kubenswrapper[4902]: I1009 14:49:18.102783 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kqlxz" event={"ID":"8775826a-ed71-4110-8622-5e00cd05356b","Type":"ContainerDied","Data":"9cd3af49ad1375034b35aa340147c063d6662dabbc2aa68e0e5b27d2ed432709"} Oct 09 14:49:19 crc kubenswrapper[4902]: I1009 14:49:19.114818 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kqlxz" event={"ID":"8775826a-ed71-4110-8622-5e00cd05356b","Type":"ContainerStarted","Data":"99eb15075f7afec8ffb3d8558bbfccf6c6fdc66c3eefe449f8e3055c183fff8e"} Oct 09 14:49:19 crc kubenswrapper[4902]: I1009 14:49:19.141238 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-kqlxz" podStartSLOduration=2.7285107159999997 podStartE2EDuration="5.141212913s" podCreationTimestamp="2025-10-09 14:49:14 +0000 UTC" firstStartedPulling="2025-10-09 14:49:16.075939095 +0000 UTC m=+3503.273798159" lastFinishedPulling="2025-10-09 14:49:18.488641292 +0000 UTC m=+3505.686500356" observedRunningTime="2025-10-09 14:49:19.134615103 +0000 UTC m=+3506.332474177" watchObservedRunningTime="2025-10-09 14:49:19.141212913 +0000 UTC m=+3506.339071977" Oct 09 14:49:25 crc kubenswrapper[4902]: I1009 14:49:25.108488 4902 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-kqlxz" Oct 09 14:49:25 crc kubenswrapper[4902]: I1009 14:49:25.109185 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-kqlxz" Oct 09 14:49:25 crc kubenswrapper[4902]: I1009 14:49:25.163382 4902 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-kqlxz" Oct 09 14:49:25 crc kubenswrapper[4902]: I1009 14:49:25.181210 4902 generic.go:334] "Generic (PLEG): container finished" podID="b61c5fe6-5a9b-4a2c-a16a-110894f4480c" containerID="ef619fa93af693ecf506e8c47d6188b080647d6ce4f0e93b121b8c375c96c4dd" exitCode=0 Oct 09 14:49:25 crc kubenswrapper[4902]: I1009 14:49:25.181596 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-c6hql/crc-debug-5lgt8" event={"ID":"b61c5fe6-5a9b-4a2c-a16a-110894f4480c","Type":"ContainerDied","Data":"ef619fa93af693ecf506e8c47d6188b080647d6ce4f0e93b121b8c375c96c4dd"} Oct 09 14:49:25 crc kubenswrapper[4902]: I1009 14:49:25.240212 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-kqlxz" Oct 09 14:49:25 crc kubenswrapper[4902]: I1009 14:49:25.416289 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-kqlxz"] Oct 09 14:49:26 crc kubenswrapper[4902]: I1009 14:49:26.293241 4902 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-c6hql/crc-debug-5lgt8" Oct 09 14:49:26 crc kubenswrapper[4902]: I1009 14:49:26.346901 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-c6hql/crc-debug-5lgt8"] Oct 09 14:49:26 crc kubenswrapper[4902]: I1009 14:49:26.360352 4902 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-c6hql/crc-debug-5lgt8"] Oct 09 14:49:26 crc kubenswrapper[4902]: I1009 14:49:26.374701 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b61c5fe6-5a9b-4a2c-a16a-110894f4480c-host\") pod \"b61c5fe6-5a9b-4a2c-a16a-110894f4480c\" (UID: \"b61c5fe6-5a9b-4a2c-a16a-110894f4480c\") " Oct 09 14:49:26 crc kubenswrapper[4902]: I1009 14:49:26.374960 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b61c5fe6-5a9b-4a2c-a16a-110894f4480c-host" (OuterVolumeSpecName: "host") pod "b61c5fe6-5a9b-4a2c-a16a-110894f4480c" (UID: "b61c5fe6-5a9b-4a2c-a16a-110894f4480c"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 09 14:49:26 crc kubenswrapper[4902]: I1009 14:49:26.375752 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zf9n8\" (UniqueName: \"kubernetes.io/projected/b61c5fe6-5a9b-4a2c-a16a-110894f4480c-kube-api-access-zf9n8\") pod \"b61c5fe6-5a9b-4a2c-a16a-110894f4480c\" (UID: \"b61c5fe6-5a9b-4a2c-a16a-110894f4480c\") " Oct 09 14:49:26 crc kubenswrapper[4902]: I1009 14:49:26.376772 4902 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b61c5fe6-5a9b-4a2c-a16a-110894f4480c-host\") on node \"crc\" DevicePath \"\"" Oct 09 14:49:26 crc kubenswrapper[4902]: I1009 14:49:26.383793 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b61c5fe6-5a9b-4a2c-a16a-110894f4480c-kube-api-access-zf9n8" (OuterVolumeSpecName: "kube-api-access-zf9n8") pod "b61c5fe6-5a9b-4a2c-a16a-110894f4480c" (UID: "b61c5fe6-5a9b-4a2c-a16a-110894f4480c"). InnerVolumeSpecName "kube-api-access-zf9n8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 14:49:26 crc kubenswrapper[4902]: I1009 14:49:26.479870 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zf9n8\" (UniqueName: \"kubernetes.io/projected/b61c5fe6-5a9b-4a2c-a16a-110894f4480c-kube-api-access-zf9n8\") on node \"crc\" DevicePath \"\"" Oct 09 14:49:27 crc kubenswrapper[4902]: I1009 14:49:27.199640 4902 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bfb68a4458b54e4108bf8669c2ead9ca3a3131f0ade432f4a099680dd9db8fdb" Oct 09 14:49:27 crc kubenswrapper[4902]: I1009 14:49:27.199671 4902 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-c6hql/crc-debug-5lgt8" Oct 09 14:49:27 crc kubenswrapper[4902]: I1009 14:49:27.199793 4902 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-kqlxz" podUID="8775826a-ed71-4110-8622-5e00cd05356b" containerName="registry-server" containerID="cri-o://99eb15075f7afec8ffb3d8558bbfccf6c6fdc66c3eefe449f8e3055c183fff8e" gracePeriod=2 Oct 09 14:49:27 crc kubenswrapper[4902]: I1009 14:49:27.537779 4902 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b61c5fe6-5a9b-4a2c-a16a-110894f4480c" path="/var/lib/kubelet/pods/b61c5fe6-5a9b-4a2c-a16a-110894f4480c/volumes" Oct 09 14:49:27 crc kubenswrapper[4902]: I1009 14:49:27.600342 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-c6hql/crc-debug-mt5lv"] Oct 09 14:49:27 crc kubenswrapper[4902]: E1009 14:49:27.601204 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b61c5fe6-5a9b-4a2c-a16a-110894f4480c" containerName="container-00" Oct 09 14:49:27 crc kubenswrapper[4902]: I1009 14:49:27.601227 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="b61c5fe6-5a9b-4a2c-a16a-110894f4480c" containerName="container-00" Oct 09 14:49:27 crc kubenswrapper[4902]: I1009 14:49:27.601477 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="b61c5fe6-5a9b-4a2c-a16a-110894f4480c" containerName="container-00" Oct 09 14:49:27 crc kubenswrapper[4902]: I1009 14:49:27.602189 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-c6hql/crc-debug-mt5lv" Oct 09 14:49:27 crc kubenswrapper[4902]: I1009 14:49:27.702660 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7ce606d7-a383-4cc5-85b8-e723fffda771-host\") pod \"crc-debug-mt5lv\" (UID: \"7ce606d7-a383-4cc5-85b8-e723fffda771\") " pod="openshift-must-gather-c6hql/crc-debug-mt5lv" Oct 09 14:49:27 crc kubenswrapper[4902]: I1009 14:49:27.702761 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6qb75\" (UniqueName: \"kubernetes.io/projected/7ce606d7-a383-4cc5-85b8-e723fffda771-kube-api-access-6qb75\") pod \"crc-debug-mt5lv\" (UID: \"7ce606d7-a383-4cc5-85b8-e723fffda771\") " pod="openshift-must-gather-c6hql/crc-debug-mt5lv" Oct 09 14:49:27 crc kubenswrapper[4902]: I1009 14:49:27.804867 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7ce606d7-a383-4cc5-85b8-e723fffda771-host\") pod \"crc-debug-mt5lv\" (UID: \"7ce606d7-a383-4cc5-85b8-e723fffda771\") " pod="openshift-must-gather-c6hql/crc-debug-mt5lv" Oct 09 14:49:27 crc kubenswrapper[4902]: I1009 14:49:27.804940 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6qb75\" (UniqueName: \"kubernetes.io/projected/7ce606d7-a383-4cc5-85b8-e723fffda771-kube-api-access-6qb75\") pod \"crc-debug-mt5lv\" (UID: \"7ce606d7-a383-4cc5-85b8-e723fffda771\") " pod="openshift-must-gather-c6hql/crc-debug-mt5lv" Oct 09 14:49:27 crc kubenswrapper[4902]: I1009 14:49:27.805011 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7ce606d7-a383-4cc5-85b8-e723fffda771-host\") pod \"crc-debug-mt5lv\" (UID: \"7ce606d7-a383-4cc5-85b8-e723fffda771\") " pod="openshift-must-gather-c6hql/crc-debug-mt5lv" Oct 09 14:49:27 crc kubenswrapper[4902]: I1009 14:49:27.811095 4902 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kqlxz" Oct 09 14:49:27 crc kubenswrapper[4902]: I1009 14:49:27.830171 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6qb75\" (UniqueName: \"kubernetes.io/projected/7ce606d7-a383-4cc5-85b8-e723fffda771-kube-api-access-6qb75\") pod \"crc-debug-mt5lv\" (UID: \"7ce606d7-a383-4cc5-85b8-e723fffda771\") " pod="openshift-must-gather-c6hql/crc-debug-mt5lv" Oct 09 14:49:27 crc kubenswrapper[4902]: I1009 14:49:27.906562 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tjzgc\" (UniqueName: \"kubernetes.io/projected/8775826a-ed71-4110-8622-5e00cd05356b-kube-api-access-tjzgc\") pod \"8775826a-ed71-4110-8622-5e00cd05356b\" (UID: \"8775826a-ed71-4110-8622-5e00cd05356b\") " Oct 09 14:49:27 crc kubenswrapper[4902]: I1009 14:49:27.906721 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8775826a-ed71-4110-8622-5e00cd05356b-catalog-content\") pod \"8775826a-ed71-4110-8622-5e00cd05356b\" (UID: \"8775826a-ed71-4110-8622-5e00cd05356b\") " Oct 09 14:49:27 crc kubenswrapper[4902]: I1009 14:49:27.906850 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8775826a-ed71-4110-8622-5e00cd05356b-utilities\") pod \"8775826a-ed71-4110-8622-5e00cd05356b\" (UID: \"8775826a-ed71-4110-8622-5e00cd05356b\") " Oct 09 14:49:27 crc kubenswrapper[4902]: I1009 14:49:27.908080 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8775826a-ed71-4110-8622-5e00cd05356b-utilities" (OuterVolumeSpecName: "utilities") pod "8775826a-ed71-4110-8622-5e00cd05356b" (UID: "8775826a-ed71-4110-8622-5e00cd05356b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 14:49:27 crc kubenswrapper[4902]: I1009 14:49:27.911666 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8775826a-ed71-4110-8622-5e00cd05356b-kube-api-access-tjzgc" (OuterVolumeSpecName: "kube-api-access-tjzgc") pod "8775826a-ed71-4110-8622-5e00cd05356b" (UID: "8775826a-ed71-4110-8622-5e00cd05356b"). InnerVolumeSpecName "kube-api-access-tjzgc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 14:49:27 crc kubenswrapper[4902]: I1009 14:49:27.927397 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8775826a-ed71-4110-8622-5e00cd05356b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8775826a-ed71-4110-8622-5e00cd05356b" (UID: "8775826a-ed71-4110-8622-5e00cd05356b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 14:49:27 crc kubenswrapper[4902]: I1009 14:49:27.940953 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-c6hql/crc-debug-mt5lv" Oct 09 14:49:28 crc kubenswrapper[4902]: I1009 14:49:28.009183 4902 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8775826a-ed71-4110-8622-5e00cd05356b-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 09 14:49:28 crc kubenswrapper[4902]: I1009 14:49:28.009229 4902 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8775826a-ed71-4110-8622-5e00cd05356b-utilities\") on node \"crc\" DevicePath \"\"" Oct 09 14:49:28 crc kubenswrapper[4902]: I1009 14:49:28.009240 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tjzgc\" (UniqueName: \"kubernetes.io/projected/8775826a-ed71-4110-8622-5e00cd05356b-kube-api-access-tjzgc\") on node \"crc\" DevicePath \"\"" Oct 09 14:49:28 crc kubenswrapper[4902]: I1009 14:49:28.215324 4902 generic.go:334] "Generic (PLEG): container finished" podID="8775826a-ed71-4110-8622-5e00cd05356b" containerID="99eb15075f7afec8ffb3d8558bbfccf6c6fdc66c3eefe449f8e3055c183fff8e" exitCode=0 Oct 09 14:49:28 crc kubenswrapper[4902]: I1009 14:49:28.215386 4902 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kqlxz" Oct 09 14:49:28 crc kubenswrapper[4902]: I1009 14:49:28.215382 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kqlxz" event={"ID":"8775826a-ed71-4110-8622-5e00cd05356b","Type":"ContainerDied","Data":"99eb15075f7afec8ffb3d8558bbfccf6c6fdc66c3eefe449f8e3055c183fff8e"} Oct 09 14:49:28 crc kubenswrapper[4902]: I1009 14:49:28.216043 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kqlxz" event={"ID":"8775826a-ed71-4110-8622-5e00cd05356b","Type":"ContainerDied","Data":"afa7c9b7dfc9418e937363bfb024f3fcf1ffc501cb26043472fb77043a782308"} Oct 09 14:49:28 crc kubenswrapper[4902]: I1009 14:49:28.216110 4902 scope.go:117] "RemoveContainer" containerID="99eb15075f7afec8ffb3d8558bbfccf6c6fdc66c3eefe449f8e3055c183fff8e" Oct 09 14:49:28 crc kubenswrapper[4902]: I1009 14:49:28.218806 4902 generic.go:334] "Generic (PLEG): container finished" podID="7ce606d7-a383-4cc5-85b8-e723fffda771" containerID="c505ada6c658d045c0a4104b002ab1f1ced2dea3e0c43011d7ff2541aeeb3259" exitCode=0 Oct 09 14:49:28 crc kubenswrapper[4902]: I1009 14:49:28.218840 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-c6hql/crc-debug-mt5lv" event={"ID":"7ce606d7-a383-4cc5-85b8-e723fffda771","Type":"ContainerDied","Data":"c505ada6c658d045c0a4104b002ab1f1ced2dea3e0c43011d7ff2541aeeb3259"} Oct 09 14:49:28 crc kubenswrapper[4902]: I1009 14:49:28.218871 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-c6hql/crc-debug-mt5lv" event={"ID":"7ce606d7-a383-4cc5-85b8-e723fffda771","Type":"ContainerStarted","Data":"f48b34ce3a6fe996a57f8e8cf1f18b4c5edb1eeb0cfdc766a698252454266157"} Oct 09 14:49:28 crc kubenswrapper[4902]: I1009 14:49:28.263630 4902 scope.go:117] "RemoveContainer" containerID="9cd3af49ad1375034b35aa340147c063d6662dabbc2aa68e0e5b27d2ed432709" Oct 09 14:49:28 crc kubenswrapper[4902]: I1009 14:49:28.275593 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-kqlxz"] Oct 09 14:49:28 crc kubenswrapper[4902]: I1009 14:49:28.284122 4902 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-kqlxz"] Oct 09 14:49:28 crc kubenswrapper[4902]: I1009 14:49:28.297855 4902 scope.go:117] "RemoveContainer" containerID="687800323cf38819754175eb4985361a667b8cb45a4b3b86caf9623615226dc3" Oct 09 14:49:28 crc kubenswrapper[4902]: I1009 14:49:28.324095 4902 scope.go:117] "RemoveContainer" containerID="99eb15075f7afec8ffb3d8558bbfccf6c6fdc66c3eefe449f8e3055c183fff8e" Oct 09 14:49:28 crc kubenswrapper[4902]: E1009 14:49:28.324849 4902 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"99eb15075f7afec8ffb3d8558bbfccf6c6fdc66c3eefe449f8e3055c183fff8e\": container with ID starting with 99eb15075f7afec8ffb3d8558bbfccf6c6fdc66c3eefe449f8e3055c183fff8e not found: ID does not exist" containerID="99eb15075f7afec8ffb3d8558bbfccf6c6fdc66c3eefe449f8e3055c183fff8e" Oct 09 14:49:28 crc kubenswrapper[4902]: I1009 14:49:28.324894 4902 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"99eb15075f7afec8ffb3d8558bbfccf6c6fdc66c3eefe449f8e3055c183fff8e"} err="failed to get container status \"99eb15075f7afec8ffb3d8558bbfccf6c6fdc66c3eefe449f8e3055c183fff8e\": rpc error: code = NotFound desc = could not find container \"99eb15075f7afec8ffb3d8558bbfccf6c6fdc66c3eefe449f8e3055c183fff8e\": container with ID starting with 99eb15075f7afec8ffb3d8558bbfccf6c6fdc66c3eefe449f8e3055c183fff8e not found: ID does not exist" Oct 09 14:49:28 crc kubenswrapper[4902]: I1009 14:49:28.324925 4902 scope.go:117] "RemoveContainer" containerID="9cd3af49ad1375034b35aa340147c063d6662dabbc2aa68e0e5b27d2ed432709" Oct 09 14:49:28 crc kubenswrapper[4902]: E1009 14:49:28.325180 4902 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9cd3af49ad1375034b35aa340147c063d6662dabbc2aa68e0e5b27d2ed432709\": container with ID starting with 9cd3af49ad1375034b35aa340147c063d6662dabbc2aa68e0e5b27d2ed432709 not found: ID does not exist" containerID="9cd3af49ad1375034b35aa340147c063d6662dabbc2aa68e0e5b27d2ed432709" Oct 09 14:49:28 crc kubenswrapper[4902]: I1009 14:49:28.325230 4902 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9cd3af49ad1375034b35aa340147c063d6662dabbc2aa68e0e5b27d2ed432709"} err="failed to get container status \"9cd3af49ad1375034b35aa340147c063d6662dabbc2aa68e0e5b27d2ed432709\": rpc error: code = NotFound desc = could not find container \"9cd3af49ad1375034b35aa340147c063d6662dabbc2aa68e0e5b27d2ed432709\": container with ID starting with 9cd3af49ad1375034b35aa340147c063d6662dabbc2aa68e0e5b27d2ed432709 not found: ID does not exist" Oct 09 14:49:28 crc kubenswrapper[4902]: I1009 14:49:28.325250 4902 scope.go:117] "RemoveContainer" containerID="687800323cf38819754175eb4985361a667b8cb45a4b3b86caf9623615226dc3" Oct 09 14:49:28 crc kubenswrapper[4902]: E1009 14:49:28.325663 4902 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"687800323cf38819754175eb4985361a667b8cb45a4b3b86caf9623615226dc3\": container with ID starting with 687800323cf38819754175eb4985361a667b8cb45a4b3b86caf9623615226dc3 not found: ID does not exist" containerID="687800323cf38819754175eb4985361a667b8cb45a4b3b86caf9623615226dc3" Oct 09 14:49:28 crc kubenswrapper[4902]: I1009 14:49:28.325690 4902 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"687800323cf38819754175eb4985361a667b8cb45a4b3b86caf9623615226dc3"} err="failed to get container status \"687800323cf38819754175eb4985361a667b8cb45a4b3b86caf9623615226dc3\": rpc error: code = NotFound desc = could not find container \"687800323cf38819754175eb4985361a667b8cb45a4b3b86caf9623615226dc3\": container with ID starting with 687800323cf38819754175eb4985361a667b8cb45a4b3b86caf9623615226dc3 not found: ID does not exist" Oct 09 14:49:28 crc kubenswrapper[4902]: I1009 14:49:28.693681 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-c6hql/crc-debug-mt5lv"] Oct 09 14:49:28 crc kubenswrapper[4902]: I1009 14:49:28.702697 4902 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-c6hql/crc-debug-mt5lv"] Oct 09 14:49:29 crc kubenswrapper[4902]: I1009 14:49:29.347841 4902 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-c6hql/crc-debug-mt5lv" Oct 09 14:49:29 crc kubenswrapper[4902]: I1009 14:49:29.440717 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7ce606d7-a383-4cc5-85b8-e723fffda771-host\") pod \"7ce606d7-a383-4cc5-85b8-e723fffda771\" (UID: \"7ce606d7-a383-4cc5-85b8-e723fffda771\") " Oct 09 14:49:29 crc kubenswrapper[4902]: I1009 14:49:29.440899 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7ce606d7-a383-4cc5-85b8-e723fffda771-host" (OuterVolumeSpecName: "host") pod "7ce606d7-a383-4cc5-85b8-e723fffda771" (UID: "7ce606d7-a383-4cc5-85b8-e723fffda771"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 09 14:49:29 crc kubenswrapper[4902]: I1009 14:49:29.441067 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6qb75\" (UniqueName: \"kubernetes.io/projected/7ce606d7-a383-4cc5-85b8-e723fffda771-kube-api-access-6qb75\") pod \"7ce606d7-a383-4cc5-85b8-e723fffda771\" (UID: \"7ce606d7-a383-4cc5-85b8-e723fffda771\") " Oct 09 14:49:29 crc kubenswrapper[4902]: I1009 14:49:29.441655 4902 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7ce606d7-a383-4cc5-85b8-e723fffda771-host\") on node \"crc\" DevicePath \"\"" Oct 09 14:49:29 crc kubenswrapper[4902]: I1009 14:49:29.446586 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ce606d7-a383-4cc5-85b8-e723fffda771-kube-api-access-6qb75" (OuterVolumeSpecName: "kube-api-access-6qb75") pod "7ce606d7-a383-4cc5-85b8-e723fffda771" (UID: "7ce606d7-a383-4cc5-85b8-e723fffda771"). InnerVolumeSpecName "kube-api-access-6qb75". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 14:49:29 crc kubenswrapper[4902]: I1009 14:49:29.524898 4902 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ce606d7-a383-4cc5-85b8-e723fffda771" path="/var/lib/kubelet/pods/7ce606d7-a383-4cc5-85b8-e723fffda771/volumes" Oct 09 14:49:29 crc kubenswrapper[4902]: I1009 14:49:29.525582 4902 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8775826a-ed71-4110-8622-5e00cd05356b" path="/var/lib/kubelet/pods/8775826a-ed71-4110-8622-5e00cd05356b/volumes" Oct 09 14:49:29 crc kubenswrapper[4902]: I1009 14:49:29.543514 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6qb75\" (UniqueName: \"kubernetes.io/projected/7ce606d7-a383-4cc5-85b8-e723fffda771-kube-api-access-6qb75\") on node \"crc\" DevicePath \"\"" Oct 09 14:49:29 crc kubenswrapper[4902]: I1009 14:49:29.875767 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-c6hql/crc-debug-kghfg"] Oct 09 14:49:29 crc kubenswrapper[4902]: E1009 14:49:29.876149 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ce606d7-a383-4cc5-85b8-e723fffda771" containerName="container-00" Oct 09 14:49:29 crc kubenswrapper[4902]: I1009 14:49:29.876165 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ce606d7-a383-4cc5-85b8-e723fffda771" containerName="container-00" Oct 09 14:49:29 crc kubenswrapper[4902]: E1009 14:49:29.876208 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8775826a-ed71-4110-8622-5e00cd05356b" containerName="registry-server" Oct 09 14:49:29 crc kubenswrapper[4902]: I1009 14:49:29.876217 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="8775826a-ed71-4110-8622-5e00cd05356b" containerName="registry-server" Oct 09 14:49:29 crc kubenswrapper[4902]: E1009 14:49:29.876232 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8775826a-ed71-4110-8622-5e00cd05356b" containerName="extract-content" Oct 09 14:49:29 crc kubenswrapper[4902]: I1009 14:49:29.876239 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="8775826a-ed71-4110-8622-5e00cd05356b" containerName="extract-content" Oct 09 14:49:29 crc kubenswrapper[4902]: E1009 14:49:29.876257 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8775826a-ed71-4110-8622-5e00cd05356b" containerName="extract-utilities" Oct 09 14:49:29 crc kubenswrapper[4902]: I1009 14:49:29.876263 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="8775826a-ed71-4110-8622-5e00cd05356b" containerName="extract-utilities" Oct 09 14:49:29 crc kubenswrapper[4902]: I1009 14:49:29.876482 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="8775826a-ed71-4110-8622-5e00cd05356b" containerName="registry-server" Oct 09 14:49:29 crc kubenswrapper[4902]: I1009 14:49:29.876501 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ce606d7-a383-4cc5-85b8-e723fffda771" containerName="container-00" Oct 09 14:49:29 crc kubenswrapper[4902]: I1009 14:49:29.877181 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-c6hql/crc-debug-kghfg" Oct 09 14:49:29 crc kubenswrapper[4902]: I1009 14:49:29.950124 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kvfb6\" (UniqueName: \"kubernetes.io/projected/bdee3074-1302-4111-8c03-c22f94667569-kube-api-access-kvfb6\") pod \"crc-debug-kghfg\" (UID: \"bdee3074-1302-4111-8c03-c22f94667569\") " pod="openshift-must-gather-c6hql/crc-debug-kghfg" Oct 09 14:49:29 crc kubenswrapper[4902]: I1009 14:49:29.950179 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bdee3074-1302-4111-8c03-c22f94667569-host\") pod \"crc-debug-kghfg\" (UID: \"bdee3074-1302-4111-8c03-c22f94667569\") " pod="openshift-must-gather-c6hql/crc-debug-kghfg" Oct 09 14:49:30 crc kubenswrapper[4902]: I1009 14:49:30.052062 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kvfb6\" (UniqueName: \"kubernetes.io/projected/bdee3074-1302-4111-8c03-c22f94667569-kube-api-access-kvfb6\") pod \"crc-debug-kghfg\" (UID: \"bdee3074-1302-4111-8c03-c22f94667569\") " pod="openshift-must-gather-c6hql/crc-debug-kghfg" Oct 09 14:49:30 crc kubenswrapper[4902]: I1009 14:49:30.052131 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bdee3074-1302-4111-8c03-c22f94667569-host\") pod \"crc-debug-kghfg\" (UID: \"bdee3074-1302-4111-8c03-c22f94667569\") " pod="openshift-must-gather-c6hql/crc-debug-kghfg" Oct 09 14:49:30 crc kubenswrapper[4902]: I1009 14:49:30.052275 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bdee3074-1302-4111-8c03-c22f94667569-host\") pod \"crc-debug-kghfg\" (UID: \"bdee3074-1302-4111-8c03-c22f94667569\") " pod="openshift-must-gather-c6hql/crc-debug-kghfg" Oct 09 14:49:30 crc kubenswrapper[4902]: I1009 14:49:30.070717 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kvfb6\" (UniqueName: \"kubernetes.io/projected/bdee3074-1302-4111-8c03-c22f94667569-kube-api-access-kvfb6\") pod \"crc-debug-kghfg\" (UID: \"bdee3074-1302-4111-8c03-c22f94667569\") " pod="openshift-must-gather-c6hql/crc-debug-kghfg" Oct 09 14:49:30 crc kubenswrapper[4902]: I1009 14:49:30.194477 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-c6hql/crc-debug-kghfg" Oct 09 14:49:30 crc kubenswrapper[4902]: I1009 14:49:30.264491 4902 scope.go:117] "RemoveContainer" containerID="c505ada6c658d045c0a4104b002ab1f1ced2dea3e0c43011d7ff2541aeeb3259" Oct 09 14:49:30 crc kubenswrapper[4902]: I1009 14:49:30.264598 4902 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-c6hql/crc-debug-mt5lv" Oct 09 14:49:31 crc kubenswrapper[4902]: I1009 14:49:31.277980 4902 generic.go:334] "Generic (PLEG): container finished" podID="bdee3074-1302-4111-8c03-c22f94667569" containerID="38d4fa0a3e055c2303b4e198bfeb1d4decddcd47f99fcf9ebb7456882130b7f0" exitCode=0 Oct 09 14:49:31 crc kubenswrapper[4902]: I1009 14:49:31.278578 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-c6hql/crc-debug-kghfg" event={"ID":"bdee3074-1302-4111-8c03-c22f94667569","Type":"ContainerDied","Data":"38d4fa0a3e055c2303b4e198bfeb1d4decddcd47f99fcf9ebb7456882130b7f0"} Oct 09 14:49:31 crc kubenswrapper[4902]: I1009 14:49:31.278617 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-c6hql/crc-debug-kghfg" event={"ID":"bdee3074-1302-4111-8c03-c22f94667569","Type":"ContainerStarted","Data":"38a35d2109046f3c8d76d4c0a8f382cba1d7b0bc9922e5369a30445c9fd62984"} Oct 09 14:49:31 crc kubenswrapper[4902]: I1009 14:49:31.318087 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-c6hql/crc-debug-kghfg"] Oct 09 14:49:31 crc kubenswrapper[4902]: I1009 14:49:31.325870 4902 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-c6hql/crc-debug-kghfg"] Oct 09 14:49:32 crc kubenswrapper[4902]: I1009 14:49:32.413128 4902 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-c6hql/crc-debug-kghfg" Oct 09 14:49:32 crc kubenswrapper[4902]: I1009 14:49:32.501402 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kvfb6\" (UniqueName: \"kubernetes.io/projected/bdee3074-1302-4111-8c03-c22f94667569-kube-api-access-kvfb6\") pod \"bdee3074-1302-4111-8c03-c22f94667569\" (UID: \"bdee3074-1302-4111-8c03-c22f94667569\") " Oct 09 14:49:32 crc kubenswrapper[4902]: I1009 14:49:32.501545 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bdee3074-1302-4111-8c03-c22f94667569-host\") pod \"bdee3074-1302-4111-8c03-c22f94667569\" (UID: \"bdee3074-1302-4111-8c03-c22f94667569\") " Oct 09 14:49:32 crc kubenswrapper[4902]: I1009 14:49:32.501686 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bdee3074-1302-4111-8c03-c22f94667569-host" (OuterVolumeSpecName: "host") pod "bdee3074-1302-4111-8c03-c22f94667569" (UID: "bdee3074-1302-4111-8c03-c22f94667569"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 09 14:49:32 crc kubenswrapper[4902]: I1009 14:49:32.501999 4902 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bdee3074-1302-4111-8c03-c22f94667569-host\") on node \"crc\" DevicePath \"\"" Oct 09 14:49:32 crc kubenswrapper[4902]: I1009 14:49:32.507141 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bdee3074-1302-4111-8c03-c22f94667569-kube-api-access-kvfb6" (OuterVolumeSpecName: "kube-api-access-kvfb6") pod "bdee3074-1302-4111-8c03-c22f94667569" (UID: "bdee3074-1302-4111-8c03-c22f94667569"). InnerVolumeSpecName "kube-api-access-kvfb6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 14:49:32 crc kubenswrapper[4902]: I1009 14:49:32.603697 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kvfb6\" (UniqueName: \"kubernetes.io/projected/bdee3074-1302-4111-8c03-c22f94667569-kube-api-access-kvfb6\") on node \"crc\" DevicePath \"\"" Oct 09 14:49:33 crc kubenswrapper[4902]: I1009 14:49:33.299081 4902 scope.go:117] "RemoveContainer" containerID="38d4fa0a3e055c2303b4e198bfeb1d4decddcd47f99fcf9ebb7456882130b7f0" Oct 09 14:49:33 crc kubenswrapper[4902]: I1009 14:49:33.299458 4902 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-c6hql/crc-debug-kghfg" Oct 09 14:49:33 crc kubenswrapper[4902]: I1009 14:49:33.432169 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-54f57498cd-cv95r_8e22729c-3eef-405e-bf5a-5654f9795d57/barbican-api/0.log" Oct 09 14:49:33 crc kubenswrapper[4902]: I1009 14:49:33.526093 4902 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bdee3074-1302-4111-8c03-c22f94667569" path="/var/lib/kubelet/pods/bdee3074-1302-4111-8c03-c22f94667569/volumes" Oct 09 14:49:33 crc kubenswrapper[4902]: I1009 14:49:33.594728 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-54f57498cd-cv95r_8e22729c-3eef-405e-bf5a-5654f9795d57/barbican-api-log/0.log" Oct 09 14:49:33 crc kubenswrapper[4902]: I1009 14:49:33.673609 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-7dbd9b6574-b5dht_11b3f7c7-66a5-485c-922e-b5568e2f9f1c/barbican-keystone-listener/0.log" Oct 09 14:49:33 crc kubenswrapper[4902]: I1009 14:49:33.753538 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-7dbd9b6574-b5dht_11b3f7c7-66a5-485c-922e-b5568e2f9f1c/barbican-keystone-listener-log/0.log" Oct 09 14:49:33 crc kubenswrapper[4902]: I1009 14:49:33.863147 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-896cb696f-kkg85_fe538ee2-2e8c-406f-8e70-bc56325ec408/barbican-worker/0.log" Oct 09 14:49:33 crc kubenswrapper[4902]: I1009 14:49:33.947026 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-896cb696f-kkg85_fe538ee2-2e8c-406f-8e70-bc56325ec408/barbican-worker-log/0.log" Oct 09 14:49:34 crc kubenswrapper[4902]: I1009 14:49:34.095796 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-7x26n_705cf92b-1b0d-4706-bf30-03fb1a9728cd/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Oct 09 14:49:34 crc kubenswrapper[4902]: I1009 14:49:34.179142 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_a4dab6bb-60d0-4984-91e7-2013f341a39d/ceilometer-central-agent/0.log" Oct 09 14:49:34 crc kubenswrapper[4902]: I1009 14:49:34.293298 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_a4dab6bb-60d0-4984-91e7-2013f341a39d/ceilometer-notification-agent/0.log" Oct 09 14:49:34 crc kubenswrapper[4902]: I1009 14:49:34.317177 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_a4dab6bb-60d0-4984-91e7-2013f341a39d/proxy-httpd/0.log" Oct 09 14:49:34 crc kubenswrapper[4902]: I1009 14:49:34.371603 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_a4dab6bb-60d0-4984-91e7-2013f341a39d/sg-core/0.log" Oct 09 14:49:34 crc kubenswrapper[4902]: 
I1009 14:49:34.479247 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_d7c2affc-5952-43d0-8629-8e61961bdf1c/cinder-api-log/0.log" Oct 09 14:49:34 crc kubenswrapper[4902]: I1009 14:49:34.533134 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_d7c2affc-5952-43d0-8629-8e61961bdf1c/cinder-api/0.log" Oct 09 14:49:34 crc kubenswrapper[4902]: I1009 14:49:34.634543 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_676825a2-3e5b-4137-b9fc-337425ff8d09/cinder-scheduler/0.log" Oct 09 14:49:34 crc kubenswrapper[4902]: I1009 14:49:34.725617 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_676825a2-3e5b-4137-b9fc-337425ff8d09/probe/0.log" Oct 09 14:49:34 crc kubenswrapper[4902]: I1009 14:49:34.862556 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-w7kbg_18dcb9ba-f068-421a-a11c-f25f2b7c940a/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Oct 09 14:49:34 crc kubenswrapper[4902]: I1009 14:49:34.912120 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-9bdh4_7b94bcc5-6ba7-44a6-aca8-ffaa0b52e89e/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 09 14:49:35 crc kubenswrapper[4902]: I1009 14:49:35.126115 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-f9f9h_77b43a36-9858-4efc-aa2b-a56278710389/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 09 14:49:35 crc kubenswrapper[4902]: I1009 14:49:35.152309 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-cb6ffcf87-8b4gd_bdf337ce-e7d5-4de9-acb8-a98a481a8ab3/init/0.log" Oct 09 14:49:35 crc kubenswrapper[4902]: I1009 14:49:35.368877 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-cb6ffcf87-8b4gd_bdf337ce-e7d5-4de9-acb8-a98a481a8ab3/init/0.log" Oct 09 14:49:35 crc kubenswrapper[4902]: I1009 14:49:35.415119 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-cb6ffcf87-8b4gd_bdf337ce-e7d5-4de9-acb8-a98a481a8ab3/dnsmasq-dns/0.log" Oct 09 14:49:35 crc kubenswrapper[4902]: I1009 14:49:35.478250 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-tbqw8_f53eb372-afb1-4f71-b4b3-eb4b36483e5e/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Oct 09 14:49:35 crc kubenswrapper[4902]: I1009 14:49:35.675875 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_9c0515fd-b685-4dab-909a-3f4147e19a59/glance-httpd/0.log" Oct 09 14:49:35 crc kubenswrapper[4902]: I1009 14:49:35.746441 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_9c0515fd-b685-4dab-909a-3f4147e19a59/glance-log/0.log" Oct 09 14:49:35 crc kubenswrapper[4902]: I1009 14:49:35.847248 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_df90d442-7261-4353-821c-c0e71a43998a/glance-httpd/0.log" Oct 09 14:49:35 crc kubenswrapper[4902]: I1009 14:49:35.903171 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_df90d442-7261-4353-821c-c0e71a43998a/glance-log/0.log" Oct 09 14:49:36 crc kubenswrapper[4902]: I1009 14:49:36.170157 
4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-779d95f9fb-tfjvq_40e0f94d-30a4-456b-bfd4-7da1453facc4/horizon/0.log" Oct 09 14:49:36 crc kubenswrapper[4902]: I1009 14:49:36.264172 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-gp9mr_813f74d2-a7a6-4e97-983b-544c38995262/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Oct 09 14:49:36 crc kubenswrapper[4902]: I1009 14:49:36.431813 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-779d95f9fb-tfjvq_40e0f94d-30a4-456b-bfd4-7da1453facc4/horizon-log/0.log" Oct 09 14:49:36 crc kubenswrapper[4902]: I1009 14:49:36.455824 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-lsz7p_9dd56ad2-b36d-4850-81d2-b06db395ecd6/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 09 14:49:36 crc kubenswrapper[4902]: I1009 14:49:36.696232 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_dde7a697-a373-4e6a-8535-d7768f569e18/kube-state-metrics/0.log" Oct 09 14:49:36 crc kubenswrapper[4902]: I1009 14:49:36.811849 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-6d55545c5f-lff8v_b32b124f-e090-4e6c-b2b7-138e1059b680/keystone-api/0.log" Oct 09 14:49:37 crc kubenswrapper[4902]: I1009 14:49:37.006284 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-tpxhw_99af5091-b31f-45c0-abcf-882b0159219f/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Oct 09 14:49:37 crc kubenswrapper[4902]: I1009 14:49:37.358779 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-65878bc9b7-hv97v_b17f63fc-0163-416e-a3ee-179a1a071560/neutron-httpd/0.log" Oct 09 14:49:37 crc kubenswrapper[4902]: I1009 14:49:37.423583 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-65878bc9b7-hv97v_b17f63fc-0163-416e-a3ee-179a1a071560/neutron-api/0.log" Oct 09 14:49:37 crc kubenswrapper[4902]: I1009 14:49:37.509136 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-zb49n_1c9a61d2-e7f4-4e22-8b3b-18263b72df09/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Oct 09 14:49:38 crc kubenswrapper[4902]: I1009 14:49:38.035529 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_61f63f9a-89ab-45fa-b62f-e93f826423a9/nova-cell0-conductor-conductor/0.log" Oct 09 14:49:38 crc kubenswrapper[4902]: I1009 14:49:38.082756 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_7a597676-c413-4919-a79b-ac49dd2671c2/nova-api-log/0.log" Oct 09 14:49:38 crc kubenswrapper[4902]: I1009 14:49:38.304681 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_7a597676-c413-4919-a79b-ac49dd2671c2/nova-api-api/0.log" Oct 09 14:49:38 crc kubenswrapper[4902]: I1009 14:49:38.371822 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_3fe652a7-7ee4-4a55-9f17-a359b82df106/nova-cell1-conductor-conductor/0.log" Oct 09 14:49:38 crc kubenswrapper[4902]: I1009 14:49:38.390093 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_2bc33878-b734-4520-b5e0-e066f53dbe31/nova-cell1-novncproxy-novncproxy/0.log" Oct 09 14:49:38 crc kubenswrapper[4902]: I1009 
14:49:38.606144 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-f9m9b_a5fc156b-09f2-4647-a2df-73877fb9db6f/nova-edpm-deployment-openstack-edpm-ipam/0.log" Oct 09 14:49:38 crc kubenswrapper[4902]: I1009 14:49:38.815842 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_a272cc22-f1dc-48b7-89ef-e4578877aa78/nova-metadata-log/0.log" Oct 09 14:49:39 crc kubenswrapper[4902]: I1009 14:49:39.051471 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_b1206eaa-959a-4a3d-8c35-a60fc09bb3d5/nova-scheduler-scheduler/0.log" Oct 09 14:49:39 crc kubenswrapper[4902]: I1009 14:49:39.101119 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_a4dec46d-073a-484c-ba80-0ff939025e48/mysql-bootstrap/0.log" Oct 09 14:49:39 crc kubenswrapper[4902]: I1009 14:49:39.337555 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_a4dec46d-073a-484c-ba80-0ff939025e48/mysql-bootstrap/0.log" Oct 09 14:49:39 crc kubenswrapper[4902]: I1009 14:49:39.345945 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_a4dec46d-073a-484c-ba80-0ff939025e48/galera/0.log" Oct 09 14:49:39 crc kubenswrapper[4902]: I1009 14:49:39.706160 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_4bb5cfb3-cfb7-4408-bbe1-3bf59b44614e/mysql-bootstrap/0.log" Oct 09 14:49:39 crc kubenswrapper[4902]: I1009 14:49:39.920728 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_4bb5cfb3-cfb7-4408-bbe1-3bf59b44614e/galera/0.log" Oct 09 14:49:39 crc kubenswrapper[4902]: I1009 14:49:39.986778 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_4bb5cfb3-cfb7-4408-bbe1-3bf59b44614e/mysql-bootstrap/0.log" Oct 09 14:49:40 crc kubenswrapper[4902]: I1009 14:49:40.041448 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_a272cc22-f1dc-48b7-89ef-e4578877aa78/nova-metadata-metadata/0.log" Oct 09 14:49:40 crc kubenswrapper[4902]: I1009 14:49:40.099260 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_594a9127-c741-4bb1-871f-0295abab43ce/openstackclient/0.log" Oct 09 14:49:40 crc kubenswrapper[4902]: I1009 14:49:40.220535 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-79djm_7f0722a8-eee2-4bb1-a3b4-d14964d35227/ovn-controller/0.log" Oct 09 14:49:40 crc kubenswrapper[4902]: I1009 14:49:40.319926 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-pm8hp_aa516b5f-aa20-4f8a-bf1c-3bba6bb72ffb/openstack-network-exporter/0.log" Oct 09 14:49:40 crc kubenswrapper[4902]: I1009 14:49:40.543056 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-rlj4x_007b48e7-2e7a-45e6-bc70-1c86a275d808/ovsdb-server-init/0.log" Oct 09 14:49:40 crc kubenswrapper[4902]: I1009 14:49:40.686086 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-rlj4x_007b48e7-2e7a-45e6-bc70-1c86a275d808/ovs-vswitchd/0.log" Oct 09 14:49:40 crc kubenswrapper[4902]: I1009 14:49:40.699571 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-rlj4x_007b48e7-2e7a-45e6-bc70-1c86a275d808/ovsdb-server-init/0.log" Oct 09 14:49:40 crc kubenswrapper[4902]: 
I1009 14:49:40.732602 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-rlj4x_007b48e7-2e7a-45e6-bc70-1c86a275d808/ovsdb-server/0.log" Oct 09 14:49:40 crc kubenswrapper[4902]: I1009 14:49:40.927844 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_22ef2931-973d-462a-ae3a-d05056c72468/openstack-network-exporter/0.log" Oct 09 14:49:40 crc kubenswrapper[4902]: I1009 14:49:40.987151 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-ndvq8_9478bb2f-ce46-41f9-bfbd-e93ebcb437ed/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Oct 09 14:49:41 crc kubenswrapper[4902]: I1009 14:49:41.098036 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_22ef2931-973d-462a-ae3a-d05056c72468/ovn-northd/0.log" Oct 09 14:49:41 crc kubenswrapper[4902]: I1009 14:49:41.212189 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_ea130bce-ed3c-495f-b06b-14278e3133ca/openstack-network-exporter/0.log" Oct 09 14:49:41 crc kubenswrapper[4902]: I1009 14:49:41.253340 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_ea130bce-ed3c-495f-b06b-14278e3133ca/ovsdbserver-nb/0.log" Oct 09 14:49:41 crc kubenswrapper[4902]: I1009 14:49:41.462586 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_1676b099-df2c-477b-a05b-b46d47dc3b05/ovsdbserver-sb/0.log" Oct 09 14:49:41 crc kubenswrapper[4902]: I1009 14:49:41.463460 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_1676b099-df2c-477b-a05b-b46d47dc3b05/openstack-network-exporter/0.log" Oct 09 14:49:41 crc kubenswrapper[4902]: I1009 14:49:41.689190 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-864577bbd8-z8v7t_9e8ddce2-e9c0-4dc6-8bcd-d228188630dc/placement-api/0.log" Oct 09 14:49:41 crc kubenswrapper[4902]: I1009 14:49:41.749719 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-864577bbd8-z8v7t_9e8ddce2-e9c0-4dc6-8bcd-d228188630dc/placement-log/0.log" Oct 09 14:49:41 crc kubenswrapper[4902]: I1009 14:49:41.816073 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_ef20e2e8-fcf0-438a-80a3-fd50db544b6e/setup-container/0.log" Oct 09 14:49:42 crc kubenswrapper[4902]: I1009 14:49:42.043312 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_ef20e2e8-fcf0-438a-80a3-fd50db544b6e/rabbitmq/0.log" Oct 09 14:49:42 crc kubenswrapper[4902]: I1009 14:49:42.082576 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_ef20e2e8-fcf0-438a-80a3-fd50db544b6e/setup-container/0.log" Oct 09 14:49:42 crc kubenswrapper[4902]: I1009 14:49:42.091486 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_f0f31142-f615-421c-a863-1603f1cb31a0/setup-container/0.log" Oct 09 14:49:42 crc kubenswrapper[4902]: I1009 14:49:42.348012 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_f0f31142-f615-421c-a863-1603f1cb31a0/setup-container/0.log" Oct 09 14:49:42 crc kubenswrapper[4902]: I1009 14:49:42.382919 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_f0f31142-f615-421c-a863-1603f1cb31a0/rabbitmq/0.log" Oct 09 14:49:42 crc kubenswrapper[4902]: I1009 14:49:42.389628 4902 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-rdp4z_f3d3271d-29ab-4339-8614-a297a2b8791f/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 09 14:49:42 crc kubenswrapper[4902]: I1009 14:49:42.614530 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-ghc9w_2cfa6eb9-6c46-4420-a165-d7a1c4d7713a/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Oct 09 14:49:42 crc kubenswrapper[4902]: I1009 14:49:42.647382 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-h4wt2_dc688ac8-1f96-4a97-adf2-151b28cca357/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Oct 09 14:49:42 crc kubenswrapper[4902]: I1009 14:49:42.881920 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-dss8m_21ce0f21-ea6d-4bed-b68b-2573a40c5443/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 09 14:49:42 crc kubenswrapper[4902]: I1009 14:49:42.887194 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-q5kpf_582550a8-6e34-4a07-97af-70e3770fedcd/ssh-known-hosts-edpm-deployment/0.log" Oct 09 14:49:43 crc kubenswrapper[4902]: I1009 14:49:43.205935 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-6568f7cff-cv7qx_797c027a-6081-4aa8-9643-ddffc4393193/proxy-server/0.log" Oct 09 14:49:43 crc kubenswrapper[4902]: I1009 14:49:43.246117 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-6568f7cff-cv7qx_797c027a-6081-4aa8-9643-ddffc4393193/proxy-httpd/0.log" Oct 09 14:49:43 crc kubenswrapper[4902]: I1009 14:49:43.383795 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fa80b7ed-e420-455b-a918-d474c0453547/account-auditor/0.log" Oct 09 14:49:43 crc kubenswrapper[4902]: I1009 14:49:43.385118 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-khn8g_014a8355-9817-424e-ae75-b786043b2a4c/swift-ring-rebalance/0.log" Oct 09 14:49:43 crc kubenswrapper[4902]: I1009 14:49:43.495222 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fa80b7ed-e420-455b-a918-d474c0453547/account-reaper/0.log" Oct 09 14:49:43 crc kubenswrapper[4902]: I1009 14:49:43.638438 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fa80b7ed-e420-455b-a918-d474c0453547/account-server/0.log" Oct 09 14:49:43 crc kubenswrapper[4902]: I1009 14:49:43.653486 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fa80b7ed-e420-455b-a918-d474c0453547/account-replicator/0.log" Oct 09 14:49:43 crc kubenswrapper[4902]: I1009 14:49:43.705777 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fa80b7ed-e420-455b-a918-d474c0453547/container-auditor/0.log" Oct 09 14:49:43 crc kubenswrapper[4902]: I1009 14:49:43.762688 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fa80b7ed-e420-455b-a918-d474c0453547/container-replicator/0.log" Oct 09 14:49:43 crc kubenswrapper[4902]: I1009 14:49:43.840033 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fa80b7ed-e420-455b-a918-d474c0453547/container-server/0.log" Oct 09 14:49:43 crc kubenswrapper[4902]: I1009 14:49:43.909286 4902 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_swift-storage-0_fa80b7ed-e420-455b-a918-d474c0453547/object-auditor/0.log" Oct 09 14:49:43 crc kubenswrapper[4902]: I1009 14:49:43.916565 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fa80b7ed-e420-455b-a918-d474c0453547/container-updater/0.log" Oct 09 14:49:44 crc kubenswrapper[4902]: I1009 14:49:44.025071 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fa80b7ed-e420-455b-a918-d474c0453547/object-expirer/0.log" Oct 09 14:49:44 crc kubenswrapper[4902]: I1009 14:49:44.079833 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fa80b7ed-e420-455b-a918-d474c0453547/object-server/0.log" Oct 09 14:49:44 crc kubenswrapper[4902]: I1009 14:49:44.093534 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fa80b7ed-e420-455b-a918-d474c0453547/object-replicator/0.log" Oct 09 14:49:44 crc kubenswrapper[4902]: I1009 14:49:44.146923 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fa80b7ed-e420-455b-a918-d474c0453547/object-updater/0.log" Oct 09 14:49:44 crc kubenswrapper[4902]: I1009 14:49:44.268458 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fa80b7ed-e420-455b-a918-d474c0453547/rsync/0.log" Oct 09 14:49:44 crc kubenswrapper[4902]: I1009 14:49:44.326594 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fa80b7ed-e420-455b-a918-d474c0453547/swift-recon-cron/0.log" Oct 09 14:49:44 crc kubenswrapper[4902]: I1009 14:49:44.494501 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-p8rnm_40011150-b1be-4ddc-8ecf-b70c54c98b9c/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Oct 09 14:49:44 crc kubenswrapper[4902]: I1009 14:49:44.576739 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_ee0ede17-c9e7-40c7-b2da-ac04b4df9010/tempest-tests-tempest-tests-runner/0.log" Oct 09 14:49:44 crc kubenswrapper[4902]: I1009 14:49:44.697059 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_b2bc8eb5-06bc-4813-86fd-e96c9f53fd94/test-operator-logs-container/0.log" Oct 09 14:49:44 crc kubenswrapper[4902]: I1009 14:49:44.830654 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-q62lf_17bda176-986b-468d-b839-a58df9d3cf58/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Oct 09 14:49:50 crc kubenswrapper[4902]: I1009 14:49:50.077726 4902 patch_prober.go:28] interesting pod/machine-config-daemon-gbt7s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 14:49:50 crc kubenswrapper[4902]: I1009 14:49:50.078291 4902 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" podUID="6cfbac91-e798-4e5e-9f3c-f454ea6f457e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 14:49:54 crc kubenswrapper[4902]: I1009 14:49:54.995324 4902 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_memcached-0_590e6023-7dbe-499f-a8ea-4b8c3e24f747/memcached/0.log" Oct 09 14:50:08 crc kubenswrapper[4902]: I1009 14:50:08.229098 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-64f84fcdbb-r4nn8_ac925db8-cb97-468e-b43f-b219deb78cf6/manager/0.log" Oct 09 14:50:08 crc kubenswrapper[4902]: I1009 14:50:08.247649 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-64f84fcdbb-r4nn8_ac925db8-cb97-468e-b43f-b219deb78cf6/kube-rbac-proxy/0.log" Oct 09 14:50:08 crc kubenswrapper[4902]: I1009 14:50:08.408603 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-59cdc64769-hrgs4_a1fcc021-b92b-417d-b92c-4e66386e8502/kube-rbac-proxy/0.log" Oct 09 14:50:08 crc kubenswrapper[4902]: I1009 14:50:08.449911 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-59cdc64769-hrgs4_a1fcc021-b92b-417d-b92c-4e66386e8502/manager/0.log" Oct 09 14:50:08 crc kubenswrapper[4902]: I1009 14:50:08.648985 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-687df44cdb-pxsfz_c27e1a63-1155-43eb-9c97-61680f083de0/kube-rbac-proxy/0.log" Oct 09 14:50:08 crc kubenswrapper[4902]: I1009 14:50:08.650706 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-687df44cdb-pxsfz_c27e1a63-1155-43eb-9c97-61680f083de0/manager/0.log" Oct 09 14:50:08 crc kubenswrapper[4902]: I1009 14:50:08.673728 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_f3be4bfbef7f7c1d153ec042185ac5a6497a28f6f13f17f0b43e1db33258rfq_4b9e579a-5f0a-4d30-8792-fd71075c1479/util/0.log" Oct 09 14:50:08 crc kubenswrapper[4902]: I1009 14:50:08.887259 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_f3be4bfbef7f7c1d153ec042185ac5a6497a28f6f13f17f0b43e1db33258rfq_4b9e579a-5f0a-4d30-8792-fd71075c1479/util/0.log" Oct 09 14:50:08 crc kubenswrapper[4902]: I1009 14:50:08.896491 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_f3be4bfbef7f7c1d153ec042185ac5a6497a28f6f13f17f0b43e1db33258rfq_4b9e579a-5f0a-4d30-8792-fd71075c1479/pull/0.log" Oct 09 14:50:08 crc kubenswrapper[4902]: I1009 14:50:08.943692 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_f3be4bfbef7f7c1d153ec042185ac5a6497a28f6f13f17f0b43e1db33258rfq_4b9e579a-5f0a-4d30-8792-fd71075c1479/pull/0.log" Oct 09 14:50:09 crc kubenswrapper[4902]: I1009 14:50:09.118398 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_f3be4bfbef7f7c1d153ec042185ac5a6497a28f6f13f17f0b43e1db33258rfq_4b9e579a-5f0a-4d30-8792-fd71075c1479/util/0.log" Oct 09 14:50:09 crc kubenswrapper[4902]: I1009 14:50:09.134732 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_f3be4bfbef7f7c1d153ec042185ac5a6497a28f6f13f17f0b43e1db33258rfq_4b9e579a-5f0a-4d30-8792-fd71075c1479/extract/0.log" Oct 09 14:50:09 crc kubenswrapper[4902]: I1009 14:50:09.143780 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_f3be4bfbef7f7c1d153ec042185ac5a6497a28f6f13f17f0b43e1db33258rfq_4b9e579a-5f0a-4d30-8792-fd71075c1479/pull/0.log" Oct 09 14:50:09 crc kubenswrapper[4902]: I1009 14:50:09.356766 4902 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-7bb46cd7d-2nkqk_6169ab22-9b0b-4bb3-b840-b3eb92d22c0c/kube-rbac-proxy/0.log" Oct 09 14:50:09 crc kubenswrapper[4902]: I1009 14:50:09.418120 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-7bb46cd7d-2nkqk_6169ab22-9b0b-4bb3-b840-b3eb92d22c0c/manager/0.log" Oct 09 14:50:09 crc kubenswrapper[4902]: I1009 14:50:09.436929 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-6d9967f8dd-7bjrw_960aab4c-ce86-4753-b848-3367f15d962c/kube-rbac-proxy/0.log" Oct 09 14:50:09 crc kubenswrapper[4902]: I1009 14:50:09.577765 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-6d9967f8dd-7bjrw_960aab4c-ce86-4753-b848-3367f15d962c/manager/0.log" Oct 09 14:50:09 crc kubenswrapper[4902]: I1009 14:50:09.631733 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-6d74794d9b-phbbt_39e518d9-bffd-4421-bc8b-2b333654ff9e/kube-rbac-proxy/0.log" Oct 09 14:50:09 crc kubenswrapper[4902]: I1009 14:50:09.662073 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-6d74794d9b-phbbt_39e518d9-bffd-4421-bc8b-2b333654ff9e/manager/0.log" Oct 09 14:50:09 crc kubenswrapper[4902]: I1009 14:50:09.768008 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-585fc5b659-6xw4k_92649c6e-71ba-4945-9210-19394d180222/kube-rbac-proxy/0.log" Oct 09 14:50:09 crc kubenswrapper[4902]: I1009 14:50:09.959531 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-74cb5cbc49-ql5w7_dad70f9e-3fb4-41ff-95f4-dc6be5277aa0/kube-rbac-proxy/0.log" Oct 09 14:50:10 crc kubenswrapper[4902]: I1009 14:50:10.030742 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-74cb5cbc49-ql5w7_dad70f9e-3fb4-41ff-95f4-dc6be5277aa0/manager/0.log" Oct 09 14:50:10 crc kubenswrapper[4902]: I1009 14:50:10.073111 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-585fc5b659-6xw4k_92649c6e-71ba-4945-9210-19394d180222/manager/0.log" Oct 09 14:50:10 crc kubenswrapper[4902]: I1009 14:50:10.162655 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-ddb98f99b-7bzzg_d11828c7-488d-414a-a024-68a46fca78e1/kube-rbac-proxy/0.log" Oct 09 14:50:10 crc kubenswrapper[4902]: I1009 14:50:10.272008 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-ddb98f99b-7bzzg_d11828c7-488d-414a-a024-68a46fca78e1/manager/0.log" Oct 09 14:50:10 crc kubenswrapper[4902]: I1009 14:50:10.309796 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-59578bc799-kvzf6_4ad4e07d-4f69-4f1f-9886-bde91ec3b735/kube-rbac-proxy/0.log" Oct 09 14:50:10 crc kubenswrapper[4902]: I1009 14:50:10.391468 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-59578bc799-kvzf6_4ad4e07d-4f69-4f1f-9886-bde91ec3b735/manager/0.log" Oct 09 14:50:10 crc kubenswrapper[4902]: I1009 14:50:10.463699 4902 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-5777b4f897-hxrth_07f5d370-e69d-41f8-b65a-d25dc8b38de8/kube-rbac-proxy/0.log" Oct 09 14:50:10 crc kubenswrapper[4902]: I1009 14:50:10.517512 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-5777b4f897-hxrth_07f5d370-e69d-41f8-b65a-d25dc8b38de8/manager/0.log" Oct 09 14:50:10 crc kubenswrapper[4902]: I1009 14:50:10.659401 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-797d478b46-ld6xx_0dafd5d3-f605-4c75-86ca-8d40831e9cb7/kube-rbac-proxy/0.log" Oct 09 14:50:10 crc kubenswrapper[4902]: I1009 14:50:10.710103 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-797d478b46-ld6xx_0dafd5d3-f605-4c75-86ca-8d40831e9cb7/manager/0.log" Oct 09 14:50:10 crc kubenswrapper[4902]: I1009 14:50:10.803292 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-57bb74c7bf-s7bkj_5be11d13-4feb-4a12-9f9b-69a99d2fa5a4/kube-rbac-proxy/0.log" Oct 09 14:50:10 crc kubenswrapper[4902]: I1009 14:50:10.948158 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-6d7c7ddf95-mqn8s_cad34d91-d544-4311-a9b3-adb11e4217c0/kube-rbac-proxy/0.log" Oct 09 14:50:10 crc kubenswrapper[4902]: I1009 14:50:10.966133 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-57bb74c7bf-s7bkj_5be11d13-4feb-4a12-9f9b-69a99d2fa5a4/manager/0.log" Oct 09 14:50:11 crc kubenswrapper[4902]: I1009 14:50:11.036139 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-6d7c7ddf95-mqn8s_cad34d91-d544-4311-a9b3-adb11e4217c0/manager/0.log" Oct 09 14:50:11 crc kubenswrapper[4902]: I1009 14:50:11.126082 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-6cc7fb757db22ds_d5375300-657d-4e1d-92af-2107cbc7972f/kube-rbac-proxy/0.log" Oct 09 14:50:11 crc kubenswrapper[4902]: I1009 14:50:11.154761 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-6cc7fb757db22ds_d5375300-657d-4e1d-92af-2107cbc7972f/manager/0.log" Oct 09 14:50:11 crc kubenswrapper[4902]: I1009 14:50:11.358187 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-d5f574b49-xxs9l_cf6f1e72-9e96-4905-a7f4-d88ec796724e/kube-rbac-proxy/0.log" Oct 09 14:50:11 crc kubenswrapper[4902]: I1009 14:50:11.577684 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-647744f6c-bzqqk_994ae404-6c3b-499c-b51d-e5a0eea83756/kube-rbac-proxy/0.log" Oct 09 14:50:11 crc kubenswrapper[4902]: I1009 14:50:11.729102 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-647744f6c-bzqqk_994ae404-6c3b-499c-b51d-e5a0eea83756/operator/0.log" Oct 09 14:50:11 crc kubenswrapper[4902]: I1009 14:50:11.837077 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-dv7t7_468c32be-1138-4600-bcd2-85aa8b02ec69/registry-server/0.log" Oct 09 14:50:12 crc 
kubenswrapper[4902]: I1009 14:50:12.018890 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-869cc7797f-dwgtb_3adf1a7b-f2b7-4927-a026-55afe09bc5ab/kube-rbac-proxy/0.log" Oct 09 14:50:12 crc kubenswrapper[4902]: I1009 14:50:12.130151 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-869cc7797f-dwgtb_3adf1a7b-f2b7-4927-a026-55afe09bc5ab/manager/0.log" Oct 09 14:50:12 crc kubenswrapper[4902]: I1009 14:50:12.242562 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-664664cb68-s6zmh_52ec0675-fab2-43fd-a447-8896de9e78fd/manager/0.log" Oct 09 14:50:12 crc kubenswrapper[4902]: I1009 14:50:12.252045 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-664664cb68-s6zmh_52ec0675-fab2-43fd-a447-8896de9e78fd/kube-rbac-proxy/0.log" Oct 09 14:50:12 crc kubenswrapper[4902]: I1009 14:50:12.487930 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-5f97d8c699-8jw5l_93ae4a6d-1e42-4e45-8128-61088861873e/operator/0.log" Oct 09 14:50:12 crc kubenswrapper[4902]: I1009 14:50:12.556357 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-d5f574b49-xxs9l_cf6f1e72-9e96-4905-a7f4-d88ec796724e/manager/0.log" Oct 09 14:50:12 crc kubenswrapper[4902]: I1009 14:50:12.567388 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5f4d5dfdc6-xlwpm_d972447c-10cf-4d4b-870d-11e79f6bd98a/kube-rbac-proxy/0.log" Oct 09 14:50:12 crc kubenswrapper[4902]: I1009 14:50:12.642977 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5f4d5dfdc6-xlwpm_d972447c-10cf-4d4b-870d-11e79f6bd98a/manager/0.log" Oct 09 14:50:12 crc kubenswrapper[4902]: I1009 14:50:12.707786 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-578874c84d-hn9gv_55efad12-5eb1-4c57-bb2f-700ead538209/kube-rbac-proxy/0.log" Oct 09 14:50:12 crc kubenswrapper[4902]: I1009 14:50:12.797141 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-578874c84d-hn9gv_55efad12-5eb1-4c57-bb2f-700ead538209/manager/0.log" Oct 09 14:50:12 crc kubenswrapper[4902]: I1009 14:50:12.848157 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-ffcdd6c94-7lct5_183bbfe9-141b-4b7a-adc1-3ea01011ebd7/kube-rbac-proxy/0.log" Oct 09 14:50:12 crc kubenswrapper[4902]: I1009 14:50:12.881037 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-ffcdd6c94-7lct5_183bbfe9-141b-4b7a-adc1-3ea01011ebd7/manager/0.log" Oct 09 14:50:12 crc kubenswrapper[4902]: I1009 14:50:12.995091 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-646675d848-9lx7z_be1ef3d2-4c04-4040-9d73-80655f4b9dbb/kube-rbac-proxy/0.log" Oct 09 14:50:13 crc kubenswrapper[4902]: I1009 14:50:13.017495 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-646675d848-9lx7z_be1ef3d2-4c04-4040-9d73-80655f4b9dbb/manager/0.log" Oct 09 
14:50:20 crc kubenswrapper[4902]: I1009 14:50:20.078299 4902 patch_prober.go:28] interesting pod/machine-config-daemon-gbt7s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 14:50:20 crc kubenswrapper[4902]: I1009 14:50:20.078872 4902 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" podUID="6cfbac91-e798-4e5e-9f3c-f454ea6f457e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 14:50:27 crc kubenswrapper[4902]: I1009 14:50:27.292256 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-pqbtl_7ae518f0-243e-4916-89cb-0e621793d4db/control-plane-machine-set-operator/0.log" Oct 09 14:50:27 crc kubenswrapper[4902]: I1009 14:50:27.396779 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-bszj2_6f99a811-543c-4b99-a394-9d941401efff/kube-rbac-proxy/0.log" Oct 09 14:50:27 crc kubenswrapper[4902]: I1009 14:50:27.444255 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-bszj2_6f99a811-543c-4b99-a394-9d941401efff/machine-api-operator/0.log" Oct 09 14:50:38 crc kubenswrapper[4902]: I1009 14:50:38.132115 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-jhttd_8ae8ae73-6077-47a8-b43e-e91ab13101e6/cert-manager-controller/0.log" Oct 09 14:50:38 crc kubenswrapper[4902]: I1009 14:50:38.325454 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-xklfb_85edb63a-b99a-48b7-bdf7-285b37466b22/cert-manager-cainjector/0.log" Oct 09 14:50:38 crc kubenswrapper[4902]: I1009 14:50:38.366516 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-mn6px_425830e3-71c9-4b86-86d3-3f49d61b6cab/cert-manager-webhook/0.log" Oct 09 14:50:49 crc kubenswrapper[4902]: I1009 14:50:49.497840 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-6b874cbd85-whhgs_b921f094-bf55-4b3e-8dd1-5f1d34a1336e/nmstate-console-plugin/0.log" Oct 09 14:50:49 crc kubenswrapper[4902]: I1009 14:50:49.658758 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-jddjt_fcca1450-5178-488f-8ba6-b290ea61a2fb/nmstate-handler/0.log" Oct 09 14:50:49 crc kubenswrapper[4902]: I1009 14:50:49.742300 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-fdff9cb8d-t5rg7_a526cd44-35b6-4800-bb53-fc7e1e6d96f8/kube-rbac-proxy/0.log" Oct 09 14:50:49 crc kubenswrapper[4902]: I1009 14:50:49.765839 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-fdff9cb8d-t5rg7_a526cd44-35b6-4800-bb53-fc7e1e6d96f8/nmstate-metrics/0.log" Oct 09 14:50:49 crc kubenswrapper[4902]: I1009 14:50:49.927909 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-858ddd8f98-rt479_e458690a-7a6b-4b1f-92e3-a93667bf1d60/nmstate-operator/0.log" Oct 09 14:50:49 crc kubenswrapper[4902]: I1009 14:50:49.980928 4902 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-nmstate_nmstate-webhook-6cdbc54649-45tq4_47ba7105-e136-4d4e-8db2-5bb2edfb5a7b/nmstate-webhook/0.log" Oct 09 14:50:50 crc kubenswrapper[4902]: I1009 14:50:50.077865 4902 patch_prober.go:28] interesting pod/machine-config-daemon-gbt7s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 14:50:50 crc kubenswrapper[4902]: I1009 14:50:50.077955 4902 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" podUID="6cfbac91-e798-4e5e-9f3c-f454ea6f457e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 14:50:50 crc kubenswrapper[4902]: I1009 14:50:50.078017 4902 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" Oct 09 14:50:50 crc kubenswrapper[4902]: I1009 14:50:50.078946 4902 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3c92344b7d9241fd1bd747690a712747aa6c548ba1f4b8118607a636810a1726"} pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 09 14:50:50 crc kubenswrapper[4902]: I1009 14:50:50.079022 4902 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" podUID="6cfbac91-e798-4e5e-9f3c-f454ea6f457e" containerName="machine-config-daemon" containerID="cri-o://3c92344b7d9241fd1bd747690a712747aa6c548ba1f4b8118607a636810a1726" gracePeriod=600 Oct 09 14:50:50 crc kubenswrapper[4902]: E1009 14:50:50.203501 4902 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gbt7s_openshift-machine-config-operator(6cfbac91-e798-4e5e-9f3c-f454ea6f457e)\"" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" podUID="6cfbac91-e798-4e5e-9f3c-f454ea6f457e" Oct 09 14:50:51 crc kubenswrapper[4902]: I1009 14:50:51.012825 4902 generic.go:334] "Generic (PLEG): container finished" podID="6cfbac91-e798-4e5e-9f3c-f454ea6f457e" containerID="3c92344b7d9241fd1bd747690a712747aa6c548ba1f4b8118607a636810a1726" exitCode=0 Oct 09 14:50:51 crc kubenswrapper[4902]: I1009 14:50:51.013019 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" event={"ID":"6cfbac91-e798-4e5e-9f3c-f454ea6f457e","Type":"ContainerDied","Data":"3c92344b7d9241fd1bd747690a712747aa6c548ba1f4b8118607a636810a1726"} Oct 09 14:50:51 crc kubenswrapper[4902]: I1009 14:50:51.013208 4902 scope.go:117] "RemoveContainer" containerID="836b83499a7687597763e91231b51d3d804da0b87b415eafc611189612815f6e" Oct 09 14:50:51 crc kubenswrapper[4902]: I1009 14:50:51.014006 4902 scope.go:117] "RemoveContainer" containerID="3c92344b7d9241fd1bd747690a712747aa6c548ba1f4b8118607a636810a1726" Oct 09 14:50:51 crc kubenswrapper[4902]: E1009 14:50:51.014284 4902 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gbt7s_openshift-machine-config-operator(6cfbac91-e798-4e5e-9f3c-f454ea6f457e)\"" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" podUID="6cfbac91-e798-4e5e-9f3c-f454ea6f457e" Oct 09 14:51:02 crc kubenswrapper[4902]: I1009 14:51:02.508102 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-68d546b9d8-m47st_bf00f4f5-2086-46a3-b460-f55dd00e2507/kube-rbac-proxy/0.log" Oct 09 14:51:02 crc kubenswrapper[4902]: I1009 14:51:02.649334 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-68d546b9d8-m47st_bf00f4f5-2086-46a3-b460-f55dd00e2507/controller/0.log" Oct 09 14:51:02 crc kubenswrapper[4902]: I1009 14:51:02.714568 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-sw9n6_693846cb-0606-4818-b246-e6940fa26802/cp-frr-files/0.log" Oct 09 14:51:02 crc kubenswrapper[4902]: I1009 14:51:02.947199 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-sw9n6_693846cb-0606-4818-b246-e6940fa26802/cp-frr-files/0.log" Oct 09 14:51:02 crc kubenswrapper[4902]: I1009 14:51:02.965771 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-sw9n6_693846cb-0606-4818-b246-e6940fa26802/cp-reloader/0.log" Oct 09 14:51:02 crc kubenswrapper[4902]: I1009 14:51:02.966798 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-sw9n6_693846cb-0606-4818-b246-e6940fa26802/cp-metrics/0.log" Oct 09 14:51:02 crc kubenswrapper[4902]: I1009 14:51:02.971465 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-sw9n6_693846cb-0606-4818-b246-e6940fa26802/cp-reloader/0.log" Oct 09 14:51:03 crc kubenswrapper[4902]: I1009 14:51:03.106591 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-sw9n6_693846cb-0606-4818-b246-e6940fa26802/cp-frr-files/0.log" Oct 09 14:51:03 crc kubenswrapper[4902]: I1009 14:51:03.156685 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-sw9n6_693846cb-0606-4818-b246-e6940fa26802/cp-metrics/0.log" Oct 09 14:51:03 crc kubenswrapper[4902]: I1009 14:51:03.157716 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-sw9n6_693846cb-0606-4818-b246-e6940fa26802/cp-reloader/0.log" Oct 09 14:51:03 crc kubenswrapper[4902]: I1009 14:51:03.170496 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-sw9n6_693846cb-0606-4818-b246-e6940fa26802/cp-metrics/0.log" Oct 09 14:51:03 crc kubenswrapper[4902]: I1009 14:51:03.330353 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-sw9n6_693846cb-0606-4818-b246-e6940fa26802/cp-reloader/0.log" Oct 09 14:51:03 crc kubenswrapper[4902]: I1009 14:51:03.342753 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-sw9n6_693846cb-0606-4818-b246-e6940fa26802/cp-frr-files/0.log" Oct 09 14:51:03 crc kubenswrapper[4902]: I1009 14:51:03.355684 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-sw9n6_693846cb-0606-4818-b246-e6940fa26802/cp-metrics/0.log" Oct 09 14:51:03 crc kubenswrapper[4902]: I1009 14:51:03.369731 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-sw9n6_693846cb-0606-4818-b246-e6940fa26802/controller/0.log" Oct 09 14:51:03 crc 
kubenswrapper[4902]: I1009 14:51:03.525212 4902 scope.go:117] "RemoveContainer" containerID="3c92344b7d9241fd1bd747690a712747aa6c548ba1f4b8118607a636810a1726" Oct 09 14:51:03 crc kubenswrapper[4902]: E1009 14:51:03.526848 4902 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gbt7s_openshift-machine-config-operator(6cfbac91-e798-4e5e-9f3c-f454ea6f457e)\"" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" podUID="6cfbac91-e798-4e5e-9f3c-f454ea6f457e" Oct 09 14:51:03 crc kubenswrapper[4902]: I1009 14:51:03.559256 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-sw9n6_693846cb-0606-4818-b246-e6940fa26802/frr-metrics/0.log" Oct 09 14:51:03 crc kubenswrapper[4902]: I1009 14:51:03.599945 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-sw9n6_693846cb-0606-4818-b246-e6940fa26802/kube-rbac-proxy/0.log" Oct 09 14:51:03 crc kubenswrapper[4902]: I1009 14:51:03.612596 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-sw9n6_693846cb-0606-4818-b246-e6940fa26802/kube-rbac-proxy-frr/0.log" Oct 09 14:51:03 crc kubenswrapper[4902]: I1009 14:51:03.796262 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-sw9n6_693846cb-0606-4818-b246-e6940fa26802/reloader/0.log" Oct 09 14:51:03 crc kubenswrapper[4902]: I1009 14:51:03.877199 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-64bf5d555-v4t7k_c788e9d2-cc9c-4dd8-b65d-f422358e0510/frr-k8s-webhook-server/0.log" Oct 09 14:51:04 crc kubenswrapper[4902]: I1009 14:51:04.045657 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-69f9c58987-qjtns_04adaa94-05f3-4989-b5fa-a057f556aa56/manager/0.log" Oct 09 14:51:04 crc kubenswrapper[4902]: I1009 14:51:04.316031 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-56b4cd547-vqwzj_17bd8034-bc7c-4eaa-9f47-74ca097940bd/webhook-server/0.log" Oct 09 14:51:04 crc kubenswrapper[4902]: I1009 14:51:04.405631 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-pbqrp_b400d066-a3bb-4b85-aaa1-7ddca808de2e/kube-rbac-proxy/0.log" Oct 09 14:51:05 crc kubenswrapper[4902]: I1009 14:51:05.083300 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-pbqrp_b400d066-a3bb-4b85-aaa1-7ddca808de2e/speaker/0.log" Oct 09 14:51:05 crc kubenswrapper[4902]: I1009 14:51:05.090507 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-sw9n6_693846cb-0606-4818-b246-e6940fa26802/frr/0.log" Oct 09 14:51:14 crc kubenswrapper[4902]: I1009 14:51:14.514113 4902 scope.go:117] "RemoveContainer" containerID="3c92344b7d9241fd1bd747690a712747aa6c548ba1f4b8118607a636810a1726" Oct 09 14:51:14 crc kubenswrapper[4902]: E1009 14:51:14.515530 4902 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gbt7s_openshift-machine-config-operator(6cfbac91-e798-4e5e-9f3c-f454ea6f457e)\"" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" podUID="6cfbac91-e798-4e5e-9f3c-f454ea6f457e" 
Oct 09 14:51:17 crc kubenswrapper[4902]: I1009 14:51:17.108461 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2qdm9l_13b857de-39a5-412b-b9a6-bd26a961d189/util/0.log" Oct 09 14:51:17 crc kubenswrapper[4902]: I1009 14:51:17.289078 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2qdm9l_13b857de-39a5-412b-b9a6-bd26a961d189/util/0.log" Oct 09 14:51:17 crc kubenswrapper[4902]: I1009 14:51:17.289263 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2qdm9l_13b857de-39a5-412b-b9a6-bd26a961d189/pull/0.log" Oct 09 14:51:17 crc kubenswrapper[4902]: I1009 14:51:17.312427 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2qdm9l_13b857de-39a5-412b-b9a6-bd26a961d189/pull/0.log" Oct 09 14:51:17 crc kubenswrapper[4902]: I1009 14:51:17.450252 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2qdm9l_13b857de-39a5-412b-b9a6-bd26a961d189/pull/0.log" Oct 09 14:51:17 crc kubenswrapper[4902]: I1009 14:51:17.455717 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2qdm9l_13b857de-39a5-412b-b9a6-bd26a961d189/util/0.log" Oct 09 14:51:17 crc kubenswrapper[4902]: I1009 14:51:17.533914 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2qdm9l_13b857de-39a5-412b-b9a6-bd26a961d189/extract/0.log" Oct 09 14:51:17 crc kubenswrapper[4902]: I1009 14:51:17.672446 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-hpxl7_0aa7377e-9f5a-411d-a20d-a134a5735eda/extract-utilities/0.log" Oct 09 14:51:17 crc kubenswrapper[4902]: I1009 14:51:17.799115 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-hpxl7_0aa7377e-9f5a-411d-a20d-a134a5735eda/extract-content/0.log" Oct 09 14:51:17 crc kubenswrapper[4902]: I1009 14:51:17.822444 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-hpxl7_0aa7377e-9f5a-411d-a20d-a134a5735eda/extract-utilities/0.log" Oct 09 14:51:17 crc kubenswrapper[4902]: I1009 14:51:17.849100 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-hpxl7_0aa7377e-9f5a-411d-a20d-a134a5735eda/extract-content/0.log" Oct 09 14:51:18 crc kubenswrapper[4902]: I1009 14:51:18.019773 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-hpxl7_0aa7377e-9f5a-411d-a20d-a134a5735eda/extract-content/0.log" Oct 09 14:51:18 crc kubenswrapper[4902]: I1009 14:51:18.043708 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-hpxl7_0aa7377e-9f5a-411d-a20d-a134a5735eda/extract-utilities/0.log" Oct 09 14:51:18 crc kubenswrapper[4902]: I1009 14:51:18.256693 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-j2rjw_13822386-0dae-4414-b5e3-f2bc758f6948/extract-utilities/0.log" Oct 09 14:51:18 crc kubenswrapper[4902]: I1009 14:51:18.511406 
4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-hpxl7_0aa7377e-9f5a-411d-a20d-a134a5735eda/registry-server/0.log" Oct 09 14:51:18 crc kubenswrapper[4902]: I1009 14:51:18.525046 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-j2rjw_13822386-0dae-4414-b5e3-f2bc758f6948/extract-utilities/0.log" Oct 09 14:51:18 crc kubenswrapper[4902]: I1009 14:51:18.525251 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-j2rjw_13822386-0dae-4414-b5e3-f2bc758f6948/extract-content/0.log" Oct 09 14:51:18 crc kubenswrapper[4902]: I1009 14:51:18.547441 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-j2rjw_13822386-0dae-4414-b5e3-f2bc758f6948/extract-content/0.log" Oct 09 14:51:18 crc kubenswrapper[4902]: I1009 14:51:18.716490 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-j2rjw_13822386-0dae-4414-b5e3-f2bc758f6948/extract-content/0.log" Oct 09 14:51:18 crc kubenswrapper[4902]: I1009 14:51:18.734681 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-j2rjw_13822386-0dae-4414-b5e3-f2bc758f6948/extract-utilities/0.log" Oct 09 14:51:18 crc kubenswrapper[4902]: I1009 14:51:18.958196 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cj5c5n_3b4fc06f-f461-4486-83e7-4d153cc9ef10/util/0.log" Oct 09 14:51:19 crc kubenswrapper[4902]: I1009 14:51:19.261701 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cj5c5n_3b4fc06f-f461-4486-83e7-4d153cc9ef10/pull/0.log" Oct 09 14:51:19 crc kubenswrapper[4902]: I1009 14:51:19.266211 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cj5c5n_3b4fc06f-f461-4486-83e7-4d153cc9ef10/util/0.log" Oct 09 14:51:19 crc kubenswrapper[4902]: I1009 14:51:19.266790 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cj5c5n_3b4fc06f-f461-4486-83e7-4d153cc9ef10/pull/0.log" Oct 09 14:51:19 crc kubenswrapper[4902]: I1009 14:51:19.428264 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-j2rjw_13822386-0dae-4414-b5e3-f2bc758f6948/registry-server/0.log" Oct 09 14:51:19 crc kubenswrapper[4902]: I1009 14:51:19.453455 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cj5c5n_3b4fc06f-f461-4486-83e7-4d153cc9ef10/pull/0.log" Oct 09 14:51:19 crc kubenswrapper[4902]: I1009 14:51:19.483348 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cj5c5n_3b4fc06f-f461-4486-83e7-4d153cc9ef10/util/0.log" Oct 09 14:51:19 crc kubenswrapper[4902]: I1009 14:51:19.494772 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cj5c5n_3b4fc06f-f461-4486-83e7-4d153cc9ef10/extract/0.log" Oct 09 14:51:19 crc kubenswrapper[4902]: I1009 14:51:19.681792 4902 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-mq2n8_3f34ca09-a125-4ad0-a8ea-fe6c0791bb5e/extract-utilities/0.log" Oct 09 14:51:19 crc kubenswrapper[4902]: I1009 14:51:19.685817 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-bb67d_3b0cc6d4-31a9-4f2f-90ab-2cb6676e61b6/marketplace-operator/0.log" Oct 09 14:51:19 crc kubenswrapper[4902]: I1009 14:51:19.912757 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-mq2n8_3f34ca09-a125-4ad0-a8ea-fe6c0791bb5e/extract-utilities/0.log" Oct 09 14:51:19 crc kubenswrapper[4902]: I1009 14:51:19.918541 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-mq2n8_3f34ca09-a125-4ad0-a8ea-fe6c0791bb5e/extract-content/0.log" Oct 09 14:51:19 crc kubenswrapper[4902]: I1009 14:51:19.952378 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-mq2n8_3f34ca09-a125-4ad0-a8ea-fe6c0791bb5e/extract-content/0.log" Oct 09 14:51:20 crc kubenswrapper[4902]: I1009 14:51:20.110240 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-mq2n8_3f34ca09-a125-4ad0-a8ea-fe6c0791bb5e/extract-utilities/0.log" Oct 09 14:51:20 crc kubenswrapper[4902]: I1009 14:51:20.116902 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-mq2n8_3f34ca09-a125-4ad0-a8ea-fe6c0791bb5e/extract-content/0.log" Oct 09 14:51:20 crc kubenswrapper[4902]: I1009 14:51:20.244922 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-mq2n8_3f34ca09-a125-4ad0-a8ea-fe6c0791bb5e/registry-server/0.log" Oct 09 14:51:20 crc kubenswrapper[4902]: I1009 14:51:20.343637 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-z5k86_1d982b85-de80-4c77-82fc-8c4622cbd203/extract-utilities/0.log" Oct 09 14:51:20 crc kubenswrapper[4902]: I1009 14:51:20.467089 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-z5k86_1d982b85-de80-4c77-82fc-8c4622cbd203/extract-content/0.log" Oct 09 14:51:20 crc kubenswrapper[4902]: I1009 14:51:20.481860 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-z5k86_1d982b85-de80-4c77-82fc-8c4622cbd203/extract-utilities/0.log" Oct 09 14:51:20 crc kubenswrapper[4902]: I1009 14:51:20.501670 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-z5k86_1d982b85-de80-4c77-82fc-8c4622cbd203/extract-content/0.log" Oct 09 14:51:20 crc kubenswrapper[4902]: I1009 14:51:20.630577 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-z5k86_1d982b85-de80-4c77-82fc-8c4622cbd203/extract-utilities/0.log" Oct 09 14:51:20 crc kubenswrapper[4902]: I1009 14:51:20.676135 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-z5k86_1d982b85-de80-4c77-82fc-8c4622cbd203/extract-content/0.log" Oct 09 14:51:21 crc kubenswrapper[4902]: I1009 14:51:21.183860 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-z5k86_1d982b85-de80-4c77-82fc-8c4622cbd203/registry-server/0.log" Oct 09 14:51:29 crc kubenswrapper[4902]: I1009 14:51:29.514154 4902 scope.go:117] "RemoveContainer" 
containerID="3c92344b7d9241fd1bd747690a712747aa6c548ba1f4b8118607a636810a1726" Oct 09 14:51:29 crc kubenswrapper[4902]: E1009 14:51:29.515357 4902 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gbt7s_openshift-machine-config-operator(6cfbac91-e798-4e5e-9f3c-f454ea6f457e)\"" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" podUID="6cfbac91-e798-4e5e-9f3c-f454ea6f457e" Oct 09 14:51:42 crc kubenswrapper[4902]: I1009 14:51:42.186010 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-mbwcp" Oct 09 14:51:43 crc kubenswrapper[4902]: I1009 14:51:43.513070 4902 scope.go:117] "RemoveContainer" containerID="3c92344b7d9241fd1bd747690a712747aa6c548ba1f4b8118607a636810a1726" Oct 09 14:51:43 crc kubenswrapper[4902]: E1009 14:51:43.513367 4902 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gbt7s_openshift-machine-config-operator(6cfbac91-e798-4e5e-9f3c-f454ea6f457e)\"" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" podUID="6cfbac91-e798-4e5e-9f3c-f454ea6f457e" Oct 09 14:51:55 crc kubenswrapper[4902]: I1009 14:51:55.513858 4902 scope.go:117] "RemoveContainer" containerID="3c92344b7d9241fd1bd747690a712747aa6c548ba1f4b8118607a636810a1726" Oct 09 14:51:55 crc kubenswrapper[4902]: E1009 14:51:55.514724 4902 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gbt7s_openshift-machine-config-operator(6cfbac91-e798-4e5e-9f3c-f454ea6f457e)\"" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" podUID="6cfbac91-e798-4e5e-9f3c-f454ea6f457e" Oct 09 14:52:07 crc kubenswrapper[4902]: I1009 14:52:07.513030 4902 scope.go:117] "RemoveContainer" containerID="3c92344b7d9241fd1bd747690a712747aa6c548ba1f4b8118607a636810a1726" Oct 09 14:52:07 crc kubenswrapper[4902]: E1009 14:52:07.513832 4902 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gbt7s_openshift-machine-config-operator(6cfbac91-e798-4e5e-9f3c-f454ea6f457e)\"" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" podUID="6cfbac91-e798-4e5e-9f3c-f454ea6f457e" Oct 09 14:52:20 crc kubenswrapper[4902]: I1009 14:52:20.513387 4902 scope.go:117] "RemoveContainer" containerID="3c92344b7d9241fd1bd747690a712747aa6c548ba1f4b8118607a636810a1726" Oct 09 14:52:20 crc kubenswrapper[4902]: E1009 14:52:20.514174 4902 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gbt7s_openshift-machine-config-operator(6cfbac91-e798-4e5e-9f3c-f454ea6f457e)\"" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" podUID="6cfbac91-e798-4e5e-9f3c-f454ea6f457e" Oct 09 14:52:35 crc kubenswrapper[4902]: I1009 14:52:35.513650 4902 scope.go:117] "RemoveContainer" 
containerID="3c92344b7d9241fd1bd747690a712747aa6c548ba1f4b8118607a636810a1726" Oct 09 14:52:35 crc kubenswrapper[4902]: E1009 14:52:35.514604 4902 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gbt7s_openshift-machine-config-operator(6cfbac91-e798-4e5e-9f3c-f454ea6f457e)\"" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" podUID="6cfbac91-e798-4e5e-9f3c-f454ea6f457e" Oct 09 14:52:46 crc kubenswrapper[4902]: I1009 14:52:46.512915 4902 scope.go:117] "RemoveContainer" containerID="3c92344b7d9241fd1bd747690a712747aa6c548ba1f4b8118607a636810a1726" Oct 09 14:52:46 crc kubenswrapper[4902]: E1009 14:52:46.513680 4902 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gbt7s_openshift-machine-config-operator(6cfbac91-e798-4e5e-9f3c-f454ea6f457e)\"" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" podUID="6cfbac91-e798-4e5e-9f3c-f454ea6f457e" Oct 09 14:52:57 crc kubenswrapper[4902]: I1009 14:52:57.513769 4902 scope.go:117] "RemoveContainer" containerID="3c92344b7d9241fd1bd747690a712747aa6c548ba1f4b8118607a636810a1726" Oct 09 14:52:57 crc kubenswrapper[4902]: E1009 14:52:57.514635 4902 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gbt7s_openshift-machine-config-operator(6cfbac91-e798-4e5e-9f3c-f454ea6f457e)\"" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" podUID="6cfbac91-e798-4e5e-9f3c-f454ea6f457e" Oct 09 14:53:06 crc kubenswrapper[4902]: I1009 14:53:06.245047 4902 generic.go:334] "Generic (PLEG): container finished" podID="d3672176-5fc4-4ea0-a396-1085fc7bba24" containerID="9a2e452e97ea3a3ada247e6646556cc8b52f24c04d9becac91644c41d5d5732e" exitCode=0 Oct 09 14:53:06 crc kubenswrapper[4902]: I1009 14:53:06.245260 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-c6hql/must-gather-8r5mb" event={"ID":"d3672176-5fc4-4ea0-a396-1085fc7bba24","Type":"ContainerDied","Data":"9a2e452e97ea3a3ada247e6646556cc8b52f24c04d9becac91644c41d5d5732e"} Oct 09 14:53:06 crc kubenswrapper[4902]: I1009 14:53:06.248529 4902 scope.go:117] "RemoveContainer" containerID="9a2e452e97ea3a3ada247e6646556cc8b52f24c04d9becac91644c41d5d5732e" Oct 09 14:53:06 crc kubenswrapper[4902]: I1009 14:53:06.500674 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-c6hql_must-gather-8r5mb_d3672176-5fc4-4ea0-a396-1085fc7bba24/gather/0.log" Oct 09 14:53:11 crc kubenswrapper[4902]: I1009 14:53:11.513026 4902 scope.go:117] "RemoveContainer" containerID="3c92344b7d9241fd1bd747690a712747aa6c548ba1f4b8118607a636810a1726" Oct 09 14:53:11 crc kubenswrapper[4902]: E1009 14:53:11.515306 4902 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gbt7s_openshift-machine-config-operator(6cfbac91-e798-4e5e-9f3c-f454ea6f457e)\"" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" 
podUID="6cfbac91-e798-4e5e-9f3c-f454ea6f457e" Oct 09 14:53:13 crc kubenswrapper[4902]: I1009 14:53:13.967842 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-c6hql/must-gather-8r5mb"] Oct 09 14:53:13 crc kubenswrapper[4902]: I1009 14:53:13.968577 4902 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-c6hql/must-gather-8r5mb" podUID="d3672176-5fc4-4ea0-a396-1085fc7bba24" containerName="copy" containerID="cri-o://11f21813cee8040c66935837033b381ec9086a574d51b8aef2dcee49900f945a" gracePeriod=2 Oct 09 14:53:13 crc kubenswrapper[4902]: I1009 14:53:13.978674 4902 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-c6hql/must-gather-8r5mb"] Oct 09 14:53:14 crc kubenswrapper[4902]: I1009 14:53:14.317831 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-c6hql_must-gather-8r5mb_d3672176-5fc4-4ea0-a396-1085fc7bba24/copy/0.log" Oct 09 14:53:14 crc kubenswrapper[4902]: I1009 14:53:14.318937 4902 generic.go:334] "Generic (PLEG): container finished" podID="d3672176-5fc4-4ea0-a396-1085fc7bba24" containerID="11f21813cee8040c66935837033b381ec9086a574d51b8aef2dcee49900f945a" exitCode=143 Oct 09 14:53:14 crc kubenswrapper[4902]: I1009 14:53:14.457071 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-c6hql_must-gather-8r5mb_d3672176-5fc4-4ea0-a396-1085fc7bba24/copy/0.log" Oct 09 14:53:14 crc kubenswrapper[4902]: I1009 14:53:14.457419 4902 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-c6hql/must-gather-8r5mb" Oct 09 14:53:14 crc kubenswrapper[4902]: I1009 14:53:14.483240 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rgx2l\" (UniqueName: \"kubernetes.io/projected/d3672176-5fc4-4ea0-a396-1085fc7bba24-kube-api-access-rgx2l\") pod \"d3672176-5fc4-4ea0-a396-1085fc7bba24\" (UID: \"d3672176-5fc4-4ea0-a396-1085fc7bba24\") " Oct 09 14:53:14 crc kubenswrapper[4902]: I1009 14:53:14.483487 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/d3672176-5fc4-4ea0-a396-1085fc7bba24-must-gather-output\") pod \"d3672176-5fc4-4ea0-a396-1085fc7bba24\" (UID: \"d3672176-5fc4-4ea0-a396-1085fc7bba24\") " Oct 09 14:53:14 crc kubenswrapper[4902]: I1009 14:53:14.491641 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3672176-5fc4-4ea0-a396-1085fc7bba24-kube-api-access-rgx2l" (OuterVolumeSpecName: "kube-api-access-rgx2l") pod "d3672176-5fc4-4ea0-a396-1085fc7bba24" (UID: "d3672176-5fc4-4ea0-a396-1085fc7bba24"). InnerVolumeSpecName "kube-api-access-rgx2l". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 14:53:14 crc kubenswrapper[4902]: I1009 14:53:14.586319 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rgx2l\" (UniqueName: \"kubernetes.io/projected/d3672176-5fc4-4ea0-a396-1085fc7bba24-kube-api-access-rgx2l\") on node \"crc\" DevicePath \"\"" Oct 09 14:53:14 crc kubenswrapper[4902]: I1009 14:53:14.617289 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d3672176-5fc4-4ea0-a396-1085fc7bba24-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "d3672176-5fc4-4ea0-a396-1085fc7bba24" (UID: "d3672176-5fc4-4ea0-a396-1085fc7bba24"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 14:53:14 crc kubenswrapper[4902]: I1009 14:53:14.688708 4902 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/d3672176-5fc4-4ea0-a396-1085fc7bba24-must-gather-output\") on node \"crc\" DevicePath \"\"" Oct 09 14:53:15 crc kubenswrapper[4902]: I1009 14:53:15.331691 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-c6hql_must-gather-8r5mb_d3672176-5fc4-4ea0-a396-1085fc7bba24/copy/0.log" Oct 09 14:53:15 crc kubenswrapper[4902]: I1009 14:53:15.332541 4902 scope.go:117] "RemoveContainer" containerID="11f21813cee8040c66935837033b381ec9086a574d51b8aef2dcee49900f945a" Oct 09 14:53:15 crc kubenswrapper[4902]: I1009 14:53:15.332605 4902 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-c6hql/must-gather-8r5mb" Oct 09 14:53:15 crc kubenswrapper[4902]: I1009 14:53:15.355383 4902 scope.go:117] "RemoveContainer" containerID="9a2e452e97ea3a3ada247e6646556cc8b52f24c04d9becac91644c41d5d5732e" Oct 09 14:53:15 crc kubenswrapper[4902]: I1009 14:53:15.527094 4902 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d3672176-5fc4-4ea0-a396-1085fc7bba24" path="/var/lib/kubelet/pods/d3672176-5fc4-4ea0-a396-1085fc7bba24/volumes" Oct 09 14:53:25 crc kubenswrapper[4902]: I1009 14:53:25.391264 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-z4j5r"] Oct 09 14:53:25 crc kubenswrapper[4902]: E1009 14:53:25.392528 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3672176-5fc4-4ea0-a396-1085fc7bba24" containerName="copy" Oct 09 14:53:25 crc kubenswrapper[4902]: I1009 14:53:25.392549 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3672176-5fc4-4ea0-a396-1085fc7bba24" containerName="copy" Oct 09 14:53:25 crc kubenswrapper[4902]: E1009 14:53:25.392595 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3672176-5fc4-4ea0-a396-1085fc7bba24" containerName="gather" Oct 09 14:53:25 crc kubenswrapper[4902]: I1009 14:53:25.392604 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3672176-5fc4-4ea0-a396-1085fc7bba24" containerName="gather" Oct 09 14:53:25 crc kubenswrapper[4902]: E1009 14:53:25.392644 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bdee3074-1302-4111-8c03-c22f94667569" containerName="container-00" Oct 09 14:53:25 crc kubenswrapper[4902]: I1009 14:53:25.392675 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="bdee3074-1302-4111-8c03-c22f94667569" containerName="container-00" Oct 09 14:53:25 crc kubenswrapper[4902]: I1009 14:53:25.392990 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="bdee3074-1302-4111-8c03-c22f94667569" containerName="container-00" Oct 09 14:53:25 crc kubenswrapper[4902]: I1009 14:53:25.393011 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3672176-5fc4-4ea0-a396-1085fc7bba24" containerName="gather" Oct 09 14:53:25 crc kubenswrapper[4902]: I1009 14:53:25.393046 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3672176-5fc4-4ea0-a396-1085fc7bba24" containerName="copy" Oct 09 14:53:25 crc kubenswrapper[4902]: I1009 14:53:25.395362 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-z4j5r" Oct 09 14:53:25 crc kubenswrapper[4902]: I1009 14:53:25.418913 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-z4j5r"] Oct 09 14:53:25 crc kubenswrapper[4902]: I1009 14:53:25.498203 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5786367d-56eb-44ef-8353-eb12ed7693b4-utilities\") pod \"certified-operators-z4j5r\" (UID: \"5786367d-56eb-44ef-8353-eb12ed7693b4\") " pod="openshift-marketplace/certified-operators-z4j5r" Oct 09 14:53:25 crc kubenswrapper[4902]: I1009 14:53:25.498244 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5786367d-56eb-44ef-8353-eb12ed7693b4-catalog-content\") pod \"certified-operators-z4j5r\" (UID: \"5786367d-56eb-44ef-8353-eb12ed7693b4\") " pod="openshift-marketplace/certified-operators-z4j5r" Oct 09 14:53:25 crc kubenswrapper[4902]: I1009 14:53:25.498321 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-82gng\" (UniqueName: \"kubernetes.io/projected/5786367d-56eb-44ef-8353-eb12ed7693b4-kube-api-access-82gng\") pod \"certified-operators-z4j5r\" (UID: \"5786367d-56eb-44ef-8353-eb12ed7693b4\") " pod="openshift-marketplace/certified-operators-z4j5r" Oct 09 14:53:25 crc kubenswrapper[4902]: I1009 14:53:25.599906 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5786367d-56eb-44ef-8353-eb12ed7693b4-utilities\") pod \"certified-operators-z4j5r\" (UID: \"5786367d-56eb-44ef-8353-eb12ed7693b4\") " pod="openshift-marketplace/certified-operators-z4j5r" Oct 09 14:53:25 crc kubenswrapper[4902]: I1009 14:53:25.599952 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5786367d-56eb-44ef-8353-eb12ed7693b4-catalog-content\") pod \"certified-operators-z4j5r\" (UID: \"5786367d-56eb-44ef-8353-eb12ed7693b4\") " pod="openshift-marketplace/certified-operators-z4j5r" Oct 09 14:53:25 crc kubenswrapper[4902]: I1009 14:53:25.600089 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-82gng\" (UniqueName: \"kubernetes.io/projected/5786367d-56eb-44ef-8353-eb12ed7693b4-kube-api-access-82gng\") pod \"certified-operators-z4j5r\" (UID: \"5786367d-56eb-44ef-8353-eb12ed7693b4\") " pod="openshift-marketplace/certified-operators-z4j5r" Oct 09 14:53:25 crc kubenswrapper[4902]: I1009 14:53:25.601194 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5786367d-56eb-44ef-8353-eb12ed7693b4-utilities\") pod \"certified-operators-z4j5r\" (UID: \"5786367d-56eb-44ef-8353-eb12ed7693b4\") " pod="openshift-marketplace/certified-operators-z4j5r" Oct 09 14:53:25 crc kubenswrapper[4902]: I1009 14:53:25.601382 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5786367d-56eb-44ef-8353-eb12ed7693b4-catalog-content\") pod \"certified-operators-z4j5r\" (UID: \"5786367d-56eb-44ef-8353-eb12ed7693b4\") " pod="openshift-marketplace/certified-operators-z4j5r" Oct 09 14:53:25 crc kubenswrapper[4902]: I1009 14:53:25.621258 4902 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-82gng\" (UniqueName: \"kubernetes.io/projected/5786367d-56eb-44ef-8353-eb12ed7693b4-kube-api-access-82gng\") pod \"certified-operators-z4j5r\" (UID: \"5786367d-56eb-44ef-8353-eb12ed7693b4\") " pod="openshift-marketplace/certified-operators-z4j5r" Oct 09 14:53:25 crc kubenswrapper[4902]: I1009 14:53:25.718983 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-z4j5r" Oct 09 14:53:26 crc kubenswrapper[4902]: I1009 14:53:26.268138 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-z4j5r"] Oct 09 14:53:26 crc kubenswrapper[4902]: I1009 14:53:26.425355 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z4j5r" event={"ID":"5786367d-56eb-44ef-8353-eb12ed7693b4","Type":"ContainerStarted","Data":"6c2deeaf3ce982b3bbb715de0eea1f9fde3c7ef062764ba7378192ca819a587b"} Oct 09 14:53:26 crc kubenswrapper[4902]: I1009 14:53:26.512945 4902 scope.go:117] "RemoveContainer" containerID="3c92344b7d9241fd1bd747690a712747aa6c548ba1f4b8118607a636810a1726" Oct 09 14:53:26 crc kubenswrapper[4902]: E1009 14:53:26.513359 4902 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gbt7s_openshift-machine-config-operator(6cfbac91-e798-4e5e-9f3c-f454ea6f457e)\"" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" podUID="6cfbac91-e798-4e5e-9f3c-f454ea6f457e" Oct 09 14:53:27 crc kubenswrapper[4902]: I1009 14:53:27.435077 4902 generic.go:334] "Generic (PLEG): container finished" podID="5786367d-56eb-44ef-8353-eb12ed7693b4" containerID="35772fb44f17b34aa04c80074058cd71b138e99201be3554852b7dc0a1db3ec1" exitCode=0 Oct 09 14:53:27 crc kubenswrapper[4902]: I1009 14:53:27.435155 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z4j5r" event={"ID":"5786367d-56eb-44ef-8353-eb12ed7693b4","Type":"ContainerDied","Data":"35772fb44f17b34aa04c80074058cd71b138e99201be3554852b7dc0a1db3ec1"} Oct 09 14:53:27 crc kubenswrapper[4902]: I1009 14:53:27.445160 4902 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 09 14:53:34 crc kubenswrapper[4902]: I1009 14:53:34.520704 4902 generic.go:334] "Generic (PLEG): container finished" podID="5786367d-56eb-44ef-8353-eb12ed7693b4" containerID="adeea1433bbe64796acc244030e3939fd36b467e7eab92460cf9c1d716ae71ad" exitCode=0 Oct 09 14:53:34 crc kubenswrapper[4902]: I1009 14:53:34.520776 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z4j5r" event={"ID":"5786367d-56eb-44ef-8353-eb12ed7693b4","Type":"ContainerDied","Data":"adeea1433bbe64796acc244030e3939fd36b467e7eab92460cf9c1d716ae71ad"} Oct 09 14:53:35 crc kubenswrapper[4902]: I1009 14:53:35.531798 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z4j5r" event={"ID":"5786367d-56eb-44ef-8353-eb12ed7693b4","Type":"ContainerStarted","Data":"bc769006fc7120df401fcc08332e5e66aa30128c0e40f89791e3c91949e9e1a3"} Oct 09 14:53:36 crc kubenswrapper[4902]: I1009 14:53:36.564143 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-z4j5r" podStartSLOduration=4.115448191 
podStartE2EDuration="11.564121166s" podCreationTimestamp="2025-10-09 14:53:25 +0000 UTC" firstStartedPulling="2025-10-09 14:53:27.444882222 +0000 UTC m=+3754.642741276" lastFinishedPulling="2025-10-09 14:53:34.893555167 +0000 UTC m=+3762.091414251" observedRunningTime="2025-10-09 14:53:36.563085816 +0000 UTC m=+3763.760944900" watchObservedRunningTime="2025-10-09 14:53:36.564121166 +0000 UTC m=+3763.761980230" Oct 09 14:53:39 crc kubenswrapper[4902]: I1009 14:53:39.518672 4902 scope.go:117] "RemoveContainer" containerID="3c92344b7d9241fd1bd747690a712747aa6c548ba1f4b8118607a636810a1726" Oct 09 14:53:39 crc kubenswrapper[4902]: E1009 14:53:39.519855 4902 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gbt7s_openshift-machine-config-operator(6cfbac91-e798-4e5e-9f3c-f454ea6f457e)\"" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" podUID="6cfbac91-e798-4e5e-9f3c-f454ea6f457e" Oct 09 14:53:45 crc kubenswrapper[4902]: I1009 14:53:45.719257 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-z4j5r" Oct 09 14:53:45 crc kubenswrapper[4902]: I1009 14:53:45.719840 4902 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-z4j5r" Oct 09 14:53:45 crc kubenswrapper[4902]: I1009 14:53:45.764181 4902 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-z4j5r" Oct 09 14:53:46 crc kubenswrapper[4902]: I1009 14:53:46.736943 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-z4j5r" Oct 09 14:53:46 crc kubenswrapper[4902]: I1009 14:53:46.783817 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-z4j5r"] Oct 09 14:53:48 crc kubenswrapper[4902]: I1009 14:53:48.710907 4902 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-z4j5r" podUID="5786367d-56eb-44ef-8353-eb12ed7693b4" containerName="registry-server" containerID="cri-o://bc769006fc7120df401fcc08332e5e66aa30128c0e40f89791e3c91949e9e1a3" gracePeriod=2 Oct 09 14:53:49 crc kubenswrapper[4902]: I1009 14:53:49.123014 4902 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-z4j5r" Oct 09 14:53:49 crc kubenswrapper[4902]: I1009 14:53:49.307480 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5786367d-56eb-44ef-8353-eb12ed7693b4-catalog-content\") pod \"5786367d-56eb-44ef-8353-eb12ed7693b4\" (UID: \"5786367d-56eb-44ef-8353-eb12ed7693b4\") " Oct 09 14:53:49 crc kubenswrapper[4902]: I1009 14:53:49.307700 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5786367d-56eb-44ef-8353-eb12ed7693b4-utilities\") pod \"5786367d-56eb-44ef-8353-eb12ed7693b4\" (UID: \"5786367d-56eb-44ef-8353-eb12ed7693b4\") " Oct 09 14:53:49 crc kubenswrapper[4902]: I1009 14:53:49.307734 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-82gng\" (UniqueName: \"kubernetes.io/projected/5786367d-56eb-44ef-8353-eb12ed7693b4-kube-api-access-82gng\") pod \"5786367d-56eb-44ef-8353-eb12ed7693b4\" (UID: \"5786367d-56eb-44ef-8353-eb12ed7693b4\") " Oct 09 14:53:49 crc kubenswrapper[4902]: I1009 14:53:49.308852 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5786367d-56eb-44ef-8353-eb12ed7693b4-utilities" (OuterVolumeSpecName: "utilities") pod "5786367d-56eb-44ef-8353-eb12ed7693b4" (UID: "5786367d-56eb-44ef-8353-eb12ed7693b4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 14:53:49 crc kubenswrapper[4902]: I1009 14:53:49.313639 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5786367d-56eb-44ef-8353-eb12ed7693b4-kube-api-access-82gng" (OuterVolumeSpecName: "kube-api-access-82gng") pod "5786367d-56eb-44ef-8353-eb12ed7693b4" (UID: "5786367d-56eb-44ef-8353-eb12ed7693b4"). InnerVolumeSpecName "kube-api-access-82gng". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 14:53:49 crc kubenswrapper[4902]: I1009 14:53:49.364057 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5786367d-56eb-44ef-8353-eb12ed7693b4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5786367d-56eb-44ef-8353-eb12ed7693b4" (UID: "5786367d-56eb-44ef-8353-eb12ed7693b4"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 14:53:49 crc kubenswrapper[4902]: I1009 14:53:49.409830 4902 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5786367d-56eb-44ef-8353-eb12ed7693b4-utilities\") on node \"crc\" DevicePath \"\"" Oct 09 14:53:49 crc kubenswrapper[4902]: I1009 14:53:49.409867 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-82gng\" (UniqueName: \"kubernetes.io/projected/5786367d-56eb-44ef-8353-eb12ed7693b4-kube-api-access-82gng\") on node \"crc\" DevicePath \"\"" Oct 09 14:53:49 crc kubenswrapper[4902]: I1009 14:53:49.409881 4902 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5786367d-56eb-44ef-8353-eb12ed7693b4-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 09 14:53:49 crc kubenswrapper[4902]: I1009 14:53:49.721694 4902 generic.go:334] "Generic (PLEG): container finished" podID="5786367d-56eb-44ef-8353-eb12ed7693b4" containerID="bc769006fc7120df401fcc08332e5e66aa30128c0e40f89791e3c91949e9e1a3" exitCode=0 Oct 09 14:53:49 crc kubenswrapper[4902]: I1009 14:53:49.721742 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z4j5r" event={"ID":"5786367d-56eb-44ef-8353-eb12ed7693b4","Type":"ContainerDied","Data":"bc769006fc7120df401fcc08332e5e66aa30128c0e40f89791e3c91949e9e1a3"} Oct 09 14:53:49 crc kubenswrapper[4902]: I1009 14:53:49.721774 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z4j5r" event={"ID":"5786367d-56eb-44ef-8353-eb12ed7693b4","Type":"ContainerDied","Data":"6c2deeaf3ce982b3bbb715de0eea1f9fde3c7ef062764ba7378192ca819a587b"} Oct 09 14:53:49 crc kubenswrapper[4902]: I1009 14:53:49.721792 4902 scope.go:117] "RemoveContainer" containerID="bc769006fc7120df401fcc08332e5e66aa30128c0e40f89791e3c91949e9e1a3" Oct 09 14:53:49 crc kubenswrapper[4902]: I1009 14:53:49.722577 4902 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-z4j5r" Oct 09 14:53:49 crc kubenswrapper[4902]: I1009 14:53:49.748656 4902 scope.go:117] "RemoveContainer" containerID="adeea1433bbe64796acc244030e3939fd36b467e7eab92460cf9c1d716ae71ad" Oct 09 14:53:49 crc kubenswrapper[4902]: I1009 14:53:49.752524 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-z4j5r"] Oct 09 14:53:49 crc kubenswrapper[4902]: I1009 14:53:49.761821 4902 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-z4j5r"] Oct 09 14:53:49 crc kubenswrapper[4902]: I1009 14:53:49.768448 4902 scope.go:117] "RemoveContainer" containerID="35772fb44f17b34aa04c80074058cd71b138e99201be3554852b7dc0a1db3ec1" Oct 09 14:53:49 crc kubenswrapper[4902]: I1009 14:53:49.813974 4902 scope.go:117] "RemoveContainer" containerID="bc769006fc7120df401fcc08332e5e66aa30128c0e40f89791e3c91949e9e1a3" Oct 09 14:53:49 crc kubenswrapper[4902]: E1009 14:53:49.814489 4902 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bc769006fc7120df401fcc08332e5e66aa30128c0e40f89791e3c91949e9e1a3\": container with ID starting with bc769006fc7120df401fcc08332e5e66aa30128c0e40f89791e3c91949e9e1a3 not found: ID does not exist" containerID="bc769006fc7120df401fcc08332e5e66aa30128c0e40f89791e3c91949e9e1a3" Oct 09 14:53:49 crc kubenswrapper[4902]: I1009 14:53:49.814548 4902 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc769006fc7120df401fcc08332e5e66aa30128c0e40f89791e3c91949e9e1a3"} err="failed to get container status \"bc769006fc7120df401fcc08332e5e66aa30128c0e40f89791e3c91949e9e1a3\": rpc error: code = NotFound desc = could not find container \"bc769006fc7120df401fcc08332e5e66aa30128c0e40f89791e3c91949e9e1a3\": container with ID starting with bc769006fc7120df401fcc08332e5e66aa30128c0e40f89791e3c91949e9e1a3 not found: ID does not exist" Oct 09 14:53:49 crc kubenswrapper[4902]: I1009 14:53:49.814586 4902 scope.go:117] "RemoveContainer" containerID="adeea1433bbe64796acc244030e3939fd36b467e7eab92460cf9c1d716ae71ad" Oct 09 14:53:49 crc kubenswrapper[4902]: E1009 14:53:49.814926 4902 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"adeea1433bbe64796acc244030e3939fd36b467e7eab92460cf9c1d716ae71ad\": container with ID starting with adeea1433bbe64796acc244030e3939fd36b467e7eab92460cf9c1d716ae71ad not found: ID does not exist" containerID="adeea1433bbe64796acc244030e3939fd36b467e7eab92460cf9c1d716ae71ad" Oct 09 14:53:49 crc kubenswrapper[4902]: I1009 14:53:49.814969 4902 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"adeea1433bbe64796acc244030e3939fd36b467e7eab92460cf9c1d716ae71ad"} err="failed to get container status \"adeea1433bbe64796acc244030e3939fd36b467e7eab92460cf9c1d716ae71ad\": rpc error: code = NotFound desc = could not find container \"adeea1433bbe64796acc244030e3939fd36b467e7eab92460cf9c1d716ae71ad\": container with ID starting with adeea1433bbe64796acc244030e3939fd36b467e7eab92460cf9c1d716ae71ad not found: ID does not exist" Oct 09 14:53:49 crc kubenswrapper[4902]: I1009 14:53:49.815006 4902 scope.go:117] "RemoveContainer" containerID="35772fb44f17b34aa04c80074058cd71b138e99201be3554852b7dc0a1db3ec1" Oct 09 14:53:49 crc kubenswrapper[4902]: E1009 14:53:49.815310 4902 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"35772fb44f17b34aa04c80074058cd71b138e99201be3554852b7dc0a1db3ec1\": container with ID starting with 35772fb44f17b34aa04c80074058cd71b138e99201be3554852b7dc0a1db3ec1 not found: ID does not exist" containerID="35772fb44f17b34aa04c80074058cd71b138e99201be3554852b7dc0a1db3ec1" Oct 09 14:53:49 crc kubenswrapper[4902]: I1009 14:53:49.815333 4902 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"35772fb44f17b34aa04c80074058cd71b138e99201be3554852b7dc0a1db3ec1"} err="failed to get container status \"35772fb44f17b34aa04c80074058cd71b138e99201be3554852b7dc0a1db3ec1\": rpc error: code = NotFound desc = could not find container \"35772fb44f17b34aa04c80074058cd71b138e99201be3554852b7dc0a1db3ec1\": container with ID starting with 35772fb44f17b34aa04c80074058cd71b138e99201be3554852b7dc0a1db3ec1 not found: ID does not exist" Oct 09 14:53:51 crc kubenswrapper[4902]: I1009 14:53:51.524926 4902 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5786367d-56eb-44ef-8353-eb12ed7693b4" path="/var/lib/kubelet/pods/5786367d-56eb-44ef-8353-eb12ed7693b4/volumes" Oct 09 14:53:53 crc kubenswrapper[4902]: I1009 14:53:53.521746 4902 scope.go:117] "RemoveContainer" containerID="3c92344b7d9241fd1bd747690a712747aa6c548ba1f4b8118607a636810a1726" Oct 09 14:53:53 crc kubenswrapper[4902]: E1009 14:53:53.522435 4902 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gbt7s_openshift-machine-config-operator(6cfbac91-e798-4e5e-9f3c-f454ea6f457e)\"" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" podUID="6cfbac91-e798-4e5e-9f3c-f454ea6f457e" Oct 09 14:53:57 crc kubenswrapper[4902]: I1009 14:53:57.154618 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-glgsr/must-gather-pr2cj"] Oct 09 14:53:57 crc kubenswrapper[4902]: E1009 14:53:57.155762 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5786367d-56eb-44ef-8353-eb12ed7693b4" containerName="extract-utilities" Oct 09 14:53:57 crc kubenswrapper[4902]: I1009 14:53:57.155780 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="5786367d-56eb-44ef-8353-eb12ed7693b4" containerName="extract-utilities" Oct 09 14:53:57 crc kubenswrapper[4902]: E1009 14:53:57.155800 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5786367d-56eb-44ef-8353-eb12ed7693b4" containerName="registry-server" Oct 09 14:53:57 crc kubenswrapper[4902]: I1009 14:53:57.155808 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="5786367d-56eb-44ef-8353-eb12ed7693b4" containerName="registry-server" Oct 09 14:53:57 crc kubenswrapper[4902]: E1009 14:53:57.155859 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5786367d-56eb-44ef-8353-eb12ed7693b4" containerName="extract-content" Oct 09 14:53:57 crc kubenswrapper[4902]: I1009 14:53:57.155869 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="5786367d-56eb-44ef-8353-eb12ed7693b4" containerName="extract-content" Oct 09 14:53:57 crc kubenswrapper[4902]: I1009 14:53:57.156123 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="5786367d-56eb-44ef-8353-eb12ed7693b4" containerName="registry-server" Oct 09 14:53:57 crc kubenswrapper[4902]: I1009 14:53:57.157763 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-glgsr/must-gather-pr2cj" Oct 09 14:53:57 crc kubenswrapper[4902]: I1009 14:53:57.159670 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-glgsr"/"kube-root-ca.crt" Oct 09 14:53:57 crc kubenswrapper[4902]: I1009 14:53:57.160963 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-glgsr"/"openshift-service-ca.crt" Oct 09 14:53:57 crc kubenswrapper[4902]: I1009 14:53:57.184191 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-glgsr/must-gather-pr2cj"] Oct 09 14:53:57 crc kubenswrapper[4902]: I1009 14:53:57.247547 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/09f92dfe-37ba-4c41-afbd-15a01e30b414-must-gather-output\") pod \"must-gather-pr2cj\" (UID: \"09f92dfe-37ba-4c41-afbd-15a01e30b414\") " pod="openshift-must-gather-glgsr/must-gather-pr2cj" Oct 09 14:53:57 crc kubenswrapper[4902]: I1009 14:53:57.247741 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bkc7m\" (UniqueName: \"kubernetes.io/projected/09f92dfe-37ba-4c41-afbd-15a01e30b414-kube-api-access-bkc7m\") pod \"must-gather-pr2cj\" (UID: \"09f92dfe-37ba-4c41-afbd-15a01e30b414\") " pod="openshift-must-gather-glgsr/must-gather-pr2cj" Oct 09 14:53:57 crc kubenswrapper[4902]: I1009 14:53:57.349938 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/09f92dfe-37ba-4c41-afbd-15a01e30b414-must-gather-output\") pod \"must-gather-pr2cj\" (UID: \"09f92dfe-37ba-4c41-afbd-15a01e30b414\") " pod="openshift-must-gather-glgsr/must-gather-pr2cj" Oct 09 14:53:57 crc kubenswrapper[4902]: I1009 14:53:57.350169 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bkc7m\" (UniqueName: \"kubernetes.io/projected/09f92dfe-37ba-4c41-afbd-15a01e30b414-kube-api-access-bkc7m\") pod \"must-gather-pr2cj\" (UID: \"09f92dfe-37ba-4c41-afbd-15a01e30b414\") " pod="openshift-must-gather-glgsr/must-gather-pr2cj" Oct 09 14:53:57 crc kubenswrapper[4902]: I1009 14:53:57.350553 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/09f92dfe-37ba-4c41-afbd-15a01e30b414-must-gather-output\") pod \"must-gather-pr2cj\" (UID: \"09f92dfe-37ba-4c41-afbd-15a01e30b414\") " pod="openshift-must-gather-glgsr/must-gather-pr2cj" Oct 09 14:53:57 crc kubenswrapper[4902]: I1009 14:53:57.382146 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bkc7m\" (UniqueName: \"kubernetes.io/projected/09f92dfe-37ba-4c41-afbd-15a01e30b414-kube-api-access-bkc7m\") pod \"must-gather-pr2cj\" (UID: \"09f92dfe-37ba-4c41-afbd-15a01e30b414\") " pod="openshift-must-gather-glgsr/must-gather-pr2cj" Oct 09 14:53:57 crc kubenswrapper[4902]: I1009 14:53:57.481659 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-glgsr/must-gather-pr2cj" Oct 09 14:53:57 crc kubenswrapper[4902]: I1009 14:53:57.976023 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-glgsr/must-gather-pr2cj"] Oct 09 14:53:58 crc kubenswrapper[4902]: I1009 14:53:58.817928 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-glgsr/must-gather-pr2cj" event={"ID":"09f92dfe-37ba-4c41-afbd-15a01e30b414","Type":"ContainerStarted","Data":"c43e0c00d2325a349736d5c03c107bb92f07018b051e9cce55f74827d53f9c07"} Oct 09 14:53:58 crc kubenswrapper[4902]: I1009 14:53:58.818498 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-glgsr/must-gather-pr2cj" event={"ID":"09f92dfe-37ba-4c41-afbd-15a01e30b414","Type":"ContainerStarted","Data":"64ebfd322d525ff06037c709a5ed29808ac707a505184b4f35dd454688207f18"} Oct 09 14:53:58 crc kubenswrapper[4902]: I1009 14:53:58.818513 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-glgsr/must-gather-pr2cj" event={"ID":"09f92dfe-37ba-4c41-afbd-15a01e30b414","Type":"ContainerStarted","Data":"a82b0ac57c5a848e4a71a926fe138f2cbacea2c0b66afd2b48fc08f9107a28db"} Oct 09 14:53:58 crc kubenswrapper[4902]: I1009 14:53:58.838631 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-glgsr/must-gather-pr2cj" podStartSLOduration=1.838607909 podStartE2EDuration="1.838607909s" podCreationTimestamp="2025-10-09 14:53:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 14:53:58.832314737 +0000 UTC m=+3786.030173821" watchObservedRunningTime="2025-10-09 14:53:58.838607909 +0000 UTC m=+3786.036466993" Oct 09 14:54:01 crc kubenswrapper[4902]: I1009 14:54:01.623887 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-glgsr/crc-debug-z8nt7"] Oct 09 14:54:01 crc kubenswrapper[4902]: I1009 14:54:01.625632 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-glgsr/crc-debug-z8nt7" Oct 09 14:54:01 crc kubenswrapper[4902]: I1009 14:54:01.627538 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-glgsr"/"default-dockercfg-lwzh6" Oct 09 14:54:01 crc kubenswrapper[4902]: I1009 14:54:01.648626 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7mn7l\" (UniqueName: \"kubernetes.io/projected/21cf7d01-13ae-4700-ae88-54f4b915b034-kube-api-access-7mn7l\") pod \"crc-debug-z8nt7\" (UID: \"21cf7d01-13ae-4700-ae88-54f4b915b034\") " pod="openshift-must-gather-glgsr/crc-debug-z8nt7" Oct 09 14:54:01 crc kubenswrapper[4902]: I1009 14:54:01.648744 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/21cf7d01-13ae-4700-ae88-54f4b915b034-host\") pod \"crc-debug-z8nt7\" (UID: \"21cf7d01-13ae-4700-ae88-54f4b915b034\") " pod="openshift-must-gather-glgsr/crc-debug-z8nt7" Oct 09 14:54:01 crc kubenswrapper[4902]: I1009 14:54:01.750049 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7mn7l\" (UniqueName: \"kubernetes.io/projected/21cf7d01-13ae-4700-ae88-54f4b915b034-kube-api-access-7mn7l\") pod \"crc-debug-z8nt7\" (UID: \"21cf7d01-13ae-4700-ae88-54f4b915b034\") " pod="openshift-must-gather-glgsr/crc-debug-z8nt7" Oct 09 14:54:01 crc kubenswrapper[4902]: I1009 14:54:01.750212 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/21cf7d01-13ae-4700-ae88-54f4b915b034-host\") pod \"crc-debug-z8nt7\" (UID: \"21cf7d01-13ae-4700-ae88-54f4b915b034\") " pod="openshift-must-gather-glgsr/crc-debug-z8nt7" Oct 09 14:54:01 crc kubenswrapper[4902]: I1009 14:54:01.750440 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/21cf7d01-13ae-4700-ae88-54f4b915b034-host\") pod \"crc-debug-z8nt7\" (UID: \"21cf7d01-13ae-4700-ae88-54f4b915b034\") " pod="openshift-must-gather-glgsr/crc-debug-z8nt7" Oct 09 14:54:01 crc kubenswrapper[4902]: I1009 14:54:01.772687 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7mn7l\" (UniqueName: \"kubernetes.io/projected/21cf7d01-13ae-4700-ae88-54f4b915b034-kube-api-access-7mn7l\") pod \"crc-debug-z8nt7\" (UID: \"21cf7d01-13ae-4700-ae88-54f4b915b034\") " pod="openshift-must-gather-glgsr/crc-debug-z8nt7" Oct 09 14:54:01 crc kubenswrapper[4902]: I1009 14:54:01.955079 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-glgsr/crc-debug-z8nt7" Oct 09 14:54:02 crc kubenswrapper[4902]: I1009 14:54:02.858372 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-glgsr/crc-debug-z8nt7" event={"ID":"21cf7d01-13ae-4700-ae88-54f4b915b034","Type":"ContainerStarted","Data":"76a653fbdf5769aa7555e9c2c5f57c5e4bc8507fc9c4e73ceae012a1d7003388"} Oct 09 14:54:02 crc kubenswrapper[4902]: I1009 14:54:02.858942 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-glgsr/crc-debug-z8nt7" event={"ID":"21cf7d01-13ae-4700-ae88-54f4b915b034","Type":"ContainerStarted","Data":"82dcb9e33eb9fc6c95e94bdd94452b43ce239f835d2ae6dda4ac60f14471c7cb"} Oct 09 14:54:02 crc kubenswrapper[4902]: I1009 14:54:02.878757 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-glgsr/crc-debug-z8nt7" podStartSLOduration=1.87873885 podStartE2EDuration="1.87873885s" podCreationTimestamp="2025-10-09 14:54:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 14:54:02.874664202 +0000 UTC m=+3790.072523286" watchObservedRunningTime="2025-10-09 14:54:02.87873885 +0000 UTC m=+3790.076597914" Oct 09 14:54:07 crc kubenswrapper[4902]: I1009 14:54:07.513522 4902 scope.go:117] "RemoveContainer" containerID="3c92344b7d9241fd1bd747690a712747aa6c548ba1f4b8118607a636810a1726" Oct 09 14:54:07 crc kubenswrapper[4902]: E1009 14:54:07.514550 4902 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gbt7s_openshift-machine-config-operator(6cfbac91-e798-4e5e-9f3c-f454ea6f457e)\"" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" podUID="6cfbac91-e798-4e5e-9f3c-f454ea6f457e" Oct 09 14:54:22 crc kubenswrapper[4902]: I1009 14:54:22.512689 4902 scope.go:117] "RemoveContainer" containerID="3c92344b7d9241fd1bd747690a712747aa6c548ba1f4b8118607a636810a1726" Oct 09 14:54:22 crc kubenswrapper[4902]: E1009 14:54:22.513472 4902 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gbt7s_openshift-machine-config-operator(6cfbac91-e798-4e5e-9f3c-f454ea6f457e)\"" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" podUID="6cfbac91-e798-4e5e-9f3c-f454ea6f457e" Oct 09 14:54:33 crc kubenswrapper[4902]: I1009 14:54:33.539098 4902 scope.go:117] "RemoveContainer" containerID="3c92344b7d9241fd1bd747690a712747aa6c548ba1f4b8118607a636810a1726" Oct 09 14:54:33 crc kubenswrapper[4902]: E1009 14:54:33.540316 4902 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gbt7s_openshift-machine-config-operator(6cfbac91-e798-4e5e-9f3c-f454ea6f457e)\"" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" podUID="6cfbac91-e798-4e5e-9f3c-f454ea6f457e" Oct 09 14:54:37 crc kubenswrapper[4902]: I1009 14:54:37.158299 4902 generic.go:334] "Generic (PLEG): container finished" podID="21cf7d01-13ae-4700-ae88-54f4b915b034" containerID="76a653fbdf5769aa7555e9c2c5f57c5e4bc8507fc9c4e73ceae012a1d7003388" 
exitCode=0 Oct 09 14:54:37 crc kubenswrapper[4902]: I1009 14:54:37.158493 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-glgsr/crc-debug-z8nt7" event={"ID":"21cf7d01-13ae-4700-ae88-54f4b915b034","Type":"ContainerDied","Data":"76a653fbdf5769aa7555e9c2c5f57c5e4bc8507fc9c4e73ceae012a1d7003388"} Oct 09 14:54:38 crc kubenswrapper[4902]: I1009 14:54:38.285551 4902 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-glgsr/crc-debug-z8nt7" Oct 09 14:54:38 crc kubenswrapper[4902]: I1009 14:54:38.319168 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-glgsr/crc-debug-z8nt7"] Oct 09 14:54:38 crc kubenswrapper[4902]: I1009 14:54:38.332367 4902 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-glgsr/crc-debug-z8nt7"] Oct 09 14:54:38 crc kubenswrapper[4902]: I1009 14:54:38.421564 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/21cf7d01-13ae-4700-ae88-54f4b915b034-host\") pod \"21cf7d01-13ae-4700-ae88-54f4b915b034\" (UID: \"21cf7d01-13ae-4700-ae88-54f4b915b034\") " Oct 09 14:54:38 crc kubenswrapper[4902]: I1009 14:54:38.421724 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/21cf7d01-13ae-4700-ae88-54f4b915b034-host" (OuterVolumeSpecName: "host") pod "21cf7d01-13ae-4700-ae88-54f4b915b034" (UID: "21cf7d01-13ae-4700-ae88-54f4b915b034"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 09 14:54:38 crc kubenswrapper[4902]: I1009 14:54:38.422174 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7mn7l\" (UniqueName: \"kubernetes.io/projected/21cf7d01-13ae-4700-ae88-54f4b915b034-kube-api-access-7mn7l\") pod \"21cf7d01-13ae-4700-ae88-54f4b915b034\" (UID: \"21cf7d01-13ae-4700-ae88-54f4b915b034\") " Oct 09 14:54:38 crc kubenswrapper[4902]: I1009 14:54:38.422888 4902 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/21cf7d01-13ae-4700-ae88-54f4b915b034-host\") on node \"crc\" DevicePath \"\"" Oct 09 14:54:38 crc kubenswrapper[4902]: I1009 14:54:38.427589 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21cf7d01-13ae-4700-ae88-54f4b915b034-kube-api-access-7mn7l" (OuterVolumeSpecName: "kube-api-access-7mn7l") pod "21cf7d01-13ae-4700-ae88-54f4b915b034" (UID: "21cf7d01-13ae-4700-ae88-54f4b915b034"). InnerVolumeSpecName "kube-api-access-7mn7l". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 14:54:38 crc kubenswrapper[4902]: I1009 14:54:38.525369 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7mn7l\" (UniqueName: \"kubernetes.io/projected/21cf7d01-13ae-4700-ae88-54f4b915b034-kube-api-access-7mn7l\") on node \"crc\" DevicePath \"\"" Oct 09 14:54:39 crc kubenswrapper[4902]: I1009 14:54:39.186899 4902 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="82dcb9e33eb9fc6c95e94bdd94452b43ce239f835d2ae6dda4ac60f14471c7cb" Oct 09 14:54:39 crc kubenswrapper[4902]: I1009 14:54:39.186996 4902 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-glgsr/crc-debug-z8nt7" Oct 09 14:54:39 crc kubenswrapper[4902]: I1009 14:54:39.511638 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-glgsr/crc-debug-2548v"] Oct 09 14:54:39 crc kubenswrapper[4902]: E1009 14:54:39.512276 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21cf7d01-13ae-4700-ae88-54f4b915b034" containerName="container-00" Oct 09 14:54:39 crc kubenswrapper[4902]: I1009 14:54:39.512291 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="21cf7d01-13ae-4700-ae88-54f4b915b034" containerName="container-00" Oct 09 14:54:39 crc kubenswrapper[4902]: I1009 14:54:39.512589 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="21cf7d01-13ae-4700-ae88-54f4b915b034" containerName="container-00" Oct 09 14:54:39 crc kubenswrapper[4902]: I1009 14:54:39.516716 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-glgsr/crc-debug-2548v" Oct 09 14:54:39 crc kubenswrapper[4902]: I1009 14:54:39.518757 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-glgsr"/"default-dockercfg-lwzh6" Oct 09 14:54:39 crc kubenswrapper[4902]: I1009 14:54:39.536376 4902 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="21cf7d01-13ae-4700-ae88-54f4b915b034" path="/var/lib/kubelet/pods/21cf7d01-13ae-4700-ae88-54f4b915b034/volumes" Oct 09 14:54:39 crc kubenswrapper[4902]: I1009 14:54:39.646348 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/70a8f5f9-b7ca-48bb-aa03-4e931231a228-host\") pod \"crc-debug-2548v\" (UID: \"70a8f5f9-b7ca-48bb-aa03-4e931231a228\") " pod="openshift-must-gather-glgsr/crc-debug-2548v" Oct 09 14:54:39 crc kubenswrapper[4902]: I1009 14:54:39.646472 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dnjzl\" (UniqueName: \"kubernetes.io/projected/70a8f5f9-b7ca-48bb-aa03-4e931231a228-kube-api-access-dnjzl\") pod \"crc-debug-2548v\" (UID: \"70a8f5f9-b7ca-48bb-aa03-4e931231a228\") " pod="openshift-must-gather-glgsr/crc-debug-2548v" Oct 09 14:54:39 crc kubenswrapper[4902]: I1009 14:54:39.748724 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dnjzl\" (UniqueName: \"kubernetes.io/projected/70a8f5f9-b7ca-48bb-aa03-4e931231a228-kube-api-access-dnjzl\") pod \"crc-debug-2548v\" (UID: \"70a8f5f9-b7ca-48bb-aa03-4e931231a228\") " pod="openshift-must-gather-glgsr/crc-debug-2548v" Oct 09 14:54:39 crc kubenswrapper[4902]: I1009 14:54:39.749121 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/70a8f5f9-b7ca-48bb-aa03-4e931231a228-host\") pod \"crc-debug-2548v\" (UID: \"70a8f5f9-b7ca-48bb-aa03-4e931231a228\") " pod="openshift-must-gather-glgsr/crc-debug-2548v" Oct 09 14:54:39 crc kubenswrapper[4902]: I1009 14:54:39.749268 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/70a8f5f9-b7ca-48bb-aa03-4e931231a228-host\") pod \"crc-debug-2548v\" (UID: \"70a8f5f9-b7ca-48bb-aa03-4e931231a228\") " pod="openshift-must-gather-glgsr/crc-debug-2548v" Oct 09 14:54:39 crc kubenswrapper[4902]: I1009 14:54:39.777702 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dnjzl\" (UniqueName: 
\"kubernetes.io/projected/70a8f5f9-b7ca-48bb-aa03-4e931231a228-kube-api-access-dnjzl\") pod \"crc-debug-2548v\" (UID: \"70a8f5f9-b7ca-48bb-aa03-4e931231a228\") " pod="openshift-must-gather-glgsr/crc-debug-2548v" Oct 09 14:54:39 crc kubenswrapper[4902]: I1009 14:54:39.834680 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-glgsr/crc-debug-2548v" Oct 09 14:54:40 crc kubenswrapper[4902]: I1009 14:54:40.200616 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-glgsr/crc-debug-2548v" event={"ID":"70a8f5f9-b7ca-48bb-aa03-4e931231a228","Type":"ContainerStarted","Data":"75ea4df73c553410f481959b921d3f1f56f5e5e62a5db0a8579bbc9d604f5486"} Oct 09 14:54:40 crc kubenswrapper[4902]: I1009 14:54:40.200976 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-glgsr/crc-debug-2548v" event={"ID":"70a8f5f9-b7ca-48bb-aa03-4e931231a228","Type":"ContainerStarted","Data":"7299d62e61869f8807dd7f93d1ebcf4f99d600b11a0aa6b1993bcb11e6e84465"} Oct 09 14:54:40 crc kubenswrapper[4902]: I1009 14:54:40.219816 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-glgsr/crc-debug-2548v" podStartSLOduration=1.219792959 podStartE2EDuration="1.219792959s" podCreationTimestamp="2025-10-09 14:54:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 14:54:40.212793277 +0000 UTC m=+3827.410652341" watchObservedRunningTime="2025-10-09 14:54:40.219792959 +0000 UTC m=+3827.417652023" Oct 09 14:54:41 crc kubenswrapper[4902]: I1009 14:54:41.210798 4902 generic.go:334] "Generic (PLEG): container finished" podID="70a8f5f9-b7ca-48bb-aa03-4e931231a228" containerID="75ea4df73c553410f481959b921d3f1f56f5e5e62a5db0a8579bbc9d604f5486" exitCode=0 Oct 09 14:54:41 crc kubenswrapper[4902]: I1009 14:54:41.210853 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-glgsr/crc-debug-2548v" event={"ID":"70a8f5f9-b7ca-48bb-aa03-4e931231a228","Type":"ContainerDied","Data":"75ea4df73c553410f481959b921d3f1f56f5e5e62a5db0a8579bbc9d604f5486"} Oct 09 14:54:42 crc kubenswrapper[4902]: I1009 14:54:42.328066 4902 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-glgsr/crc-debug-2548v" Oct 09 14:54:42 crc kubenswrapper[4902]: I1009 14:54:42.359657 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-glgsr/crc-debug-2548v"] Oct 09 14:54:42 crc kubenswrapper[4902]: I1009 14:54:42.368199 4902 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-glgsr/crc-debug-2548v"] Oct 09 14:54:42 crc kubenswrapper[4902]: I1009 14:54:42.395518 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dnjzl\" (UniqueName: \"kubernetes.io/projected/70a8f5f9-b7ca-48bb-aa03-4e931231a228-kube-api-access-dnjzl\") pod \"70a8f5f9-b7ca-48bb-aa03-4e931231a228\" (UID: \"70a8f5f9-b7ca-48bb-aa03-4e931231a228\") " Oct 09 14:54:42 crc kubenswrapper[4902]: I1009 14:54:42.395635 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/70a8f5f9-b7ca-48bb-aa03-4e931231a228-host\") pod \"70a8f5f9-b7ca-48bb-aa03-4e931231a228\" (UID: \"70a8f5f9-b7ca-48bb-aa03-4e931231a228\") " Oct 09 14:54:42 crc kubenswrapper[4902]: I1009 14:54:42.395710 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/70a8f5f9-b7ca-48bb-aa03-4e931231a228-host" (OuterVolumeSpecName: "host") pod "70a8f5f9-b7ca-48bb-aa03-4e931231a228" (UID: "70a8f5f9-b7ca-48bb-aa03-4e931231a228"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 09 14:54:42 crc kubenswrapper[4902]: I1009 14:54:42.396196 4902 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/70a8f5f9-b7ca-48bb-aa03-4e931231a228-host\") on node \"crc\" DevicePath \"\"" Oct 09 14:54:42 crc kubenswrapper[4902]: I1009 14:54:42.401792 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70a8f5f9-b7ca-48bb-aa03-4e931231a228-kube-api-access-dnjzl" (OuterVolumeSpecName: "kube-api-access-dnjzl") pod "70a8f5f9-b7ca-48bb-aa03-4e931231a228" (UID: "70a8f5f9-b7ca-48bb-aa03-4e931231a228"). InnerVolumeSpecName "kube-api-access-dnjzl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 14:54:42 crc kubenswrapper[4902]: I1009 14:54:42.498220 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dnjzl\" (UniqueName: \"kubernetes.io/projected/70a8f5f9-b7ca-48bb-aa03-4e931231a228-kube-api-access-dnjzl\") on node \"crc\" DevicePath \"\"" Oct 09 14:54:43 crc kubenswrapper[4902]: I1009 14:54:43.231661 4902 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7299d62e61869f8807dd7f93d1ebcf4f99d600b11a0aa6b1993bcb11e6e84465" Oct 09 14:54:43 crc kubenswrapper[4902]: I1009 14:54:43.231715 4902 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-glgsr/crc-debug-2548v" Oct 09 14:54:43 crc kubenswrapper[4902]: I1009 14:54:43.528710 4902 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="70a8f5f9-b7ca-48bb-aa03-4e931231a228" path="/var/lib/kubelet/pods/70a8f5f9-b7ca-48bb-aa03-4e931231a228/volumes" Oct 09 14:54:43 crc kubenswrapper[4902]: I1009 14:54:43.529444 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-glgsr/crc-debug-vjrt5"] Oct 09 14:54:43 crc kubenswrapper[4902]: E1009 14:54:43.529735 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70a8f5f9-b7ca-48bb-aa03-4e931231a228" containerName="container-00" Oct 09 14:54:43 crc kubenswrapper[4902]: I1009 14:54:43.529753 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="70a8f5f9-b7ca-48bb-aa03-4e931231a228" containerName="container-00" Oct 09 14:54:43 crc kubenswrapper[4902]: I1009 14:54:43.529943 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="70a8f5f9-b7ca-48bb-aa03-4e931231a228" containerName="container-00" Oct 09 14:54:43 crc kubenswrapper[4902]: I1009 14:54:43.530800 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-glgsr/crc-debug-vjrt5" Oct 09 14:54:43 crc kubenswrapper[4902]: I1009 14:54:43.533455 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-glgsr"/"default-dockercfg-lwzh6" Oct 09 14:54:43 crc kubenswrapper[4902]: I1009 14:54:43.620253 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e18caf89-138a-4b99-986f-32e57e94bb90-host\") pod \"crc-debug-vjrt5\" (UID: \"e18caf89-138a-4b99-986f-32e57e94bb90\") " pod="openshift-must-gather-glgsr/crc-debug-vjrt5" Oct 09 14:54:43 crc kubenswrapper[4902]: I1009 14:54:43.620361 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xb5b7\" (UniqueName: \"kubernetes.io/projected/e18caf89-138a-4b99-986f-32e57e94bb90-kube-api-access-xb5b7\") pod \"crc-debug-vjrt5\" (UID: \"e18caf89-138a-4b99-986f-32e57e94bb90\") " pod="openshift-must-gather-glgsr/crc-debug-vjrt5" Oct 09 14:54:43 crc kubenswrapper[4902]: I1009 14:54:43.726209 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e18caf89-138a-4b99-986f-32e57e94bb90-host\") pod \"crc-debug-vjrt5\" (UID: \"e18caf89-138a-4b99-986f-32e57e94bb90\") " pod="openshift-must-gather-glgsr/crc-debug-vjrt5" Oct 09 14:54:43 crc kubenswrapper[4902]: I1009 14:54:43.726465 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xb5b7\" (UniqueName: \"kubernetes.io/projected/e18caf89-138a-4b99-986f-32e57e94bb90-kube-api-access-xb5b7\") pod \"crc-debug-vjrt5\" (UID: \"e18caf89-138a-4b99-986f-32e57e94bb90\") " pod="openshift-must-gather-glgsr/crc-debug-vjrt5" Oct 09 14:54:43 crc kubenswrapper[4902]: I1009 14:54:43.727108 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e18caf89-138a-4b99-986f-32e57e94bb90-host\") pod \"crc-debug-vjrt5\" (UID: \"e18caf89-138a-4b99-986f-32e57e94bb90\") " pod="openshift-must-gather-glgsr/crc-debug-vjrt5" Oct 09 14:54:43 crc kubenswrapper[4902]: I1009 14:54:43.745367 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xb5b7\" (UniqueName: 
\"kubernetes.io/projected/e18caf89-138a-4b99-986f-32e57e94bb90-kube-api-access-xb5b7\") pod \"crc-debug-vjrt5\" (UID: \"e18caf89-138a-4b99-986f-32e57e94bb90\") " pod="openshift-must-gather-glgsr/crc-debug-vjrt5" Oct 09 14:54:43 crc kubenswrapper[4902]: I1009 14:54:43.858148 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-glgsr/crc-debug-vjrt5" Oct 09 14:54:43 crc kubenswrapper[4902]: W1009 14:54:43.894760 4902 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode18caf89_138a_4b99_986f_32e57e94bb90.slice/crio-97a31932fb984592bdd9932e06faf6af77e323cb6735f80121878143a94615ff WatchSource:0}: Error finding container 97a31932fb984592bdd9932e06faf6af77e323cb6735f80121878143a94615ff: Status 404 returned error can't find the container with id 97a31932fb984592bdd9932e06faf6af77e323cb6735f80121878143a94615ff Oct 09 14:54:44 crc kubenswrapper[4902]: I1009 14:54:44.244119 4902 generic.go:334] "Generic (PLEG): container finished" podID="e18caf89-138a-4b99-986f-32e57e94bb90" containerID="50f9110e4567b4dae26f16367413c83fc5bbe2ef2c03b36209214a6bde5828fd" exitCode=0 Oct 09 14:54:44 crc kubenswrapper[4902]: I1009 14:54:44.244188 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-glgsr/crc-debug-vjrt5" event={"ID":"e18caf89-138a-4b99-986f-32e57e94bb90","Type":"ContainerDied","Data":"50f9110e4567b4dae26f16367413c83fc5bbe2ef2c03b36209214a6bde5828fd"} Oct 09 14:54:44 crc kubenswrapper[4902]: I1009 14:54:44.244276 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-glgsr/crc-debug-vjrt5" event={"ID":"e18caf89-138a-4b99-986f-32e57e94bb90","Type":"ContainerStarted","Data":"97a31932fb984592bdd9932e06faf6af77e323cb6735f80121878143a94615ff"} Oct 09 14:54:44 crc kubenswrapper[4902]: I1009 14:54:44.283774 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-glgsr/crc-debug-vjrt5"] Oct 09 14:54:44 crc kubenswrapper[4902]: I1009 14:54:44.292582 4902 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-glgsr/crc-debug-vjrt5"] Oct 09 14:54:44 crc kubenswrapper[4902]: I1009 14:54:44.512975 4902 scope.go:117] "RemoveContainer" containerID="3c92344b7d9241fd1bd747690a712747aa6c548ba1f4b8118607a636810a1726" Oct 09 14:54:44 crc kubenswrapper[4902]: E1009 14:54:44.513330 4902 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gbt7s_openshift-machine-config-operator(6cfbac91-e798-4e5e-9f3c-f454ea6f457e)\"" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" podUID="6cfbac91-e798-4e5e-9f3c-f454ea6f457e" Oct 09 14:54:45 crc kubenswrapper[4902]: I1009 14:54:45.368683 4902 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-glgsr/crc-debug-vjrt5" Oct 09 14:54:45 crc kubenswrapper[4902]: I1009 14:54:45.462079 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e18caf89-138a-4b99-986f-32e57e94bb90-host\") pod \"e18caf89-138a-4b99-986f-32e57e94bb90\" (UID: \"e18caf89-138a-4b99-986f-32e57e94bb90\") " Oct 09 14:54:45 crc kubenswrapper[4902]: I1009 14:54:45.462583 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xb5b7\" (UniqueName: \"kubernetes.io/projected/e18caf89-138a-4b99-986f-32e57e94bb90-kube-api-access-xb5b7\") pod \"e18caf89-138a-4b99-986f-32e57e94bb90\" (UID: \"e18caf89-138a-4b99-986f-32e57e94bb90\") " Oct 09 14:54:45 crc kubenswrapper[4902]: I1009 14:54:45.462999 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e18caf89-138a-4b99-986f-32e57e94bb90-host" (OuterVolumeSpecName: "host") pod "e18caf89-138a-4b99-986f-32e57e94bb90" (UID: "e18caf89-138a-4b99-986f-32e57e94bb90"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 09 14:54:45 crc kubenswrapper[4902]: I1009 14:54:45.463446 4902 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e18caf89-138a-4b99-986f-32e57e94bb90-host\") on node \"crc\" DevicePath \"\"" Oct 09 14:54:45 crc kubenswrapper[4902]: I1009 14:54:45.474438 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e18caf89-138a-4b99-986f-32e57e94bb90-kube-api-access-xb5b7" (OuterVolumeSpecName: "kube-api-access-xb5b7") pod "e18caf89-138a-4b99-986f-32e57e94bb90" (UID: "e18caf89-138a-4b99-986f-32e57e94bb90"). InnerVolumeSpecName "kube-api-access-xb5b7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 14:54:45 crc kubenswrapper[4902]: I1009 14:54:45.532285 4902 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e18caf89-138a-4b99-986f-32e57e94bb90" path="/var/lib/kubelet/pods/e18caf89-138a-4b99-986f-32e57e94bb90/volumes" Oct 09 14:54:45 crc kubenswrapper[4902]: I1009 14:54:45.565565 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xb5b7\" (UniqueName: \"kubernetes.io/projected/e18caf89-138a-4b99-986f-32e57e94bb90-kube-api-access-xb5b7\") on node \"crc\" DevicePath \"\"" Oct 09 14:54:46 crc kubenswrapper[4902]: I1009 14:54:46.263326 4902 scope.go:117] "RemoveContainer" containerID="50f9110e4567b4dae26f16367413c83fc5bbe2ef2c03b36209214a6bde5828fd" Oct 09 14:54:46 crc kubenswrapper[4902]: I1009 14:54:46.263533 4902 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-glgsr/crc-debug-vjrt5" Oct 09 14:54:56 crc kubenswrapper[4902]: I1009 14:54:56.513215 4902 scope.go:117] "RemoveContainer" containerID="3c92344b7d9241fd1bd747690a712747aa6c548ba1f4b8118607a636810a1726" Oct 09 14:54:56 crc kubenswrapper[4902]: E1009 14:54:56.514002 4902 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gbt7s_openshift-machine-config-operator(6cfbac91-e798-4e5e-9f3c-f454ea6f457e)\"" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" podUID="6cfbac91-e798-4e5e-9f3c-f454ea6f457e" Oct 09 14:54:57 crc kubenswrapper[4902]: I1009 14:54:57.910102 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-54f57498cd-cv95r_8e22729c-3eef-405e-bf5a-5654f9795d57/barbican-api/0.log" Oct 09 14:54:58 crc kubenswrapper[4902]: I1009 14:54:58.033284 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-54f57498cd-cv95r_8e22729c-3eef-405e-bf5a-5654f9795d57/barbican-api-log/0.log" Oct 09 14:54:58 crc kubenswrapper[4902]: I1009 14:54:58.127759 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-7dbd9b6574-b5dht_11b3f7c7-66a5-485c-922e-b5568e2f9f1c/barbican-keystone-listener/0.log" Oct 09 14:54:58 crc kubenswrapper[4902]: I1009 14:54:58.132565 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-7dbd9b6574-b5dht_11b3f7c7-66a5-485c-922e-b5568e2f9f1c/barbican-keystone-listener-log/0.log" Oct 09 14:54:58 crc kubenswrapper[4902]: I1009 14:54:58.326248 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-896cb696f-kkg85_fe538ee2-2e8c-406f-8e70-bc56325ec408/barbican-worker/0.log" Oct 09 14:54:58 crc kubenswrapper[4902]: I1009 14:54:58.344565 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-896cb696f-kkg85_fe538ee2-2e8c-406f-8e70-bc56325ec408/barbican-worker-log/0.log" Oct 09 14:54:58 crc kubenswrapper[4902]: I1009 14:54:58.473967 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-7x26n_705cf92b-1b0d-4706-bf30-03fb1a9728cd/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Oct 09 14:54:58 crc kubenswrapper[4902]: I1009 14:54:58.562449 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_a4dab6bb-60d0-4984-91e7-2013f341a39d/ceilometer-notification-agent/0.log" Oct 09 14:54:58 crc kubenswrapper[4902]: I1009 14:54:58.596701 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_a4dab6bb-60d0-4984-91e7-2013f341a39d/ceilometer-central-agent/0.log" Oct 09 14:54:58 crc kubenswrapper[4902]: I1009 14:54:58.669954 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_a4dab6bb-60d0-4984-91e7-2013f341a39d/proxy-httpd/0.log" Oct 09 14:54:58 crc kubenswrapper[4902]: I1009 14:54:58.740277 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_a4dab6bb-60d0-4984-91e7-2013f341a39d/sg-core/0.log" Oct 09 14:54:58 crc kubenswrapper[4902]: I1009 14:54:58.842131 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_d7c2affc-5952-43d0-8629-8e61961bdf1c/cinder-api/0.log" Oct 09 14:54:58 crc kubenswrapper[4902]: I1009 
14:54:58.889383 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_d7c2affc-5952-43d0-8629-8e61961bdf1c/cinder-api-log/0.log" Oct 09 14:54:59 crc kubenswrapper[4902]: I1009 14:54:59.054651 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_676825a2-3e5b-4137-b9fc-337425ff8d09/probe/0.log" Oct 09 14:54:59 crc kubenswrapper[4902]: I1009 14:54:59.088171 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_676825a2-3e5b-4137-b9fc-337425ff8d09/cinder-scheduler/0.log" Oct 09 14:54:59 crc kubenswrapper[4902]: I1009 14:54:59.269302 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-w7kbg_18dcb9ba-f068-421a-a11c-f25f2b7c940a/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Oct 09 14:54:59 crc kubenswrapper[4902]: I1009 14:54:59.285278 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-9bdh4_7b94bcc5-6ba7-44a6-aca8-ffaa0b52e89e/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 09 14:54:59 crc kubenswrapper[4902]: I1009 14:54:59.450850 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-f9f9h_77b43a36-9858-4efc-aa2b-a56278710389/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 09 14:54:59 crc kubenswrapper[4902]: I1009 14:54:59.472471 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-cb6ffcf87-8b4gd_bdf337ce-e7d5-4de9-acb8-a98a481a8ab3/init/0.log" Oct 09 14:54:59 crc kubenswrapper[4902]: I1009 14:54:59.653975 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-cb6ffcf87-8b4gd_bdf337ce-e7d5-4de9-acb8-a98a481a8ab3/init/0.log" Oct 09 14:54:59 crc kubenswrapper[4902]: I1009 14:54:59.663914 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-cb6ffcf87-8b4gd_bdf337ce-e7d5-4de9-acb8-a98a481a8ab3/dnsmasq-dns/0.log" Oct 09 14:54:59 crc kubenswrapper[4902]: I1009 14:54:59.759746 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-tbqw8_f53eb372-afb1-4f71-b4b3-eb4b36483e5e/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Oct 09 14:54:59 crc kubenswrapper[4902]: I1009 14:54:59.849256 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_9c0515fd-b685-4dab-909a-3f4147e19a59/glance-httpd/0.log" Oct 09 14:54:59 crc kubenswrapper[4902]: I1009 14:54:59.898210 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_9c0515fd-b685-4dab-909a-3f4147e19a59/glance-log/0.log" Oct 09 14:55:00 crc kubenswrapper[4902]: I1009 14:55:00.048256 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_df90d442-7261-4353-821c-c0e71a43998a/glance-httpd/0.log" Oct 09 14:55:00 crc kubenswrapper[4902]: I1009 14:55:00.085261 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_df90d442-7261-4353-821c-c0e71a43998a/glance-log/0.log" Oct 09 14:55:00 crc kubenswrapper[4902]: I1009 14:55:00.249042 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-779d95f9fb-tfjvq_40e0f94d-30a4-456b-bfd4-7da1453facc4/horizon/0.log" Oct 09 14:55:00 crc kubenswrapper[4902]: I1009 
14:55:00.356257 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-gp9mr_813f74d2-a7a6-4e97-983b-544c38995262/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Oct 09 14:55:00 crc kubenswrapper[4902]: I1009 14:55:00.609850 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-lsz7p_9dd56ad2-b36d-4850-81d2-b06db395ecd6/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 09 14:55:00 crc kubenswrapper[4902]: I1009 14:55:00.666418 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-779d95f9fb-tfjvq_40e0f94d-30a4-456b-bfd4-7da1453facc4/horizon-log/0.log" Oct 09 14:55:00 crc kubenswrapper[4902]: I1009 14:55:00.825362 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_dde7a697-a373-4e6a-8535-d7768f569e18/kube-state-metrics/0.log" Oct 09 14:55:00 crc kubenswrapper[4902]: I1009 14:55:00.894463 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-6d55545c5f-lff8v_b32b124f-e090-4e6c-b2b7-138e1059b680/keystone-api/0.log" Oct 09 14:55:01 crc kubenswrapper[4902]: I1009 14:55:01.048326 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-tpxhw_99af5091-b31f-45c0-abcf-882b0159219f/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Oct 09 14:55:01 crc kubenswrapper[4902]: I1009 14:55:01.423214 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-65878bc9b7-hv97v_b17f63fc-0163-416e-a3ee-179a1a071560/neutron-httpd/0.log" Oct 09 14:55:01 crc kubenswrapper[4902]: I1009 14:55:01.460951 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-65878bc9b7-hv97v_b17f63fc-0163-416e-a3ee-179a1a071560/neutron-api/0.log" Oct 09 14:55:01 crc kubenswrapper[4902]: I1009 14:55:01.632753 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-zb49n_1c9a61d2-e7f4-4e22-8b3b-18263b72df09/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Oct 09 14:55:02 crc kubenswrapper[4902]: I1009 14:55:02.217739 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_61f63f9a-89ab-45fa-b62f-e93f826423a9/nova-cell0-conductor-conductor/0.log" Oct 09 14:55:02 crc kubenswrapper[4902]: I1009 14:55:02.243072 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_7a597676-c413-4919-a79b-ac49dd2671c2/nova-api-log/0.log" Oct 09 14:55:02 crc kubenswrapper[4902]: I1009 14:55:02.482043 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_7a597676-c413-4919-a79b-ac49dd2671c2/nova-api-api/0.log" Oct 09 14:55:02 crc kubenswrapper[4902]: I1009 14:55:02.590541 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_3fe652a7-7ee4-4a55-9f17-a359b82df106/nova-cell1-conductor-conductor/0.log" Oct 09 14:55:02 crc kubenswrapper[4902]: I1009 14:55:02.619115 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_2bc33878-b734-4520-b5e0-e066f53dbe31/nova-cell1-novncproxy-novncproxy/0.log" Oct 09 14:55:02 crc kubenswrapper[4902]: I1009 14:55:02.805242 4902 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-f9m9b_a5fc156b-09f2-4647-a2df-73877fb9db6f/nova-edpm-deployment-openstack-edpm-ipam/0.log" Oct 09 14:55:02 crc kubenswrapper[4902]: I1009 14:55:02.940219 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_a272cc22-f1dc-48b7-89ef-e4578877aa78/nova-metadata-log/0.log" Oct 09 14:55:03 crc kubenswrapper[4902]: I1009 14:55:03.341511 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_a4dec46d-073a-484c-ba80-0ff939025e48/mysql-bootstrap/0.log" Oct 09 14:55:03 crc kubenswrapper[4902]: I1009 14:55:03.365248 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_b1206eaa-959a-4a3d-8c35-a60fc09bb3d5/nova-scheduler-scheduler/0.log" Oct 09 14:55:03 crc kubenswrapper[4902]: I1009 14:55:03.534845 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_a4dec46d-073a-484c-ba80-0ff939025e48/mysql-bootstrap/0.log" Oct 09 14:55:03 crc kubenswrapper[4902]: I1009 14:55:03.542313 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_a4dec46d-073a-484c-ba80-0ff939025e48/galera/0.log" Oct 09 14:55:03 crc kubenswrapper[4902]: I1009 14:55:03.752013 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_4bb5cfb3-cfb7-4408-bbe1-3bf59b44614e/mysql-bootstrap/0.log" Oct 09 14:55:04 crc kubenswrapper[4902]: I1009 14:55:04.027998 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_4bb5cfb3-cfb7-4408-bbe1-3bf59b44614e/mysql-bootstrap/0.log" Oct 09 14:55:04 crc kubenswrapper[4902]: I1009 14:55:04.052610 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_4bb5cfb3-cfb7-4408-bbe1-3bf59b44614e/galera/0.log" Oct 09 14:55:04 crc kubenswrapper[4902]: I1009 14:55:04.228244 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_594a9127-c741-4bb1-871f-0295abab43ce/openstackclient/0.log" Oct 09 14:55:04 crc kubenswrapper[4902]: I1009 14:55:04.319109 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-79djm_7f0722a8-eee2-4bb1-a3b4-d14964d35227/ovn-controller/0.log" Oct 09 14:55:04 crc kubenswrapper[4902]: I1009 14:55:04.497206 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_a272cc22-f1dc-48b7-89ef-e4578877aa78/nova-metadata-metadata/0.log" Oct 09 14:55:04 crc kubenswrapper[4902]: I1009 14:55:04.552377 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-pm8hp_aa516b5f-aa20-4f8a-bf1c-3bba6bb72ffb/openstack-network-exporter/0.log" Oct 09 14:55:04 crc kubenswrapper[4902]: I1009 14:55:04.836341 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-rlj4x_007b48e7-2e7a-45e6-bc70-1c86a275d808/ovsdb-server-init/0.log" Oct 09 14:55:05 crc kubenswrapper[4902]: I1009 14:55:05.028002 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-rlj4x_007b48e7-2e7a-45e6-bc70-1c86a275d808/ovsdb-server-init/0.log" Oct 09 14:55:05 crc kubenswrapper[4902]: I1009 14:55:05.055859 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-rlj4x_007b48e7-2e7a-45e6-bc70-1c86a275d808/ovs-vswitchd/0.log" Oct 09 14:55:05 crc kubenswrapper[4902]: I1009 14:55:05.093178 4902 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openstack_ovn-controller-ovs-rlj4x_007b48e7-2e7a-45e6-bc70-1c86a275d808/ovsdb-server/0.log" Oct 09 14:55:05 crc kubenswrapper[4902]: I1009 14:55:05.272312 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-ndvq8_9478bb2f-ce46-41f9-bfbd-e93ebcb437ed/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Oct 09 14:55:05 crc kubenswrapper[4902]: I1009 14:55:05.298828 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_22ef2931-973d-462a-ae3a-d05056c72468/openstack-network-exporter/0.log" Oct 09 14:55:05 crc kubenswrapper[4902]: I1009 14:55:05.299362 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_22ef2931-973d-462a-ae3a-d05056c72468/ovn-northd/0.log" Oct 09 14:55:05 crc kubenswrapper[4902]: I1009 14:55:05.473344 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_ea130bce-ed3c-495f-b06b-14278e3133ca/openstack-network-exporter/0.log" Oct 09 14:55:05 crc kubenswrapper[4902]: I1009 14:55:05.498710 4902 scope.go:117] "RemoveContainer" containerID="ef619fa93af693ecf506e8c47d6188b080647d6ce4f0e93b121b8c375c96c4dd" Oct 09 14:55:05 crc kubenswrapper[4902]: I1009 14:55:05.563544 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_ea130bce-ed3c-495f-b06b-14278e3133ca/ovsdbserver-nb/0.log" Oct 09 14:55:05 crc kubenswrapper[4902]: I1009 14:55:05.757687 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_1676b099-df2c-477b-a05b-b46d47dc3b05/openstack-network-exporter/0.log" Oct 09 14:55:05 crc kubenswrapper[4902]: I1009 14:55:05.769876 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_1676b099-df2c-477b-a05b-b46d47dc3b05/ovsdbserver-sb/0.log" Oct 09 14:55:05 crc kubenswrapper[4902]: I1009 14:55:05.959251 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-864577bbd8-z8v7t_9e8ddce2-e9c0-4dc6-8bcd-d228188630dc/placement-api/0.log" Oct 09 14:55:06 crc kubenswrapper[4902]: I1009 14:55:06.075622 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-864577bbd8-z8v7t_9e8ddce2-e9c0-4dc6-8bcd-d228188630dc/placement-log/0.log" Oct 09 14:55:06 crc kubenswrapper[4902]: I1009 14:55:06.116779 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_ef20e2e8-fcf0-438a-80a3-fd50db544b6e/setup-container/0.log" Oct 09 14:55:06 crc kubenswrapper[4902]: I1009 14:55:06.263215 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_ef20e2e8-fcf0-438a-80a3-fd50db544b6e/setup-container/0.log" Oct 09 14:55:06 crc kubenswrapper[4902]: I1009 14:55:06.304570 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_ef20e2e8-fcf0-438a-80a3-fd50db544b6e/rabbitmq/0.log" Oct 09 14:55:06 crc kubenswrapper[4902]: I1009 14:55:06.354134 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_f0f31142-f615-421c-a863-1603f1cb31a0/setup-container/0.log" Oct 09 14:55:06 crc kubenswrapper[4902]: I1009 14:55:06.557705 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-rdp4z_f3d3271d-29ab-4339-8614-a297a2b8791f/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 09 14:55:06 crc kubenswrapper[4902]: I1009 14:55:06.578773 4902 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_f0f31142-f615-421c-a863-1603f1cb31a0/setup-container/0.log" Oct 09 14:55:06 crc kubenswrapper[4902]: I1009 14:55:06.637671 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_f0f31142-f615-421c-a863-1603f1cb31a0/rabbitmq/0.log" Oct 09 14:55:06 crc kubenswrapper[4902]: I1009 14:55:06.848923 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-ghc9w_2cfa6eb9-6c46-4420-a165-d7a1c4d7713a/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Oct 09 14:55:06 crc kubenswrapper[4902]: I1009 14:55:06.878749 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-h4wt2_dc688ac8-1f96-4a97-adf2-151b28cca357/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Oct 09 14:55:07 crc kubenswrapper[4902]: I1009 14:55:07.042706 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-dss8m_21ce0f21-ea6d-4bed-b68b-2573a40c5443/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 09 14:55:07 crc kubenswrapper[4902]: I1009 14:55:07.205307 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-q5kpf_582550a8-6e34-4a07-97af-70e3770fedcd/ssh-known-hosts-edpm-deployment/0.log" Oct 09 14:55:07 crc kubenswrapper[4902]: I1009 14:55:07.409221 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-6568f7cff-cv7qx_797c027a-6081-4aa8-9643-ddffc4393193/proxy-server/0.log" Oct 09 14:55:07 crc kubenswrapper[4902]: I1009 14:55:07.481825 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-6568f7cff-cv7qx_797c027a-6081-4aa8-9643-ddffc4393193/proxy-httpd/0.log" Oct 09 14:55:07 crc kubenswrapper[4902]: I1009 14:55:07.557910 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-khn8g_014a8355-9817-424e-ae75-b786043b2a4c/swift-ring-rebalance/0.log" Oct 09 14:55:07 crc kubenswrapper[4902]: I1009 14:55:07.651971 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fa80b7ed-e420-455b-a918-d474c0453547/account-auditor/0.log" Oct 09 14:55:07 crc kubenswrapper[4902]: I1009 14:55:07.776702 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fa80b7ed-e420-455b-a918-d474c0453547/account-reaper/0.log" Oct 09 14:55:07 crc kubenswrapper[4902]: I1009 14:55:07.815318 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fa80b7ed-e420-455b-a918-d474c0453547/account-replicator/0.log" Oct 09 14:55:07 crc kubenswrapper[4902]: I1009 14:55:07.880629 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fa80b7ed-e420-455b-a918-d474c0453547/account-server/0.log" Oct 09 14:55:07 crc kubenswrapper[4902]: I1009 14:55:07.928775 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fa80b7ed-e420-455b-a918-d474c0453547/container-auditor/0.log" Oct 09 14:55:08 crc kubenswrapper[4902]: I1009 14:55:08.018372 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fa80b7ed-e420-455b-a918-d474c0453547/container-server/0.log" Oct 09 14:55:08 crc kubenswrapper[4902]: I1009 14:55:08.060547 4902 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_fa80b7ed-e420-455b-a918-d474c0453547/container-replicator/0.log" Oct 09 14:55:08 crc kubenswrapper[4902]: I1009 14:55:08.090548 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fa80b7ed-e420-455b-a918-d474c0453547/container-updater/0.log" Oct 09 14:55:08 crc kubenswrapper[4902]: I1009 14:55:08.186885 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fa80b7ed-e420-455b-a918-d474c0453547/object-auditor/0.log" Oct 09 14:55:08 crc kubenswrapper[4902]: I1009 14:55:08.289864 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fa80b7ed-e420-455b-a918-d474c0453547/object-expirer/0.log" Oct 09 14:55:08 crc kubenswrapper[4902]: I1009 14:55:08.314015 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fa80b7ed-e420-455b-a918-d474c0453547/object-replicator/0.log" Oct 09 14:55:08 crc kubenswrapper[4902]: I1009 14:55:08.329842 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fa80b7ed-e420-455b-a918-d474c0453547/object-server/0.log" Oct 09 14:55:08 crc kubenswrapper[4902]: I1009 14:55:08.478286 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fa80b7ed-e420-455b-a918-d474c0453547/object-updater/0.log" Oct 09 14:55:08 crc kubenswrapper[4902]: I1009 14:55:08.493133 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fa80b7ed-e420-455b-a918-d474c0453547/rsync/0.log" Oct 09 14:55:08 crc kubenswrapper[4902]: I1009 14:55:08.534838 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fa80b7ed-e420-455b-a918-d474c0453547/swift-recon-cron/0.log" Oct 09 14:55:08 crc kubenswrapper[4902]: I1009 14:55:08.836368 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-p8rnm_40011150-b1be-4ddc-8ecf-b70c54c98b9c/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Oct 09 14:55:08 crc kubenswrapper[4902]: I1009 14:55:08.847918 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_ee0ede17-c9e7-40c7-b2da-ac04b4df9010/tempest-tests-tempest-tests-runner/0.log" Oct 09 14:55:09 crc kubenswrapper[4902]: I1009 14:55:09.092336 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_b2bc8eb5-06bc-4813-86fd-e96c9f53fd94/test-operator-logs-container/0.log" Oct 09 14:55:09 crc kubenswrapper[4902]: I1009 14:55:09.106611 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-q62lf_17bda176-986b-468d-b839-a58df9d3cf58/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Oct 09 14:55:09 crc kubenswrapper[4902]: I1009 14:55:09.518860 4902 scope.go:117] "RemoveContainer" containerID="3c92344b7d9241fd1bd747690a712747aa6c548ba1f4b8118607a636810a1726" Oct 09 14:55:09 crc kubenswrapper[4902]: E1009 14:55:09.519162 4902 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gbt7s_openshift-machine-config-operator(6cfbac91-e798-4e5e-9f3c-f454ea6f457e)\"" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" podUID="6cfbac91-e798-4e5e-9f3c-f454ea6f457e" Oct 
09 14:55:19 crc kubenswrapper[4902]: I1009 14:55:19.705961 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_590e6023-7dbe-499f-a8ea-4b8c3e24f747/memcached/0.log" Oct 09 14:55:24 crc kubenswrapper[4902]: I1009 14:55:24.513819 4902 scope.go:117] "RemoveContainer" containerID="3c92344b7d9241fd1bd747690a712747aa6c548ba1f4b8118607a636810a1726" Oct 09 14:55:24 crc kubenswrapper[4902]: E1009 14:55:24.514546 4902 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gbt7s_openshift-machine-config-operator(6cfbac91-e798-4e5e-9f3c-f454ea6f457e)\"" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" podUID="6cfbac91-e798-4e5e-9f3c-f454ea6f457e" Oct 09 14:55:31 crc kubenswrapper[4902]: I1009 14:55:31.991199 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-64f84fcdbb-r4nn8_ac925db8-cb97-468e-b43f-b219deb78cf6/kube-rbac-proxy/0.log" Oct 09 14:55:32 crc kubenswrapper[4902]: I1009 14:55:32.099616 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-64f84fcdbb-r4nn8_ac925db8-cb97-468e-b43f-b219deb78cf6/manager/0.log" Oct 09 14:55:32 crc kubenswrapper[4902]: I1009 14:55:32.243494 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-59cdc64769-hrgs4_a1fcc021-b92b-417d-b92c-4e66386e8502/kube-rbac-proxy/0.log" Oct 09 14:55:32 crc kubenswrapper[4902]: I1009 14:55:32.301747 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-59cdc64769-hrgs4_a1fcc021-b92b-417d-b92c-4e66386e8502/manager/0.log" Oct 09 14:55:32 crc kubenswrapper[4902]: I1009 14:55:32.378464 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-687df44cdb-pxsfz_c27e1a63-1155-43eb-9c97-61680f083de0/kube-rbac-proxy/0.log" Oct 09 14:55:32 crc kubenswrapper[4902]: I1009 14:55:32.441529 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-687df44cdb-pxsfz_c27e1a63-1155-43eb-9c97-61680f083de0/manager/0.log" Oct 09 14:55:32 crc kubenswrapper[4902]: I1009 14:55:32.535532 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_f3be4bfbef7f7c1d153ec042185ac5a6497a28f6f13f17f0b43e1db33258rfq_4b9e579a-5f0a-4d30-8792-fd71075c1479/util/0.log" Oct 09 14:55:32 crc kubenswrapper[4902]: I1009 14:55:32.697949 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_f3be4bfbef7f7c1d153ec042185ac5a6497a28f6f13f17f0b43e1db33258rfq_4b9e579a-5f0a-4d30-8792-fd71075c1479/pull/0.log" Oct 09 14:55:32 crc kubenswrapper[4902]: I1009 14:55:32.703860 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_f3be4bfbef7f7c1d153ec042185ac5a6497a28f6f13f17f0b43e1db33258rfq_4b9e579a-5f0a-4d30-8792-fd71075c1479/util/0.log" Oct 09 14:55:32 crc kubenswrapper[4902]: I1009 14:55:32.711870 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_f3be4bfbef7f7c1d153ec042185ac5a6497a28f6f13f17f0b43e1db33258rfq_4b9e579a-5f0a-4d30-8792-fd71075c1479/pull/0.log" Oct 09 14:55:32 crc kubenswrapper[4902]: I1009 14:55:32.863498 4902 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openstack-operators_f3be4bfbef7f7c1d153ec042185ac5a6497a28f6f13f17f0b43e1db33258rfq_4b9e579a-5f0a-4d30-8792-fd71075c1479/util/0.log" Oct 09 14:55:32 crc kubenswrapper[4902]: I1009 14:55:32.884310 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_f3be4bfbef7f7c1d153ec042185ac5a6497a28f6f13f17f0b43e1db33258rfq_4b9e579a-5f0a-4d30-8792-fd71075c1479/extract/0.log" Oct 09 14:55:32 crc kubenswrapper[4902]: I1009 14:55:32.896579 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_f3be4bfbef7f7c1d153ec042185ac5a6497a28f6f13f17f0b43e1db33258rfq_4b9e579a-5f0a-4d30-8792-fd71075c1479/pull/0.log" Oct 09 14:55:33 crc kubenswrapper[4902]: I1009 14:55:33.054865 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-7bb46cd7d-2nkqk_6169ab22-9b0b-4bb3-b840-b3eb92d22c0c/kube-rbac-proxy/0.log" Oct 09 14:55:33 crc kubenswrapper[4902]: I1009 14:55:33.123215 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-6d9967f8dd-7bjrw_960aab4c-ce86-4753-b848-3367f15d962c/kube-rbac-proxy/0.log" Oct 09 14:55:33 crc kubenswrapper[4902]: I1009 14:55:33.147499 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-7bb46cd7d-2nkqk_6169ab22-9b0b-4bb3-b840-b3eb92d22c0c/manager/0.log" Oct 09 14:55:33 crc kubenswrapper[4902]: I1009 14:55:33.251787 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-6d9967f8dd-7bjrw_960aab4c-ce86-4753-b848-3367f15d962c/manager/0.log" Oct 09 14:55:33 crc kubenswrapper[4902]: I1009 14:55:33.329853 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-6d74794d9b-phbbt_39e518d9-bffd-4421-bc8b-2b333654ff9e/kube-rbac-proxy/0.log" Oct 09 14:55:33 crc kubenswrapper[4902]: I1009 14:55:33.384163 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-6d74794d9b-phbbt_39e518d9-bffd-4421-bc8b-2b333654ff9e/manager/0.log" Oct 09 14:55:33 crc kubenswrapper[4902]: I1009 14:55:33.500392 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-585fc5b659-6xw4k_92649c6e-71ba-4945-9210-19394d180222/kube-rbac-proxy/0.log" Oct 09 14:55:33 crc kubenswrapper[4902]: I1009 14:55:33.661245 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-585fc5b659-6xw4k_92649c6e-71ba-4945-9210-19394d180222/manager/0.log" Oct 09 14:55:33 crc kubenswrapper[4902]: I1009 14:55:33.699655 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-74cb5cbc49-ql5w7_dad70f9e-3fb4-41ff-95f4-dc6be5277aa0/manager/0.log" Oct 09 14:55:33 crc kubenswrapper[4902]: I1009 14:55:33.699739 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-74cb5cbc49-ql5w7_dad70f9e-3fb4-41ff-95f4-dc6be5277aa0/kube-rbac-proxy/0.log" Oct 09 14:55:33 crc kubenswrapper[4902]: I1009 14:55:33.855633 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-ddb98f99b-7bzzg_d11828c7-488d-414a-a024-68a46fca78e1/kube-rbac-proxy/0.log" Oct 09 14:55:33 crc kubenswrapper[4902]: I1009 14:55:33.929121 4902 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-ddb98f99b-7bzzg_d11828c7-488d-414a-a024-68a46fca78e1/manager/0.log" Oct 09 14:55:34 crc kubenswrapper[4902]: I1009 14:55:34.063760 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-59578bc799-kvzf6_4ad4e07d-4f69-4f1f-9886-bde91ec3b735/kube-rbac-proxy/0.log" Oct 09 14:55:34 crc kubenswrapper[4902]: I1009 14:55:34.080876 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-59578bc799-kvzf6_4ad4e07d-4f69-4f1f-9886-bde91ec3b735/manager/0.log" Oct 09 14:55:34 crc kubenswrapper[4902]: I1009 14:55:34.145335 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-5777b4f897-hxrth_07f5d370-e69d-41f8-b65a-d25dc8b38de8/kube-rbac-proxy/0.log" Oct 09 14:55:34 crc kubenswrapper[4902]: I1009 14:55:34.262255 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-5777b4f897-hxrth_07f5d370-e69d-41f8-b65a-d25dc8b38de8/manager/0.log" Oct 09 14:55:34 crc kubenswrapper[4902]: I1009 14:55:34.296724 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-797d478b46-ld6xx_0dafd5d3-f605-4c75-86ca-8d40831e9cb7/kube-rbac-proxy/0.log" Oct 09 14:55:34 crc kubenswrapper[4902]: I1009 14:55:34.380190 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-797d478b46-ld6xx_0dafd5d3-f605-4c75-86ca-8d40831e9cb7/manager/0.log" Oct 09 14:55:34 crc kubenswrapper[4902]: I1009 14:55:34.489023 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-57bb74c7bf-s7bkj_5be11d13-4feb-4a12-9f9b-69a99d2fa5a4/kube-rbac-proxy/0.log" Oct 09 14:55:34 crc kubenswrapper[4902]: I1009 14:55:34.584818 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-57bb74c7bf-s7bkj_5be11d13-4feb-4a12-9f9b-69a99d2fa5a4/manager/0.log" Oct 09 14:55:34 crc kubenswrapper[4902]: I1009 14:55:34.677152 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-6d7c7ddf95-mqn8s_cad34d91-d544-4311-a9b3-adb11e4217c0/manager/0.log" Oct 09 14:55:34 crc kubenswrapper[4902]: I1009 14:55:34.752566 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-6d7c7ddf95-mqn8s_cad34d91-d544-4311-a9b3-adb11e4217c0/kube-rbac-proxy/0.log" Oct 09 14:55:34 crc kubenswrapper[4902]: I1009 14:55:34.841209 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-6cc7fb757db22ds_d5375300-657d-4e1d-92af-2107cbc7972f/kube-rbac-proxy/0.log" Oct 09 14:55:34 crc kubenswrapper[4902]: I1009 14:55:34.881767 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-6cc7fb757db22ds_d5375300-657d-4e1d-92af-2107cbc7972f/manager/0.log" Oct 09 14:55:35 crc kubenswrapper[4902]: I1009 14:55:35.036732 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-d5f574b49-xxs9l_cf6f1e72-9e96-4905-a7f4-d88ec796724e/kube-rbac-proxy/0.log" Oct 09 14:55:35 crc 
kubenswrapper[4902]: I1009 14:55:35.169947 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-647744f6c-bzqqk_994ae404-6c3b-499c-b51d-e5a0eea83756/kube-rbac-proxy/0.log" Oct 09 14:55:35 crc kubenswrapper[4902]: I1009 14:55:35.460479 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-647744f6c-bzqqk_994ae404-6c3b-499c-b51d-e5a0eea83756/operator/0.log" Oct 09 14:55:35 crc kubenswrapper[4902]: I1009 14:55:35.474891 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-dv7t7_468c32be-1138-4600-bcd2-85aa8b02ec69/registry-server/0.log" Oct 09 14:55:35 crc kubenswrapper[4902]: I1009 14:55:35.694580 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-869cc7797f-dwgtb_3adf1a7b-f2b7-4927-a026-55afe09bc5ab/kube-rbac-proxy/0.log" Oct 09 14:55:35 crc kubenswrapper[4902]: I1009 14:55:35.765128 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-869cc7797f-dwgtb_3adf1a7b-f2b7-4927-a026-55afe09bc5ab/manager/0.log" Oct 09 14:55:35 crc kubenswrapper[4902]: I1009 14:55:35.952253 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-664664cb68-s6zmh_52ec0675-fab2-43fd-a447-8896de9e78fd/kube-rbac-proxy/0.log" Oct 09 14:55:36 crc kubenswrapper[4902]: I1009 14:55:36.031634 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-664664cb68-s6zmh_52ec0675-fab2-43fd-a447-8896de9e78fd/manager/0.log" Oct 09 14:55:36 crc kubenswrapper[4902]: I1009 14:55:36.211211 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-5f97d8c699-8jw5l_93ae4a6d-1e42-4e45-8128-61088861873e/operator/0.log" Oct 09 14:55:36 crc kubenswrapper[4902]: I1009 14:55:36.337262 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-d5f574b49-xxs9l_cf6f1e72-9e96-4905-a7f4-d88ec796724e/manager/0.log" Oct 09 14:55:36 crc kubenswrapper[4902]: I1009 14:55:36.352129 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5f4d5dfdc6-xlwpm_d972447c-10cf-4d4b-870d-11e79f6bd98a/kube-rbac-proxy/0.log" Oct 09 14:55:36 crc kubenswrapper[4902]: I1009 14:55:36.481853 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5f4d5dfdc6-xlwpm_d972447c-10cf-4d4b-870d-11e79f6bd98a/manager/0.log" Oct 09 14:55:36 crc kubenswrapper[4902]: I1009 14:55:36.507064 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-578874c84d-hn9gv_55efad12-5eb1-4c57-bb2f-700ead538209/kube-rbac-proxy/0.log" Oct 09 14:55:36 crc kubenswrapper[4902]: I1009 14:55:36.632223 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-ffcdd6c94-7lct5_183bbfe9-141b-4b7a-adc1-3ea01011ebd7/kube-rbac-proxy/0.log" Oct 09 14:55:36 crc kubenswrapper[4902]: I1009 14:55:36.693606 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-578874c84d-hn9gv_55efad12-5eb1-4c57-bb2f-700ead538209/manager/0.log" Oct 09 14:55:36 
crc kubenswrapper[4902]: I1009 14:55:36.817151 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-ffcdd6c94-7lct5_183bbfe9-141b-4b7a-adc1-3ea01011ebd7/manager/0.log" Oct 09 14:55:36 crc kubenswrapper[4902]: I1009 14:55:36.839819 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-646675d848-9lx7z_be1ef3d2-4c04-4040-9d73-80655f4b9dbb/kube-rbac-proxy/0.log" Oct 09 14:55:36 crc kubenswrapper[4902]: I1009 14:55:36.894190 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-646675d848-9lx7z_be1ef3d2-4c04-4040-9d73-80655f4b9dbb/manager/0.log" Oct 09 14:55:38 crc kubenswrapper[4902]: I1009 14:55:38.513065 4902 scope.go:117] "RemoveContainer" containerID="3c92344b7d9241fd1bd747690a712747aa6c548ba1f4b8118607a636810a1726" Oct 09 14:55:38 crc kubenswrapper[4902]: E1009 14:55:38.513688 4902 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gbt7s_openshift-machine-config-operator(6cfbac91-e798-4e5e-9f3c-f454ea6f457e)\"" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" podUID="6cfbac91-e798-4e5e-9f3c-f454ea6f457e" Oct 09 14:55:51 crc kubenswrapper[4902]: I1009 14:55:51.778424 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-pqbtl_7ae518f0-243e-4916-89cb-0e621793d4db/control-plane-machine-set-operator/0.log" Oct 09 14:55:51 crc kubenswrapper[4902]: I1009 14:55:51.940658 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-bszj2_6f99a811-543c-4b99-a394-9d941401efff/kube-rbac-proxy/0.log" Oct 09 14:55:51 crc kubenswrapper[4902]: I1009 14:55:51.969362 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-bszj2_6f99a811-543c-4b99-a394-9d941401efff/machine-api-operator/0.log" Oct 09 14:55:52 crc kubenswrapper[4902]: I1009 14:55:52.513047 4902 scope.go:117] "RemoveContainer" containerID="3c92344b7d9241fd1bd747690a712747aa6c548ba1f4b8118607a636810a1726" Oct 09 14:55:52 crc kubenswrapper[4902]: I1009 14:55:52.843500 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" event={"ID":"6cfbac91-e798-4e5e-9f3c-f454ea6f457e","Type":"ContainerStarted","Data":"dbab6b9066d83619bdeb19ac785cb63ac0c1e5fe99b941a6fef8c2cd526a174d"} Oct 09 14:56:03 crc kubenswrapper[4902]: I1009 14:56:03.352339 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-jhttd_8ae8ae73-6077-47a8-b43e-e91ab13101e6/cert-manager-controller/0.log" Oct 09 14:56:03 crc kubenswrapper[4902]: I1009 14:56:03.538721 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-xklfb_85edb63a-b99a-48b7-bdf7-285b37466b22/cert-manager-cainjector/0.log" Oct 09 14:56:03 crc kubenswrapper[4902]: I1009 14:56:03.566765 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-mn6px_425830e3-71c9-4b86-86d3-3f49d61b6cab/cert-manager-webhook/0.log" Oct 09 14:56:14 crc kubenswrapper[4902]: I1009 14:56:14.661583 4902 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-6b874cbd85-whhgs_b921f094-bf55-4b3e-8dd1-5f1d34a1336e/nmstate-console-plugin/0.log" Oct 09 14:56:14 crc kubenswrapper[4902]: I1009 14:56:14.852583 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-jddjt_fcca1450-5178-488f-8ba6-b290ea61a2fb/nmstate-handler/0.log" Oct 09 14:56:14 crc kubenswrapper[4902]: I1009 14:56:14.915140 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-fdff9cb8d-t5rg7_a526cd44-35b6-4800-bb53-fc7e1e6d96f8/kube-rbac-proxy/0.log" Oct 09 14:56:15 crc kubenswrapper[4902]: I1009 14:56:15.022704 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-fdff9cb8d-t5rg7_a526cd44-35b6-4800-bb53-fc7e1e6d96f8/nmstate-metrics/0.log" Oct 09 14:56:15 crc kubenswrapper[4902]: I1009 14:56:15.136098 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-858ddd8f98-rt479_e458690a-7a6b-4b1f-92e3-a93667bf1d60/nmstate-operator/0.log" Oct 09 14:56:15 crc kubenswrapper[4902]: I1009 14:56:15.252964 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-6cdbc54649-45tq4_47ba7105-e136-4d4e-8db2-5bb2edfb5a7b/nmstate-webhook/0.log" Oct 09 14:56:28 crc kubenswrapper[4902]: I1009 14:56:28.136821 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-68d546b9d8-m47st_bf00f4f5-2086-46a3-b460-f55dd00e2507/kube-rbac-proxy/0.log" Oct 09 14:56:28 crc kubenswrapper[4902]: I1009 14:56:28.266684 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-68d546b9d8-m47st_bf00f4f5-2086-46a3-b460-f55dd00e2507/controller/0.log" Oct 09 14:56:28 crc kubenswrapper[4902]: I1009 14:56:28.336646 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-sw9n6_693846cb-0606-4818-b246-e6940fa26802/cp-frr-files/0.log" Oct 09 14:56:28 crc kubenswrapper[4902]: I1009 14:56:28.535784 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-sw9n6_693846cb-0606-4818-b246-e6940fa26802/cp-frr-files/0.log" Oct 09 14:56:28 crc kubenswrapper[4902]: I1009 14:56:28.535926 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-sw9n6_693846cb-0606-4818-b246-e6940fa26802/cp-metrics/0.log" Oct 09 14:56:28 crc kubenswrapper[4902]: I1009 14:56:28.558185 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-sw9n6_693846cb-0606-4818-b246-e6940fa26802/cp-reloader/0.log" Oct 09 14:56:28 crc kubenswrapper[4902]: I1009 14:56:28.581725 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-sw9n6_693846cb-0606-4818-b246-e6940fa26802/cp-reloader/0.log" Oct 09 14:56:28 crc kubenswrapper[4902]: I1009 14:56:28.736891 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-sw9n6_693846cb-0606-4818-b246-e6940fa26802/cp-metrics/0.log" Oct 09 14:56:28 crc kubenswrapper[4902]: I1009 14:56:28.742063 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-sw9n6_693846cb-0606-4818-b246-e6940fa26802/cp-frr-files/0.log" Oct 09 14:56:28 crc kubenswrapper[4902]: I1009 14:56:28.764533 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-sw9n6_693846cb-0606-4818-b246-e6940fa26802/cp-reloader/0.log" Oct 09 14:56:28 crc kubenswrapper[4902]: I1009 14:56:28.847968 4902 log.go:25] 
"Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-sw9n6_693846cb-0606-4818-b246-e6940fa26802/cp-metrics/0.log" Oct 09 14:56:29 crc kubenswrapper[4902]: I1009 14:56:29.490309 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-sw9n6_693846cb-0606-4818-b246-e6940fa26802/controller/0.log" Oct 09 14:56:29 crc kubenswrapper[4902]: I1009 14:56:29.508079 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-sw9n6_693846cb-0606-4818-b246-e6940fa26802/cp-reloader/0.log" Oct 09 14:56:29 crc kubenswrapper[4902]: I1009 14:56:29.524966 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-sw9n6_693846cb-0606-4818-b246-e6940fa26802/cp-frr-files/0.log" Oct 09 14:56:29 crc kubenswrapper[4902]: I1009 14:56:29.540838 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-sw9n6_693846cb-0606-4818-b246-e6940fa26802/cp-metrics/0.log" Oct 09 14:56:29 crc kubenswrapper[4902]: I1009 14:56:29.712478 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-sw9n6_693846cb-0606-4818-b246-e6940fa26802/frr-metrics/0.log" Oct 09 14:56:29 crc kubenswrapper[4902]: I1009 14:56:29.721703 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-sw9n6_693846cb-0606-4818-b246-e6940fa26802/kube-rbac-proxy/0.log" Oct 09 14:56:29 crc kubenswrapper[4902]: I1009 14:56:29.794457 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-sw9n6_693846cb-0606-4818-b246-e6940fa26802/kube-rbac-proxy-frr/0.log" Oct 09 14:56:29 crc kubenswrapper[4902]: I1009 14:56:29.943070 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-sw9n6_693846cb-0606-4818-b246-e6940fa26802/reloader/0.log" Oct 09 14:56:30 crc kubenswrapper[4902]: I1009 14:56:30.033507 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-64bf5d555-v4t7k_c788e9d2-cc9c-4dd8-b65d-f422358e0510/frr-k8s-webhook-server/0.log" Oct 09 14:56:30 crc kubenswrapper[4902]: I1009 14:56:30.335082 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-69f9c58987-qjtns_04adaa94-05f3-4989-b5fa-a057f556aa56/manager/0.log" Oct 09 14:56:30 crc kubenswrapper[4902]: I1009 14:56:30.487488 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-56b4cd547-vqwzj_17bd8034-bc7c-4eaa-9f47-74ca097940bd/webhook-server/0.log" Oct 09 14:56:30 crc kubenswrapper[4902]: I1009 14:56:30.596509 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-pbqrp_b400d066-a3bb-4b85-aaa1-7ddca808de2e/kube-rbac-proxy/0.log" Oct 09 14:56:31 crc kubenswrapper[4902]: I1009 14:56:31.083279 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-sw9n6_693846cb-0606-4818-b246-e6940fa26802/frr/0.log" Oct 09 14:56:31 crc kubenswrapper[4902]: I1009 14:56:31.140363 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-pbqrp_b400d066-a3bb-4b85-aaa1-7ddca808de2e/speaker/0.log" Oct 09 14:56:41 crc kubenswrapper[4902]: I1009 14:56:41.932130 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2qdm9l_13b857de-39a5-412b-b9a6-bd26a961d189/util/0.log" Oct 09 14:56:42 crc kubenswrapper[4902]: I1009 14:56:42.107459 4902 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2qdm9l_13b857de-39a5-412b-b9a6-bd26a961d189/util/0.log" Oct 09 14:56:42 crc kubenswrapper[4902]: I1009 14:56:42.116308 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2qdm9l_13b857de-39a5-412b-b9a6-bd26a961d189/pull/0.log" Oct 09 14:56:42 crc kubenswrapper[4902]: I1009 14:56:42.116486 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2qdm9l_13b857de-39a5-412b-b9a6-bd26a961d189/pull/0.log" Oct 09 14:56:42 crc kubenswrapper[4902]: I1009 14:56:42.271349 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2qdm9l_13b857de-39a5-412b-b9a6-bd26a961d189/util/0.log" Oct 09 14:56:42 crc kubenswrapper[4902]: I1009 14:56:42.296991 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2qdm9l_13b857de-39a5-412b-b9a6-bd26a961d189/pull/0.log" Oct 09 14:56:42 crc kubenswrapper[4902]: I1009 14:56:42.297125 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2qdm9l_13b857de-39a5-412b-b9a6-bd26a961d189/extract/0.log" Oct 09 14:56:42 crc kubenswrapper[4902]: I1009 14:56:42.447247 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-hpxl7_0aa7377e-9f5a-411d-a20d-a134a5735eda/extract-utilities/0.log" Oct 09 14:56:42 crc kubenswrapper[4902]: I1009 14:56:42.625571 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-hpxl7_0aa7377e-9f5a-411d-a20d-a134a5735eda/extract-content/0.log" Oct 09 14:56:42 crc kubenswrapper[4902]: I1009 14:56:42.627911 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-hpxl7_0aa7377e-9f5a-411d-a20d-a134a5735eda/extract-utilities/0.log" Oct 09 14:56:42 crc kubenswrapper[4902]: I1009 14:56:42.640474 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-hpxl7_0aa7377e-9f5a-411d-a20d-a134a5735eda/extract-content/0.log" Oct 09 14:56:42 crc kubenswrapper[4902]: I1009 14:56:42.800645 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-hpxl7_0aa7377e-9f5a-411d-a20d-a134a5735eda/extract-content/0.log" Oct 09 14:56:42 crc kubenswrapper[4902]: I1009 14:56:42.803215 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-hpxl7_0aa7377e-9f5a-411d-a20d-a134a5735eda/extract-utilities/0.log" Oct 09 14:56:43 crc kubenswrapper[4902]: I1009 14:56:43.018363 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-j2rjw_13822386-0dae-4414-b5e3-f2bc758f6948/extract-utilities/0.log" Oct 09 14:56:43 crc kubenswrapper[4902]: I1009 14:56:43.266166 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-j2rjw_13822386-0dae-4414-b5e3-f2bc758f6948/extract-utilities/0.log" Oct 09 14:56:43 crc kubenswrapper[4902]: I1009 14:56:43.328281 4902 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-j2rjw_13822386-0dae-4414-b5e3-f2bc758f6948/extract-content/0.log" Oct 09 14:56:43 crc kubenswrapper[4902]: I1009 14:56:43.354864 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-j2rjw_13822386-0dae-4414-b5e3-f2bc758f6948/extract-content/0.log" Oct 09 14:56:43 crc kubenswrapper[4902]: I1009 14:56:43.391775 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-hpxl7_0aa7377e-9f5a-411d-a20d-a134a5735eda/registry-server/0.log" Oct 09 14:56:43 crc kubenswrapper[4902]: I1009 14:56:43.518960 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-j2rjw_13822386-0dae-4414-b5e3-f2bc758f6948/extract-utilities/0.log" Oct 09 14:56:43 crc kubenswrapper[4902]: I1009 14:56:43.519058 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-j2rjw_13822386-0dae-4414-b5e3-f2bc758f6948/extract-content/0.log" Oct 09 14:56:43 crc kubenswrapper[4902]: I1009 14:56:43.698153 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cj5c5n_3b4fc06f-f461-4486-83e7-4d153cc9ef10/util/0.log" Oct 09 14:56:43 crc kubenswrapper[4902]: I1009 14:56:43.964650 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cj5c5n_3b4fc06f-f461-4486-83e7-4d153cc9ef10/pull/0.log" Oct 09 14:56:44 crc kubenswrapper[4902]: I1009 14:56:44.026949 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cj5c5n_3b4fc06f-f461-4486-83e7-4d153cc9ef10/pull/0.log" Oct 09 14:56:44 crc kubenswrapper[4902]: I1009 14:56:44.054124 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cj5c5n_3b4fc06f-f461-4486-83e7-4d153cc9ef10/util/0.log" Oct 09 14:56:44 crc kubenswrapper[4902]: I1009 14:56:44.212858 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-j2rjw_13822386-0dae-4414-b5e3-f2bc758f6948/registry-server/0.log" Oct 09 14:56:44 crc kubenswrapper[4902]: I1009 14:56:44.290928 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cj5c5n_3b4fc06f-f461-4486-83e7-4d153cc9ef10/util/0.log" Oct 09 14:56:44 crc kubenswrapper[4902]: I1009 14:56:44.293588 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cj5c5n_3b4fc06f-f461-4486-83e7-4d153cc9ef10/pull/0.log" Oct 09 14:56:44 crc kubenswrapper[4902]: I1009 14:56:44.316590 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cj5c5n_3b4fc06f-f461-4486-83e7-4d153cc9ef10/extract/0.log" Oct 09 14:56:44 crc kubenswrapper[4902]: I1009 14:56:44.457241 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-bb67d_3b0cc6d4-31a9-4f2f-90ab-2cb6676e61b6/marketplace-operator/0.log" Oct 09 14:56:44 crc kubenswrapper[4902]: I1009 14:56:44.565722 4902 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-mq2n8_3f34ca09-a125-4ad0-a8ea-fe6c0791bb5e/extract-utilities/0.log" Oct 09 14:56:44 crc kubenswrapper[4902]: I1009 14:56:44.695378 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-mq2n8_3f34ca09-a125-4ad0-a8ea-fe6c0791bb5e/extract-utilities/0.log" Oct 09 14:56:44 crc kubenswrapper[4902]: I1009 14:56:44.697017 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-mq2n8_3f34ca09-a125-4ad0-a8ea-fe6c0791bb5e/extract-content/0.log" Oct 09 14:56:44 crc kubenswrapper[4902]: I1009 14:56:44.703919 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-mq2n8_3f34ca09-a125-4ad0-a8ea-fe6c0791bb5e/extract-content/0.log" Oct 09 14:56:44 crc kubenswrapper[4902]: I1009 14:56:44.925567 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-mq2n8_3f34ca09-a125-4ad0-a8ea-fe6c0791bb5e/extract-utilities/0.log" Oct 09 14:56:44 crc kubenswrapper[4902]: I1009 14:56:44.929062 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-mq2n8_3f34ca09-a125-4ad0-a8ea-fe6c0791bb5e/extract-content/0.log" Oct 09 14:56:45 crc kubenswrapper[4902]: I1009 14:56:45.145906 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-mq2n8_3f34ca09-a125-4ad0-a8ea-fe6c0791bb5e/registry-server/0.log" Oct 09 14:56:45 crc kubenswrapper[4902]: I1009 14:56:45.156697 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-z5k86_1d982b85-de80-4c77-82fc-8c4622cbd203/extract-utilities/0.log" Oct 09 14:56:45 crc kubenswrapper[4902]: I1009 14:56:45.312649 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-z5k86_1d982b85-de80-4c77-82fc-8c4622cbd203/extract-content/0.log" Oct 09 14:56:45 crc kubenswrapper[4902]: I1009 14:56:45.319602 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-z5k86_1d982b85-de80-4c77-82fc-8c4622cbd203/extract-content/0.log" Oct 09 14:56:45 crc kubenswrapper[4902]: I1009 14:56:45.342632 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-z5k86_1d982b85-de80-4c77-82fc-8c4622cbd203/extract-utilities/0.log" Oct 09 14:56:45 crc kubenswrapper[4902]: I1009 14:56:45.561563 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-z5k86_1d982b85-de80-4c77-82fc-8c4622cbd203/extract-content/0.log" Oct 09 14:56:45 crc kubenswrapper[4902]: I1009 14:56:45.561591 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-z5k86_1d982b85-de80-4c77-82fc-8c4622cbd203/extract-utilities/0.log" Oct 09 14:56:46 crc kubenswrapper[4902]: I1009 14:56:46.083678 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-z5k86_1d982b85-de80-4c77-82fc-8c4622cbd203/registry-server/0.log" Oct 09 14:57:55 crc kubenswrapper[4902]: I1009 14:57:55.761678 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-nd77w"] Oct 09 14:57:55 crc kubenswrapper[4902]: E1009 14:57:55.766578 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e18caf89-138a-4b99-986f-32e57e94bb90" containerName="container-00" Oct 09 14:57:55 
crc kubenswrapper[4902]: I1009 14:57:55.766865 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="e18caf89-138a-4b99-986f-32e57e94bb90" containerName="container-00" Oct 09 14:57:55 crc kubenswrapper[4902]: I1009 14:57:55.767158 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="e18caf89-138a-4b99-986f-32e57e94bb90" containerName="container-00" Oct 09 14:57:55 crc kubenswrapper[4902]: I1009 14:57:55.768635 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-nd77w" Oct 09 14:57:55 crc kubenswrapper[4902]: I1009 14:57:55.776347 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-nd77w"] Oct 09 14:57:55 crc kubenswrapper[4902]: I1009 14:57:55.930075 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s2994\" (UniqueName: \"kubernetes.io/projected/47bc6fe6-37e1-4c9b-9e23-4b72a47e72c2-kube-api-access-s2994\") pod \"redhat-operators-nd77w\" (UID: \"47bc6fe6-37e1-4c9b-9e23-4b72a47e72c2\") " pod="openshift-marketplace/redhat-operators-nd77w" Oct 09 14:57:55 crc kubenswrapper[4902]: I1009 14:57:55.930159 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/47bc6fe6-37e1-4c9b-9e23-4b72a47e72c2-utilities\") pod \"redhat-operators-nd77w\" (UID: \"47bc6fe6-37e1-4c9b-9e23-4b72a47e72c2\") " pod="openshift-marketplace/redhat-operators-nd77w" Oct 09 14:57:55 crc kubenswrapper[4902]: I1009 14:57:55.930331 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/47bc6fe6-37e1-4c9b-9e23-4b72a47e72c2-catalog-content\") pod \"redhat-operators-nd77w\" (UID: \"47bc6fe6-37e1-4c9b-9e23-4b72a47e72c2\") " pod="openshift-marketplace/redhat-operators-nd77w" Oct 09 14:57:56 crc kubenswrapper[4902]: I1009 14:57:56.032266 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2994\" (UniqueName: \"kubernetes.io/projected/47bc6fe6-37e1-4c9b-9e23-4b72a47e72c2-kube-api-access-s2994\") pod \"redhat-operators-nd77w\" (UID: \"47bc6fe6-37e1-4c9b-9e23-4b72a47e72c2\") " pod="openshift-marketplace/redhat-operators-nd77w" Oct 09 14:57:56 crc kubenswrapper[4902]: I1009 14:57:56.032336 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/47bc6fe6-37e1-4c9b-9e23-4b72a47e72c2-utilities\") pod \"redhat-operators-nd77w\" (UID: \"47bc6fe6-37e1-4c9b-9e23-4b72a47e72c2\") " pod="openshift-marketplace/redhat-operators-nd77w" Oct 09 14:57:56 crc kubenswrapper[4902]: I1009 14:57:56.032399 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/47bc6fe6-37e1-4c9b-9e23-4b72a47e72c2-catalog-content\") pod \"redhat-operators-nd77w\" (UID: \"47bc6fe6-37e1-4c9b-9e23-4b72a47e72c2\") " pod="openshift-marketplace/redhat-operators-nd77w" Oct 09 14:57:56 crc kubenswrapper[4902]: I1009 14:57:56.032935 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/47bc6fe6-37e1-4c9b-9e23-4b72a47e72c2-utilities\") pod \"redhat-operators-nd77w\" (UID: \"47bc6fe6-37e1-4c9b-9e23-4b72a47e72c2\") " pod="openshift-marketplace/redhat-operators-nd77w" Oct 09 14:57:56 crc kubenswrapper[4902]: I1009 
14:57:56.033091 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/47bc6fe6-37e1-4c9b-9e23-4b72a47e72c2-catalog-content\") pod \"redhat-operators-nd77w\" (UID: \"47bc6fe6-37e1-4c9b-9e23-4b72a47e72c2\") " pod="openshift-marketplace/redhat-operators-nd77w" Oct 09 14:57:56 crc kubenswrapper[4902]: I1009 14:57:56.620510 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2994\" (UniqueName: \"kubernetes.io/projected/47bc6fe6-37e1-4c9b-9e23-4b72a47e72c2-kube-api-access-s2994\") pod \"redhat-operators-nd77w\" (UID: \"47bc6fe6-37e1-4c9b-9e23-4b72a47e72c2\") " pod="openshift-marketplace/redhat-operators-nd77w" Oct 09 14:57:56 crc kubenswrapper[4902]: I1009 14:57:56.686186 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-nd77w" Oct 09 14:57:57 crc kubenswrapper[4902]: I1009 14:57:57.171957 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-nd77w"] Oct 09 14:57:57 crc kubenswrapper[4902]: I1009 14:57:57.984064 4902 generic.go:334] "Generic (PLEG): container finished" podID="47bc6fe6-37e1-4c9b-9e23-4b72a47e72c2" containerID="9e81935b7e29afc8cc4d6dffdf119fcc036d828079ba7d2f403eb5c0975a72ff" exitCode=0 Oct 09 14:57:57 crc kubenswrapper[4902]: I1009 14:57:57.984118 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nd77w" event={"ID":"47bc6fe6-37e1-4c9b-9e23-4b72a47e72c2","Type":"ContainerDied","Data":"9e81935b7e29afc8cc4d6dffdf119fcc036d828079ba7d2f403eb5c0975a72ff"} Oct 09 14:57:57 crc kubenswrapper[4902]: I1009 14:57:57.984536 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nd77w" event={"ID":"47bc6fe6-37e1-4c9b-9e23-4b72a47e72c2","Type":"ContainerStarted","Data":"65d1b3996a92aa97b10cf9c8e96550fd96c235958417746da8a57e702e966b5e"} Oct 09 14:57:59 crc kubenswrapper[4902]: I1009 14:57:59.001012 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nd77w" event={"ID":"47bc6fe6-37e1-4c9b-9e23-4b72a47e72c2","Type":"ContainerStarted","Data":"fe633d1bfce24ccfebdd8aa8d961d1c57d6e36ef5de8208d6b7ee1852a7d5d65"} Oct 09 14:58:00 crc kubenswrapper[4902]: I1009 14:58:00.014925 4902 generic.go:334] "Generic (PLEG): container finished" podID="47bc6fe6-37e1-4c9b-9e23-4b72a47e72c2" containerID="fe633d1bfce24ccfebdd8aa8d961d1c57d6e36ef5de8208d6b7ee1852a7d5d65" exitCode=0 Oct 09 14:58:00 crc kubenswrapper[4902]: I1009 14:58:00.015112 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nd77w" event={"ID":"47bc6fe6-37e1-4c9b-9e23-4b72a47e72c2","Type":"ContainerDied","Data":"fe633d1bfce24ccfebdd8aa8d961d1c57d6e36ef5de8208d6b7ee1852a7d5d65"} Oct 09 14:58:01 crc kubenswrapper[4902]: I1009 14:58:01.025694 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nd77w" event={"ID":"47bc6fe6-37e1-4c9b-9e23-4b72a47e72c2","Type":"ContainerStarted","Data":"4c8845b3f5cb2bbe53dd9cdf335d6b5a03f7c7dcbe38258c3f2d5affa1d595c2"} Oct 09 14:58:01 crc kubenswrapper[4902]: I1009 14:58:01.046658 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-nd77w" podStartSLOduration=3.417845392 podStartE2EDuration="6.046638632s" podCreationTimestamp="2025-10-09 14:57:55 +0000 UTC" firstStartedPulling="2025-10-09 
14:57:57.986322234 +0000 UTC m=+4025.184181298" lastFinishedPulling="2025-10-09 14:58:00.615115474 +0000 UTC m=+4027.812974538" observedRunningTime="2025-10-09 14:58:01.040560707 +0000 UTC m=+4028.238419771" watchObservedRunningTime="2025-10-09 14:58:01.046638632 +0000 UTC m=+4028.244497696" Oct 09 14:58:06 crc kubenswrapper[4902]: I1009 14:58:06.687670 4902 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-nd77w" Oct 09 14:58:06 crc kubenswrapper[4902]: I1009 14:58:06.688293 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-nd77w" Oct 09 14:58:06 crc kubenswrapper[4902]: I1009 14:58:06.735360 4902 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-nd77w" Oct 09 14:58:07 crc kubenswrapper[4902]: I1009 14:58:07.555093 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-nd77w" Oct 09 14:58:07 crc kubenswrapper[4902]: I1009 14:58:07.608023 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-nd77w"] Oct 09 14:58:09 crc kubenswrapper[4902]: I1009 14:58:09.091811 4902 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-nd77w" podUID="47bc6fe6-37e1-4c9b-9e23-4b72a47e72c2" containerName="registry-server" containerID="cri-o://4c8845b3f5cb2bbe53dd9cdf335d6b5a03f7c7dcbe38258c3f2d5affa1d595c2" gracePeriod=2 Oct 09 14:58:09 crc kubenswrapper[4902]: I1009 14:58:09.600972 4902 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-nd77w" Oct 09 14:58:09 crc kubenswrapper[4902]: I1009 14:58:09.698755 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/47bc6fe6-37e1-4c9b-9e23-4b72a47e72c2-utilities\") pod \"47bc6fe6-37e1-4c9b-9e23-4b72a47e72c2\" (UID: \"47bc6fe6-37e1-4c9b-9e23-4b72a47e72c2\") " Oct 09 14:58:09 crc kubenswrapper[4902]: I1009 14:58:09.698943 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s2994\" (UniqueName: \"kubernetes.io/projected/47bc6fe6-37e1-4c9b-9e23-4b72a47e72c2-kube-api-access-s2994\") pod \"47bc6fe6-37e1-4c9b-9e23-4b72a47e72c2\" (UID: \"47bc6fe6-37e1-4c9b-9e23-4b72a47e72c2\") " Oct 09 14:58:09 crc kubenswrapper[4902]: I1009 14:58:09.699001 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/47bc6fe6-37e1-4c9b-9e23-4b72a47e72c2-catalog-content\") pod \"47bc6fe6-37e1-4c9b-9e23-4b72a47e72c2\" (UID: \"47bc6fe6-37e1-4c9b-9e23-4b72a47e72c2\") " Oct 09 14:58:09 crc kubenswrapper[4902]: I1009 14:58:09.699916 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/47bc6fe6-37e1-4c9b-9e23-4b72a47e72c2-utilities" (OuterVolumeSpecName: "utilities") pod "47bc6fe6-37e1-4c9b-9e23-4b72a47e72c2" (UID: "47bc6fe6-37e1-4c9b-9e23-4b72a47e72c2"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 14:58:09 crc kubenswrapper[4902]: I1009 14:58:09.704323 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/47bc6fe6-37e1-4c9b-9e23-4b72a47e72c2-kube-api-access-s2994" (OuterVolumeSpecName: "kube-api-access-s2994") pod "47bc6fe6-37e1-4c9b-9e23-4b72a47e72c2" (UID: "47bc6fe6-37e1-4c9b-9e23-4b72a47e72c2"). InnerVolumeSpecName "kube-api-access-s2994". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 14:58:09 crc kubenswrapper[4902]: I1009 14:58:09.802873 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s2994\" (UniqueName: \"kubernetes.io/projected/47bc6fe6-37e1-4c9b-9e23-4b72a47e72c2-kube-api-access-s2994\") on node \"crc\" DevicePath \"\"" Oct 09 14:58:09 crc kubenswrapper[4902]: I1009 14:58:09.802912 4902 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/47bc6fe6-37e1-4c9b-9e23-4b72a47e72c2-utilities\") on node \"crc\" DevicePath \"\"" Oct 09 14:58:10 crc kubenswrapper[4902]: I1009 14:58:10.103252 4902 generic.go:334] "Generic (PLEG): container finished" podID="47bc6fe6-37e1-4c9b-9e23-4b72a47e72c2" containerID="4c8845b3f5cb2bbe53dd9cdf335d6b5a03f7c7dcbe38258c3f2d5affa1d595c2" exitCode=0 Oct 09 14:58:10 crc kubenswrapper[4902]: I1009 14:58:10.103338 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nd77w" event={"ID":"47bc6fe6-37e1-4c9b-9e23-4b72a47e72c2","Type":"ContainerDied","Data":"4c8845b3f5cb2bbe53dd9cdf335d6b5a03f7c7dcbe38258c3f2d5affa1d595c2"} Oct 09 14:58:10 crc kubenswrapper[4902]: I1009 14:58:10.103343 4902 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-nd77w" Oct 09 14:58:10 crc kubenswrapper[4902]: I1009 14:58:10.103393 4902 scope.go:117] "RemoveContainer" containerID="4c8845b3f5cb2bbe53dd9cdf335d6b5a03f7c7dcbe38258c3f2d5affa1d595c2" Oct 09 14:58:10 crc kubenswrapper[4902]: I1009 14:58:10.103379 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nd77w" event={"ID":"47bc6fe6-37e1-4c9b-9e23-4b72a47e72c2","Type":"ContainerDied","Data":"65d1b3996a92aa97b10cf9c8e96550fd96c235958417746da8a57e702e966b5e"} Oct 09 14:58:10 crc kubenswrapper[4902]: I1009 14:58:10.128231 4902 scope.go:117] "RemoveContainer" containerID="fe633d1bfce24ccfebdd8aa8d961d1c57d6e36ef5de8208d6b7ee1852a7d5d65" Oct 09 14:58:10 crc kubenswrapper[4902]: I1009 14:58:10.149073 4902 scope.go:117] "RemoveContainer" containerID="9e81935b7e29afc8cc4d6dffdf119fcc036d828079ba7d2f403eb5c0975a72ff" Oct 09 14:58:10 crc kubenswrapper[4902]: I1009 14:58:10.196236 4902 scope.go:117] "RemoveContainer" containerID="4c8845b3f5cb2bbe53dd9cdf335d6b5a03f7c7dcbe38258c3f2d5affa1d595c2" Oct 09 14:58:10 crc kubenswrapper[4902]: E1009 14:58:10.196697 4902 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4c8845b3f5cb2bbe53dd9cdf335d6b5a03f7c7dcbe38258c3f2d5affa1d595c2\": container with ID starting with 4c8845b3f5cb2bbe53dd9cdf335d6b5a03f7c7dcbe38258c3f2d5affa1d595c2 not found: ID does not exist" containerID="4c8845b3f5cb2bbe53dd9cdf335d6b5a03f7c7dcbe38258c3f2d5affa1d595c2" Oct 09 14:58:10 crc kubenswrapper[4902]: I1009 14:58:10.196730 4902 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"4c8845b3f5cb2bbe53dd9cdf335d6b5a03f7c7dcbe38258c3f2d5affa1d595c2"} err="failed to get container status \"4c8845b3f5cb2bbe53dd9cdf335d6b5a03f7c7dcbe38258c3f2d5affa1d595c2\": rpc error: code = NotFound desc = could not find container \"4c8845b3f5cb2bbe53dd9cdf335d6b5a03f7c7dcbe38258c3f2d5affa1d595c2\": container with ID starting with 4c8845b3f5cb2bbe53dd9cdf335d6b5a03f7c7dcbe38258c3f2d5affa1d595c2 not found: ID does not exist" Oct 09 14:58:10 crc kubenswrapper[4902]: I1009 14:58:10.196756 4902 scope.go:117] "RemoveContainer" containerID="fe633d1bfce24ccfebdd8aa8d961d1c57d6e36ef5de8208d6b7ee1852a7d5d65" Oct 09 14:58:10 crc kubenswrapper[4902]: E1009 14:58:10.197054 4902 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fe633d1bfce24ccfebdd8aa8d961d1c57d6e36ef5de8208d6b7ee1852a7d5d65\": container with ID starting with fe633d1bfce24ccfebdd8aa8d961d1c57d6e36ef5de8208d6b7ee1852a7d5d65 not found: ID does not exist" containerID="fe633d1bfce24ccfebdd8aa8d961d1c57d6e36ef5de8208d6b7ee1852a7d5d65" Oct 09 14:58:10 crc kubenswrapper[4902]: I1009 14:58:10.197083 4902 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe633d1bfce24ccfebdd8aa8d961d1c57d6e36ef5de8208d6b7ee1852a7d5d65"} err="failed to get container status \"fe633d1bfce24ccfebdd8aa8d961d1c57d6e36ef5de8208d6b7ee1852a7d5d65\": rpc error: code = NotFound desc = could not find container \"fe633d1bfce24ccfebdd8aa8d961d1c57d6e36ef5de8208d6b7ee1852a7d5d65\": container with ID starting with fe633d1bfce24ccfebdd8aa8d961d1c57d6e36ef5de8208d6b7ee1852a7d5d65 not found: ID does not exist" Oct 09 14:58:10 crc kubenswrapper[4902]: I1009 14:58:10.197101 4902 scope.go:117] "RemoveContainer" containerID="9e81935b7e29afc8cc4d6dffdf119fcc036d828079ba7d2f403eb5c0975a72ff" Oct 09 14:58:10 crc kubenswrapper[4902]: E1009 14:58:10.197517 4902 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9e81935b7e29afc8cc4d6dffdf119fcc036d828079ba7d2f403eb5c0975a72ff\": container with ID starting with 9e81935b7e29afc8cc4d6dffdf119fcc036d828079ba7d2f403eb5c0975a72ff not found: ID does not exist" containerID="9e81935b7e29afc8cc4d6dffdf119fcc036d828079ba7d2f403eb5c0975a72ff" Oct 09 14:58:10 crc kubenswrapper[4902]: I1009 14:58:10.197539 4902 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e81935b7e29afc8cc4d6dffdf119fcc036d828079ba7d2f403eb5c0975a72ff"} err="failed to get container status \"9e81935b7e29afc8cc4d6dffdf119fcc036d828079ba7d2f403eb5c0975a72ff\": rpc error: code = NotFound desc = could not find container \"9e81935b7e29afc8cc4d6dffdf119fcc036d828079ba7d2f403eb5c0975a72ff\": container with ID starting with 9e81935b7e29afc8cc4d6dffdf119fcc036d828079ba7d2f403eb5c0975a72ff not found: ID does not exist" Oct 09 14:58:10 crc kubenswrapper[4902]: I1009 14:58:10.407730 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/47bc6fe6-37e1-4c9b-9e23-4b72a47e72c2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "47bc6fe6-37e1-4c9b-9e23-4b72a47e72c2" (UID: "47bc6fe6-37e1-4c9b-9e23-4b72a47e72c2"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 14:58:10 crc kubenswrapper[4902]: I1009 14:58:10.414229 4902 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/47bc6fe6-37e1-4c9b-9e23-4b72a47e72c2-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 09 14:58:10 crc kubenswrapper[4902]: I1009 14:58:10.741088 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-nd77w"] Oct 09 14:58:10 crc kubenswrapper[4902]: I1009 14:58:10.750187 4902 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-nd77w"] Oct 09 14:58:11 crc kubenswrapper[4902]: I1009 14:58:11.525841 4902 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="47bc6fe6-37e1-4c9b-9e23-4b72a47e72c2" path="/var/lib/kubelet/pods/47bc6fe6-37e1-4c9b-9e23-4b72a47e72c2/volumes" Oct 09 14:58:20 crc kubenswrapper[4902]: I1009 14:58:20.078282 4902 patch_prober.go:28] interesting pod/machine-config-daemon-gbt7s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 14:58:20 crc kubenswrapper[4902]: I1009 14:58:20.078954 4902 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" podUID="6cfbac91-e798-4e5e-9f3c-f454ea6f457e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 14:58:28 crc kubenswrapper[4902]: I1009 14:58:28.282206 4902 generic.go:334] "Generic (PLEG): container finished" podID="09f92dfe-37ba-4c41-afbd-15a01e30b414" containerID="64ebfd322d525ff06037c709a5ed29808ac707a505184b4f35dd454688207f18" exitCode=0 Oct 09 14:58:28 crc kubenswrapper[4902]: I1009 14:58:28.282300 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-glgsr/must-gather-pr2cj" event={"ID":"09f92dfe-37ba-4c41-afbd-15a01e30b414","Type":"ContainerDied","Data":"64ebfd322d525ff06037c709a5ed29808ac707a505184b4f35dd454688207f18"} Oct 09 14:58:28 crc kubenswrapper[4902]: I1009 14:58:28.283318 4902 scope.go:117] "RemoveContainer" containerID="64ebfd322d525ff06037c709a5ed29808ac707a505184b4f35dd454688207f18" Oct 09 14:58:28 crc kubenswrapper[4902]: I1009 14:58:28.826383 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-glgsr_must-gather-pr2cj_09f92dfe-37ba-4c41-afbd-15a01e30b414/gather/0.log" Oct 09 14:58:37 crc kubenswrapper[4902]: I1009 14:58:37.722890 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-glgsr/must-gather-pr2cj"] Oct 09 14:58:37 crc kubenswrapper[4902]: I1009 14:58:37.723726 4902 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-glgsr/must-gather-pr2cj" podUID="09f92dfe-37ba-4c41-afbd-15a01e30b414" containerName="copy" containerID="cri-o://c43e0c00d2325a349736d5c03c107bb92f07018b051e9cce55f74827d53f9c07" gracePeriod=2 Oct 09 14:58:37 crc kubenswrapper[4902]: I1009 14:58:37.733766 4902 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-glgsr/must-gather-pr2cj"] Oct 09 14:58:37 crc kubenswrapper[4902]: I1009 14:58:37.813200 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-vbj5t"] Oct 09 14:58:37 crc kubenswrapper[4902]: 
E1009 14:58:37.813652 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09f92dfe-37ba-4c41-afbd-15a01e30b414" containerName="gather" Oct 09 14:58:37 crc kubenswrapper[4902]: I1009 14:58:37.813672 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="09f92dfe-37ba-4c41-afbd-15a01e30b414" containerName="gather" Oct 09 14:58:37 crc kubenswrapper[4902]: E1009 14:58:37.813691 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47bc6fe6-37e1-4c9b-9e23-4b72a47e72c2" containerName="extract-utilities" Oct 09 14:58:37 crc kubenswrapper[4902]: I1009 14:58:37.813701 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="47bc6fe6-37e1-4c9b-9e23-4b72a47e72c2" containerName="extract-utilities" Oct 09 14:58:37 crc kubenswrapper[4902]: E1009 14:58:37.813715 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47bc6fe6-37e1-4c9b-9e23-4b72a47e72c2" containerName="extract-content" Oct 09 14:58:37 crc kubenswrapper[4902]: I1009 14:58:37.813723 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="47bc6fe6-37e1-4c9b-9e23-4b72a47e72c2" containerName="extract-content" Oct 09 14:58:37 crc kubenswrapper[4902]: E1009 14:58:37.813734 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09f92dfe-37ba-4c41-afbd-15a01e30b414" containerName="copy" Oct 09 14:58:37 crc kubenswrapper[4902]: I1009 14:58:37.813741 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="09f92dfe-37ba-4c41-afbd-15a01e30b414" containerName="copy" Oct 09 14:58:37 crc kubenswrapper[4902]: E1009 14:58:37.813780 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47bc6fe6-37e1-4c9b-9e23-4b72a47e72c2" containerName="registry-server" Oct 09 14:58:37 crc kubenswrapper[4902]: I1009 14:58:37.813790 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="47bc6fe6-37e1-4c9b-9e23-4b72a47e72c2" containerName="registry-server" Oct 09 14:58:37 crc kubenswrapper[4902]: I1009 14:58:37.814022 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="09f92dfe-37ba-4c41-afbd-15a01e30b414" containerName="gather" Oct 09 14:58:37 crc kubenswrapper[4902]: I1009 14:58:37.814048 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="47bc6fe6-37e1-4c9b-9e23-4b72a47e72c2" containerName="registry-server" Oct 09 14:58:37 crc kubenswrapper[4902]: I1009 14:58:37.814075 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="09f92dfe-37ba-4c41-afbd-15a01e30b414" containerName="copy" Oct 09 14:58:37 crc kubenswrapper[4902]: I1009 14:58:37.815776 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vbj5t" Oct 09 14:58:37 crc kubenswrapper[4902]: I1009 14:58:37.825666 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vbj5t"] Oct 09 14:58:37 crc kubenswrapper[4902]: I1009 14:58:37.893265 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63132d66-e30a-4378-a6b5-7435604b4d19-utilities\") pod \"community-operators-vbj5t\" (UID: \"63132d66-e30a-4378-a6b5-7435604b4d19\") " pod="openshift-marketplace/community-operators-vbj5t" Oct 09 14:58:37 crc kubenswrapper[4902]: I1009 14:58:37.893458 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qm8tr\" (UniqueName: \"kubernetes.io/projected/63132d66-e30a-4378-a6b5-7435604b4d19-kube-api-access-qm8tr\") pod \"community-operators-vbj5t\" (UID: \"63132d66-e30a-4378-a6b5-7435604b4d19\") " pod="openshift-marketplace/community-operators-vbj5t" Oct 09 14:58:37 crc kubenswrapper[4902]: I1009 14:58:37.893503 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63132d66-e30a-4378-a6b5-7435604b4d19-catalog-content\") pod \"community-operators-vbj5t\" (UID: \"63132d66-e30a-4378-a6b5-7435604b4d19\") " pod="openshift-marketplace/community-operators-vbj5t" Oct 09 14:58:37 crc kubenswrapper[4902]: I1009 14:58:37.996274 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qm8tr\" (UniqueName: \"kubernetes.io/projected/63132d66-e30a-4378-a6b5-7435604b4d19-kube-api-access-qm8tr\") pod \"community-operators-vbj5t\" (UID: \"63132d66-e30a-4378-a6b5-7435604b4d19\") " pod="openshift-marketplace/community-operators-vbj5t" Oct 09 14:58:37 crc kubenswrapper[4902]: I1009 14:58:37.996616 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63132d66-e30a-4378-a6b5-7435604b4d19-catalog-content\") pod \"community-operators-vbj5t\" (UID: \"63132d66-e30a-4378-a6b5-7435604b4d19\") " pod="openshift-marketplace/community-operators-vbj5t" Oct 09 14:58:37 crc kubenswrapper[4902]: I1009 14:58:37.996658 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63132d66-e30a-4378-a6b5-7435604b4d19-utilities\") pod \"community-operators-vbj5t\" (UID: \"63132d66-e30a-4378-a6b5-7435604b4d19\") " pod="openshift-marketplace/community-operators-vbj5t" Oct 09 14:58:37 crc kubenswrapper[4902]: I1009 14:58:37.997199 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63132d66-e30a-4378-a6b5-7435604b4d19-utilities\") pod \"community-operators-vbj5t\" (UID: \"63132d66-e30a-4378-a6b5-7435604b4d19\") " pod="openshift-marketplace/community-operators-vbj5t" Oct 09 14:58:37 crc kubenswrapper[4902]: I1009 14:58:37.997761 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63132d66-e30a-4378-a6b5-7435604b4d19-catalog-content\") pod \"community-operators-vbj5t\" (UID: \"63132d66-e30a-4378-a6b5-7435604b4d19\") " pod="openshift-marketplace/community-operators-vbj5t" Oct 09 14:58:38 crc kubenswrapper[4902]: I1009 14:58:38.020564 4902 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-qm8tr\" (UniqueName: \"kubernetes.io/projected/63132d66-e30a-4378-a6b5-7435604b4d19-kube-api-access-qm8tr\") pod \"community-operators-vbj5t\" (UID: \"63132d66-e30a-4378-a6b5-7435604b4d19\") " pod="openshift-marketplace/community-operators-vbj5t" Oct 09 14:58:38 crc kubenswrapper[4902]: I1009 14:58:38.183251 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vbj5t" Oct 09 14:58:38 crc kubenswrapper[4902]: I1009 14:58:38.196722 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-glgsr_must-gather-pr2cj_09f92dfe-37ba-4c41-afbd-15a01e30b414/copy/0.log" Oct 09 14:58:38 crc kubenswrapper[4902]: I1009 14:58:38.204810 4902 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-glgsr/must-gather-pr2cj" Oct 09 14:58:38 crc kubenswrapper[4902]: I1009 14:58:38.302042 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/09f92dfe-37ba-4c41-afbd-15a01e30b414-must-gather-output\") pod \"09f92dfe-37ba-4c41-afbd-15a01e30b414\" (UID: \"09f92dfe-37ba-4c41-afbd-15a01e30b414\") " Oct 09 14:58:38 crc kubenswrapper[4902]: I1009 14:58:38.302311 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bkc7m\" (UniqueName: \"kubernetes.io/projected/09f92dfe-37ba-4c41-afbd-15a01e30b414-kube-api-access-bkc7m\") pod \"09f92dfe-37ba-4c41-afbd-15a01e30b414\" (UID: \"09f92dfe-37ba-4c41-afbd-15a01e30b414\") " Oct 09 14:58:38 crc kubenswrapper[4902]: I1009 14:58:38.308988 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09f92dfe-37ba-4c41-afbd-15a01e30b414-kube-api-access-bkc7m" (OuterVolumeSpecName: "kube-api-access-bkc7m") pod "09f92dfe-37ba-4c41-afbd-15a01e30b414" (UID: "09f92dfe-37ba-4c41-afbd-15a01e30b414"). InnerVolumeSpecName "kube-api-access-bkc7m". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 14:58:38 crc kubenswrapper[4902]: I1009 14:58:38.407033 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bkc7m\" (UniqueName: \"kubernetes.io/projected/09f92dfe-37ba-4c41-afbd-15a01e30b414-kube-api-access-bkc7m\") on node \"crc\" DevicePath \"\"" Oct 09 14:58:38 crc kubenswrapper[4902]: I1009 14:58:38.410948 4902 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-glgsr_must-gather-pr2cj_09f92dfe-37ba-4c41-afbd-15a01e30b414/copy/0.log" Oct 09 14:58:38 crc kubenswrapper[4902]: I1009 14:58:38.411452 4902 generic.go:334] "Generic (PLEG): container finished" podID="09f92dfe-37ba-4c41-afbd-15a01e30b414" containerID="c43e0c00d2325a349736d5c03c107bb92f07018b051e9cce55f74827d53f9c07" exitCode=143 Oct 09 14:58:38 crc kubenswrapper[4902]: I1009 14:58:38.411512 4902 scope.go:117] "RemoveContainer" containerID="c43e0c00d2325a349736d5c03c107bb92f07018b051e9cce55f74827d53f9c07" Oct 09 14:58:38 crc kubenswrapper[4902]: I1009 14:58:38.411673 4902 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-glgsr/must-gather-pr2cj" Oct 09 14:58:38 crc kubenswrapper[4902]: I1009 14:58:38.438945 4902 scope.go:117] "RemoveContainer" containerID="64ebfd322d525ff06037c709a5ed29808ac707a505184b4f35dd454688207f18" Oct 09 14:58:38 crc kubenswrapper[4902]: I1009 14:58:38.505732 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/09f92dfe-37ba-4c41-afbd-15a01e30b414-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "09f92dfe-37ba-4c41-afbd-15a01e30b414" (UID: "09f92dfe-37ba-4c41-afbd-15a01e30b414"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 14:58:38 crc kubenswrapper[4902]: I1009 14:58:38.508602 4902 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/09f92dfe-37ba-4c41-afbd-15a01e30b414-must-gather-output\") on node \"crc\" DevicePath \"\"" Oct 09 14:58:39 crc kubenswrapper[4902]: I1009 14:58:39.011588 4902 scope.go:117] "RemoveContainer" containerID="c43e0c00d2325a349736d5c03c107bb92f07018b051e9cce55f74827d53f9c07" Oct 09 14:58:39 crc kubenswrapper[4902]: E1009 14:58:39.013524 4902 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c43e0c00d2325a349736d5c03c107bb92f07018b051e9cce55f74827d53f9c07\": container with ID starting with c43e0c00d2325a349736d5c03c107bb92f07018b051e9cce55f74827d53f9c07 not found: ID does not exist" containerID="c43e0c00d2325a349736d5c03c107bb92f07018b051e9cce55f74827d53f9c07" Oct 09 14:58:39 crc kubenswrapper[4902]: I1009 14:58:39.013614 4902 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c43e0c00d2325a349736d5c03c107bb92f07018b051e9cce55f74827d53f9c07"} err="failed to get container status \"c43e0c00d2325a349736d5c03c107bb92f07018b051e9cce55f74827d53f9c07\": rpc error: code = NotFound desc = could not find container \"c43e0c00d2325a349736d5c03c107bb92f07018b051e9cce55f74827d53f9c07\": container with ID starting with c43e0c00d2325a349736d5c03c107bb92f07018b051e9cce55f74827d53f9c07 not found: ID does not exist" Oct 09 14:58:39 crc kubenswrapper[4902]: I1009 14:58:39.013713 4902 scope.go:117] "RemoveContainer" containerID="64ebfd322d525ff06037c709a5ed29808ac707a505184b4f35dd454688207f18" Oct 09 14:58:39 crc kubenswrapper[4902]: E1009 14:58:39.014215 4902 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"64ebfd322d525ff06037c709a5ed29808ac707a505184b4f35dd454688207f18\": container with ID starting with 64ebfd322d525ff06037c709a5ed29808ac707a505184b4f35dd454688207f18 not found: ID does not exist" containerID="64ebfd322d525ff06037c709a5ed29808ac707a505184b4f35dd454688207f18" Oct 09 14:58:39 crc kubenswrapper[4902]: I1009 14:58:39.014281 4902 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"64ebfd322d525ff06037c709a5ed29808ac707a505184b4f35dd454688207f18"} err="failed to get container status \"64ebfd322d525ff06037c709a5ed29808ac707a505184b4f35dd454688207f18\": rpc error: code = NotFound desc = could not find container \"64ebfd322d525ff06037c709a5ed29808ac707a505184b4f35dd454688207f18\": container with ID starting with 64ebfd322d525ff06037c709a5ed29808ac707a505184b4f35dd454688207f18 not found: ID does not exist" Oct 09 14:58:39 crc kubenswrapper[4902]: I1009 14:58:39.193247 4902 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openshift-marketplace/community-operators-vbj5t"] Oct 09 14:58:39 crc kubenswrapper[4902]: I1009 14:58:39.425012 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vbj5t" event={"ID":"63132d66-e30a-4378-a6b5-7435604b4d19","Type":"ContainerStarted","Data":"7fff7afaae29fc3bb4d15a3ab41102f94681d6587fd9ac7361bdc60bc6806a53"} Oct 09 14:58:39 crc kubenswrapper[4902]: I1009 14:58:39.523633 4902 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09f92dfe-37ba-4c41-afbd-15a01e30b414" path="/var/lib/kubelet/pods/09f92dfe-37ba-4c41-afbd-15a01e30b414/volumes" Oct 09 14:58:40 crc kubenswrapper[4902]: I1009 14:58:40.435953 4902 generic.go:334] "Generic (PLEG): container finished" podID="63132d66-e30a-4378-a6b5-7435604b4d19" containerID="dfb53d5cb021d15e9e90515a3a5bdd94cf120e4320683a5c6cac4793842e351e" exitCode=0 Oct 09 14:58:40 crc kubenswrapper[4902]: I1009 14:58:40.436003 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vbj5t" event={"ID":"63132d66-e30a-4378-a6b5-7435604b4d19","Type":"ContainerDied","Data":"dfb53d5cb021d15e9e90515a3a5bdd94cf120e4320683a5c6cac4793842e351e"} Oct 09 14:58:40 crc kubenswrapper[4902]: I1009 14:58:40.438283 4902 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 09 14:58:41 crc kubenswrapper[4902]: I1009 14:58:41.454519 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vbj5t" event={"ID":"63132d66-e30a-4378-a6b5-7435604b4d19","Type":"ContainerStarted","Data":"4f5406b3f6c88ef38b45874dc87f174cc208fb4146fdd657f5fc76c60ea18b6d"} Oct 09 14:58:42 crc kubenswrapper[4902]: I1009 14:58:42.468541 4902 generic.go:334] "Generic (PLEG): container finished" podID="63132d66-e30a-4378-a6b5-7435604b4d19" containerID="4f5406b3f6c88ef38b45874dc87f174cc208fb4146fdd657f5fc76c60ea18b6d" exitCode=0 Oct 09 14:58:42 crc kubenswrapper[4902]: I1009 14:58:42.468602 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vbj5t" event={"ID":"63132d66-e30a-4378-a6b5-7435604b4d19","Type":"ContainerDied","Data":"4f5406b3f6c88ef38b45874dc87f174cc208fb4146fdd657f5fc76c60ea18b6d"} Oct 09 14:58:43 crc kubenswrapper[4902]: I1009 14:58:43.480208 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vbj5t" event={"ID":"63132d66-e30a-4378-a6b5-7435604b4d19","Type":"ContainerStarted","Data":"cca73ade2ea4bd3a993e4d3f8b9a3f831a2e587671c07f58649d9e9862f8fda8"} Oct 09 14:58:43 crc kubenswrapper[4902]: I1009 14:58:43.500330 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-vbj5t" podStartSLOduration=3.914039816 podStartE2EDuration="6.500307182s" podCreationTimestamp="2025-10-09 14:58:37 +0000 UTC" firstStartedPulling="2025-10-09 14:58:40.437921754 +0000 UTC m=+4067.635780818" lastFinishedPulling="2025-10-09 14:58:43.02418912 +0000 UTC m=+4070.222048184" observedRunningTime="2025-10-09 14:58:43.494797683 +0000 UTC m=+4070.692656757" watchObservedRunningTime="2025-10-09 14:58:43.500307182 +0000 UTC m=+4070.698166256" Oct 09 14:58:48 crc kubenswrapper[4902]: I1009 14:58:48.183934 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-vbj5t" Oct 09 14:58:48 crc kubenswrapper[4902]: I1009 14:58:48.184644 4902 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-vbj5t" Oct 09 14:58:48 crc kubenswrapper[4902]: I1009 14:58:48.230232 4902 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-vbj5t" Oct 09 14:58:48 crc kubenswrapper[4902]: I1009 14:58:48.568453 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-vbj5t" Oct 09 14:58:48 crc kubenswrapper[4902]: I1009 14:58:48.616262 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vbj5t"] Oct 09 14:58:50 crc kubenswrapper[4902]: I1009 14:58:50.077832 4902 patch_prober.go:28] interesting pod/machine-config-daemon-gbt7s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 14:58:50 crc kubenswrapper[4902]: I1009 14:58:50.077890 4902 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" podUID="6cfbac91-e798-4e5e-9f3c-f454ea6f457e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 14:58:50 crc kubenswrapper[4902]: I1009 14:58:50.541940 4902 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-vbj5t" podUID="63132d66-e30a-4378-a6b5-7435604b4d19" containerName="registry-server" containerID="cri-o://cca73ade2ea4bd3a993e4d3f8b9a3f831a2e587671c07f58649d9e9862f8fda8" gracePeriod=2 Oct 09 14:58:50 crc kubenswrapper[4902]: I1009 14:58:50.977680 4902 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vbj5t" Oct 09 14:58:51 crc kubenswrapper[4902]: I1009 14:58:51.074976 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63132d66-e30a-4378-a6b5-7435604b4d19-utilities\") pod \"63132d66-e30a-4378-a6b5-7435604b4d19\" (UID: \"63132d66-e30a-4378-a6b5-7435604b4d19\") " Oct 09 14:58:51 crc kubenswrapper[4902]: I1009 14:58:51.075027 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qm8tr\" (UniqueName: \"kubernetes.io/projected/63132d66-e30a-4378-a6b5-7435604b4d19-kube-api-access-qm8tr\") pod \"63132d66-e30a-4378-a6b5-7435604b4d19\" (UID: \"63132d66-e30a-4378-a6b5-7435604b4d19\") " Oct 09 14:58:51 crc kubenswrapper[4902]: I1009 14:58:51.075064 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63132d66-e30a-4378-a6b5-7435604b4d19-catalog-content\") pod \"63132d66-e30a-4378-a6b5-7435604b4d19\" (UID: \"63132d66-e30a-4378-a6b5-7435604b4d19\") " Oct 09 14:58:51 crc kubenswrapper[4902]: I1009 14:58:51.075977 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/63132d66-e30a-4378-a6b5-7435604b4d19-utilities" (OuterVolumeSpecName: "utilities") pod "63132d66-e30a-4378-a6b5-7435604b4d19" (UID: "63132d66-e30a-4378-a6b5-7435604b4d19"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 14:58:51 crc kubenswrapper[4902]: I1009 14:58:51.080479 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63132d66-e30a-4378-a6b5-7435604b4d19-kube-api-access-qm8tr" (OuterVolumeSpecName: "kube-api-access-qm8tr") pod "63132d66-e30a-4378-a6b5-7435604b4d19" (UID: "63132d66-e30a-4378-a6b5-7435604b4d19"). InnerVolumeSpecName "kube-api-access-qm8tr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 14:58:51 crc kubenswrapper[4902]: I1009 14:58:51.129701 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/63132d66-e30a-4378-a6b5-7435604b4d19-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "63132d66-e30a-4378-a6b5-7435604b4d19" (UID: "63132d66-e30a-4378-a6b5-7435604b4d19"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 14:58:51 crc kubenswrapper[4902]: I1009 14:58:51.177387 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qm8tr\" (UniqueName: \"kubernetes.io/projected/63132d66-e30a-4378-a6b5-7435604b4d19-kube-api-access-qm8tr\") on node \"crc\" DevicePath \"\"" Oct 09 14:58:51 crc kubenswrapper[4902]: I1009 14:58:51.177521 4902 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63132d66-e30a-4378-a6b5-7435604b4d19-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 09 14:58:51 crc kubenswrapper[4902]: I1009 14:58:51.177537 4902 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63132d66-e30a-4378-a6b5-7435604b4d19-utilities\") on node \"crc\" DevicePath \"\"" Oct 09 14:58:51 crc kubenswrapper[4902]: I1009 14:58:51.552456 4902 generic.go:334] "Generic (PLEG): container finished" podID="63132d66-e30a-4378-a6b5-7435604b4d19" containerID="cca73ade2ea4bd3a993e4d3f8b9a3f831a2e587671c07f58649d9e9862f8fda8" exitCode=0 Oct 09 14:58:51 crc kubenswrapper[4902]: I1009 14:58:51.552504 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vbj5t" event={"ID":"63132d66-e30a-4378-a6b5-7435604b4d19","Type":"ContainerDied","Data":"cca73ade2ea4bd3a993e4d3f8b9a3f831a2e587671c07f58649d9e9862f8fda8"} Oct 09 14:58:51 crc kubenswrapper[4902]: I1009 14:58:51.552535 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vbj5t" event={"ID":"63132d66-e30a-4378-a6b5-7435604b4d19","Type":"ContainerDied","Data":"7fff7afaae29fc3bb4d15a3ab41102f94681d6587fd9ac7361bdc60bc6806a53"} Oct 09 14:58:51 crc kubenswrapper[4902]: I1009 14:58:51.552555 4902 scope.go:117] "RemoveContainer" containerID="cca73ade2ea4bd3a993e4d3f8b9a3f831a2e587671c07f58649d9e9862f8fda8" Oct 09 14:58:51 crc kubenswrapper[4902]: I1009 14:58:51.552710 4902 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vbj5t" Oct 09 14:58:51 crc kubenswrapper[4902]: I1009 14:58:51.581843 4902 scope.go:117] "RemoveContainer" containerID="4f5406b3f6c88ef38b45874dc87f174cc208fb4146fdd657f5fc76c60ea18b6d" Oct 09 14:58:51 crc kubenswrapper[4902]: I1009 14:58:51.584027 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vbj5t"] Oct 09 14:58:51 crc kubenswrapper[4902]: I1009 14:58:51.593090 4902 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-vbj5t"] Oct 09 14:58:51 crc kubenswrapper[4902]: I1009 14:58:51.620776 4902 scope.go:117] "RemoveContainer" containerID="dfb53d5cb021d15e9e90515a3a5bdd94cf120e4320683a5c6cac4793842e351e" Oct 09 14:58:51 crc kubenswrapper[4902]: I1009 14:58:51.667559 4902 scope.go:117] "RemoveContainer" containerID="cca73ade2ea4bd3a993e4d3f8b9a3f831a2e587671c07f58649d9e9862f8fda8" Oct 09 14:58:51 crc kubenswrapper[4902]: E1009 14:58:51.667990 4902 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cca73ade2ea4bd3a993e4d3f8b9a3f831a2e587671c07f58649d9e9862f8fda8\": container with ID starting with cca73ade2ea4bd3a993e4d3f8b9a3f831a2e587671c07f58649d9e9862f8fda8 not found: ID does not exist" containerID="cca73ade2ea4bd3a993e4d3f8b9a3f831a2e587671c07f58649d9e9862f8fda8" Oct 09 14:58:51 crc kubenswrapper[4902]: I1009 14:58:51.668145 4902 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cca73ade2ea4bd3a993e4d3f8b9a3f831a2e587671c07f58649d9e9862f8fda8"} err="failed to get container status \"cca73ade2ea4bd3a993e4d3f8b9a3f831a2e587671c07f58649d9e9862f8fda8\": rpc error: code = NotFound desc = could not find container \"cca73ade2ea4bd3a993e4d3f8b9a3f831a2e587671c07f58649d9e9862f8fda8\": container with ID starting with cca73ade2ea4bd3a993e4d3f8b9a3f831a2e587671c07f58649d9e9862f8fda8 not found: ID does not exist" Oct 09 14:58:51 crc kubenswrapper[4902]: I1009 14:58:51.668274 4902 scope.go:117] "RemoveContainer" containerID="4f5406b3f6c88ef38b45874dc87f174cc208fb4146fdd657f5fc76c60ea18b6d" Oct 09 14:58:51 crc kubenswrapper[4902]: E1009 14:58:51.668793 4902 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4f5406b3f6c88ef38b45874dc87f174cc208fb4146fdd657f5fc76c60ea18b6d\": container with ID starting with 4f5406b3f6c88ef38b45874dc87f174cc208fb4146fdd657f5fc76c60ea18b6d not found: ID does not exist" containerID="4f5406b3f6c88ef38b45874dc87f174cc208fb4146fdd657f5fc76c60ea18b6d" Oct 09 14:58:51 crc kubenswrapper[4902]: I1009 14:58:51.668839 4902 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f5406b3f6c88ef38b45874dc87f174cc208fb4146fdd657f5fc76c60ea18b6d"} err="failed to get container status \"4f5406b3f6c88ef38b45874dc87f174cc208fb4146fdd657f5fc76c60ea18b6d\": rpc error: code = NotFound desc = could not find container \"4f5406b3f6c88ef38b45874dc87f174cc208fb4146fdd657f5fc76c60ea18b6d\": container with ID starting with 4f5406b3f6c88ef38b45874dc87f174cc208fb4146fdd657f5fc76c60ea18b6d not found: ID does not exist" Oct 09 14:58:51 crc kubenswrapper[4902]: I1009 14:58:51.668867 4902 scope.go:117] "RemoveContainer" containerID="dfb53d5cb021d15e9e90515a3a5bdd94cf120e4320683a5c6cac4793842e351e" Oct 09 14:58:51 crc kubenswrapper[4902]: E1009 14:58:51.670378 4902 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"dfb53d5cb021d15e9e90515a3a5bdd94cf120e4320683a5c6cac4793842e351e\": container with ID starting with dfb53d5cb021d15e9e90515a3a5bdd94cf120e4320683a5c6cac4793842e351e not found: ID does not exist" containerID="dfb53d5cb021d15e9e90515a3a5bdd94cf120e4320683a5c6cac4793842e351e" Oct 09 14:58:51 crc kubenswrapper[4902]: I1009 14:58:51.670400 4902 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dfb53d5cb021d15e9e90515a3a5bdd94cf120e4320683a5c6cac4793842e351e"} err="failed to get container status \"dfb53d5cb021d15e9e90515a3a5bdd94cf120e4320683a5c6cac4793842e351e\": rpc error: code = NotFound desc = could not find container \"dfb53d5cb021d15e9e90515a3a5bdd94cf120e4320683a5c6cac4793842e351e\": container with ID starting with dfb53d5cb021d15e9e90515a3a5bdd94cf120e4320683a5c6cac4793842e351e not found: ID does not exist" Oct 09 14:58:51 crc kubenswrapper[4902]: E1009 14:58:51.690670 4902 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod63132d66_e30a_4378_a6b5_7435604b4d19.slice/crio-7fff7afaae29fc3bb4d15a3ab41102f94681d6587fd9ac7361bdc60bc6806a53\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod63132d66_e30a_4378_a6b5_7435604b4d19.slice\": RecentStats: unable to find data in memory cache]" Oct 09 14:58:53 crc kubenswrapper[4902]: I1009 14:58:53.529077 4902 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="63132d66-e30a-4378-a6b5-7435604b4d19" path="/var/lib/kubelet/pods/63132d66-e30a-4378-a6b5-7435604b4d19/volumes" Oct 09 14:59:20 crc kubenswrapper[4902]: I1009 14:59:20.078391 4902 patch_prober.go:28] interesting pod/machine-config-daemon-gbt7s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 14:59:20 crc kubenswrapper[4902]: I1009 14:59:20.078909 4902 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" podUID="6cfbac91-e798-4e5e-9f3c-f454ea6f457e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 14:59:20 crc kubenswrapper[4902]: I1009 14:59:20.078952 4902 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" Oct 09 14:59:20 crc kubenswrapper[4902]: I1009 14:59:20.079967 4902 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"dbab6b9066d83619bdeb19ac785cb63ac0c1e5fe99b941a6fef8c2cd526a174d"} pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 09 14:59:20 crc kubenswrapper[4902]: I1009 14:59:20.080041 4902 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" podUID="6cfbac91-e798-4e5e-9f3c-f454ea6f457e" containerName="machine-config-daemon" containerID="cri-o://dbab6b9066d83619bdeb19ac785cb63ac0c1e5fe99b941a6fef8c2cd526a174d" gracePeriod=600 Oct 09 
14:59:20 crc kubenswrapper[4902]: I1009 14:59:20.864108 4902 generic.go:334] "Generic (PLEG): container finished" podID="6cfbac91-e798-4e5e-9f3c-f454ea6f457e" containerID="dbab6b9066d83619bdeb19ac785cb63ac0c1e5fe99b941a6fef8c2cd526a174d" exitCode=0 Oct 09 14:59:20 crc kubenswrapper[4902]: I1009 14:59:20.864199 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" event={"ID":"6cfbac91-e798-4e5e-9f3c-f454ea6f457e","Type":"ContainerDied","Data":"dbab6b9066d83619bdeb19ac785cb63ac0c1e5fe99b941a6fef8c2cd526a174d"} Oct 09 14:59:20 crc kubenswrapper[4902]: I1009 14:59:20.864486 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-gbt7s" event={"ID":"6cfbac91-e798-4e5e-9f3c-f454ea6f457e","Type":"ContainerStarted","Data":"6f07783542730c4d14e82382a9a6c30af6c7a2b17dd8c9c5029f3d18bd815faf"} Oct 09 14:59:20 crc kubenswrapper[4902]: I1009 14:59:20.864524 4902 scope.go:117] "RemoveContainer" containerID="3c92344b7d9241fd1bd747690a712747aa6c548ba1f4b8118607a636810a1726" Oct 09 15:00:00 crc kubenswrapper[4902]: I1009 15:00:00.150941 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29333700-8wl8s"] Oct 09 15:00:00 crc kubenswrapper[4902]: E1009 15:00:00.152083 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63132d66-e30a-4378-a6b5-7435604b4d19" containerName="extract-utilities" Oct 09 15:00:00 crc kubenswrapper[4902]: I1009 15:00:00.152099 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="63132d66-e30a-4378-a6b5-7435604b4d19" containerName="extract-utilities" Oct 09 15:00:00 crc kubenswrapper[4902]: E1009 15:00:00.152124 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63132d66-e30a-4378-a6b5-7435604b4d19" containerName="extract-content" Oct 09 15:00:00 crc kubenswrapper[4902]: I1009 15:00:00.152132 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="63132d66-e30a-4378-a6b5-7435604b4d19" containerName="extract-content" Oct 09 15:00:00 crc kubenswrapper[4902]: E1009 15:00:00.152172 4902 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63132d66-e30a-4378-a6b5-7435604b4d19" containerName="registry-server" Oct 09 15:00:00 crc kubenswrapper[4902]: I1009 15:00:00.152181 4902 state_mem.go:107] "Deleted CPUSet assignment" podUID="63132d66-e30a-4378-a6b5-7435604b4d19" containerName="registry-server" Oct 09 15:00:00 crc kubenswrapper[4902]: I1009 15:00:00.152436 4902 memory_manager.go:354] "RemoveStaleState removing state" podUID="63132d66-e30a-4378-a6b5-7435604b4d19" containerName="registry-server" Oct 09 15:00:00 crc kubenswrapper[4902]: I1009 15:00:00.153339 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29333700-8wl8s" Oct 09 15:00:00 crc kubenswrapper[4902]: I1009 15:00:00.156106 4902 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 09 15:00:00 crc kubenswrapper[4902]: I1009 15:00:00.157605 4902 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 09 15:00:00 crc kubenswrapper[4902]: I1009 15:00:00.169220 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29333700-8wl8s"] Oct 09 15:00:00 crc kubenswrapper[4902]: I1009 15:00:00.226296 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f66dl\" (UniqueName: \"kubernetes.io/projected/b77b1d6e-6c80-4527-8025-9e3bdda32703-kube-api-access-f66dl\") pod \"collect-profiles-29333700-8wl8s\" (UID: \"b77b1d6e-6c80-4527-8025-9e3bdda32703\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333700-8wl8s" Oct 09 15:00:00 crc kubenswrapper[4902]: I1009 15:00:00.226367 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b77b1d6e-6c80-4527-8025-9e3bdda32703-secret-volume\") pod \"collect-profiles-29333700-8wl8s\" (UID: \"b77b1d6e-6c80-4527-8025-9e3bdda32703\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333700-8wl8s" Oct 09 15:00:00 crc kubenswrapper[4902]: I1009 15:00:00.226596 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b77b1d6e-6c80-4527-8025-9e3bdda32703-config-volume\") pod \"collect-profiles-29333700-8wl8s\" (UID: \"b77b1d6e-6c80-4527-8025-9e3bdda32703\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333700-8wl8s" Oct 09 15:00:00 crc kubenswrapper[4902]: I1009 15:00:00.328935 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b77b1d6e-6c80-4527-8025-9e3bdda32703-config-volume\") pod \"collect-profiles-29333700-8wl8s\" (UID: \"b77b1d6e-6c80-4527-8025-9e3bdda32703\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333700-8wl8s" Oct 09 15:00:00 crc kubenswrapper[4902]: I1009 15:00:00.329048 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f66dl\" (UniqueName: \"kubernetes.io/projected/b77b1d6e-6c80-4527-8025-9e3bdda32703-kube-api-access-f66dl\") pod \"collect-profiles-29333700-8wl8s\" (UID: \"b77b1d6e-6c80-4527-8025-9e3bdda32703\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333700-8wl8s" Oct 09 15:00:00 crc kubenswrapper[4902]: I1009 15:00:00.329072 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b77b1d6e-6c80-4527-8025-9e3bdda32703-secret-volume\") pod \"collect-profiles-29333700-8wl8s\" (UID: \"b77b1d6e-6c80-4527-8025-9e3bdda32703\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333700-8wl8s" Oct 09 15:00:00 crc kubenswrapper[4902]: I1009 15:00:00.330102 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b77b1d6e-6c80-4527-8025-9e3bdda32703-config-volume\") pod 
\"collect-profiles-29333700-8wl8s\" (UID: \"b77b1d6e-6c80-4527-8025-9e3bdda32703\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333700-8wl8s" Oct 09 15:00:00 crc kubenswrapper[4902]: I1009 15:00:00.715599 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b77b1d6e-6c80-4527-8025-9e3bdda32703-secret-volume\") pod \"collect-profiles-29333700-8wl8s\" (UID: \"b77b1d6e-6c80-4527-8025-9e3bdda32703\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333700-8wl8s" Oct 09 15:00:00 crc kubenswrapper[4902]: I1009 15:00:00.716068 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f66dl\" (UniqueName: \"kubernetes.io/projected/b77b1d6e-6c80-4527-8025-9e3bdda32703-kube-api-access-f66dl\") pod \"collect-profiles-29333700-8wl8s\" (UID: \"b77b1d6e-6c80-4527-8025-9e3bdda32703\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333700-8wl8s" Oct 09 15:00:00 crc kubenswrapper[4902]: I1009 15:00:00.783363 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29333700-8wl8s" Oct 09 15:00:01 crc kubenswrapper[4902]: I1009 15:00:01.257815 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29333700-8wl8s"] Oct 09 15:00:01 crc kubenswrapper[4902]: I1009 15:00:01.600578 4902 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-p8nbw"] Oct 09 15:00:01 crc kubenswrapper[4902]: I1009 15:00:01.604797 4902 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-p8nbw" Oct 09 15:00:01 crc kubenswrapper[4902]: I1009 15:00:01.646387 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-p8nbw"] Oct 09 15:00:01 crc kubenswrapper[4902]: I1009 15:00:01.655642 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d5feb821-f608-49da-b439-eeb2787322cd-utilities\") pod \"redhat-marketplace-p8nbw\" (UID: \"d5feb821-f608-49da-b439-eeb2787322cd\") " pod="openshift-marketplace/redhat-marketplace-p8nbw" Oct 09 15:00:01 crc kubenswrapper[4902]: I1009 15:00:01.655997 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d5feb821-f608-49da-b439-eeb2787322cd-catalog-content\") pod \"redhat-marketplace-p8nbw\" (UID: \"d5feb821-f608-49da-b439-eeb2787322cd\") " pod="openshift-marketplace/redhat-marketplace-p8nbw" Oct 09 15:00:01 crc kubenswrapper[4902]: I1009 15:00:01.656141 4902 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-429r7\" (UniqueName: \"kubernetes.io/projected/d5feb821-f608-49da-b439-eeb2787322cd-kube-api-access-429r7\") pod \"redhat-marketplace-p8nbw\" (UID: \"d5feb821-f608-49da-b439-eeb2787322cd\") " pod="openshift-marketplace/redhat-marketplace-p8nbw" Oct 09 15:00:01 crc kubenswrapper[4902]: I1009 15:00:01.757272 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-429r7\" (UniqueName: \"kubernetes.io/projected/d5feb821-f608-49da-b439-eeb2787322cd-kube-api-access-429r7\") pod \"redhat-marketplace-p8nbw\" (UID: \"d5feb821-f608-49da-b439-eeb2787322cd\") " 
pod="openshift-marketplace/redhat-marketplace-p8nbw" Oct 09 15:00:01 crc kubenswrapper[4902]: I1009 15:00:01.757394 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d5feb821-f608-49da-b439-eeb2787322cd-utilities\") pod \"redhat-marketplace-p8nbw\" (UID: \"d5feb821-f608-49da-b439-eeb2787322cd\") " pod="openshift-marketplace/redhat-marketplace-p8nbw" Oct 09 15:00:01 crc kubenswrapper[4902]: I1009 15:00:01.757525 4902 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d5feb821-f608-49da-b439-eeb2787322cd-catalog-content\") pod \"redhat-marketplace-p8nbw\" (UID: \"d5feb821-f608-49da-b439-eeb2787322cd\") " pod="openshift-marketplace/redhat-marketplace-p8nbw" Oct 09 15:00:01 crc kubenswrapper[4902]: I1009 15:00:01.758072 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d5feb821-f608-49da-b439-eeb2787322cd-catalog-content\") pod \"redhat-marketplace-p8nbw\" (UID: \"d5feb821-f608-49da-b439-eeb2787322cd\") " pod="openshift-marketplace/redhat-marketplace-p8nbw" Oct 09 15:00:01 crc kubenswrapper[4902]: I1009 15:00:01.758491 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d5feb821-f608-49da-b439-eeb2787322cd-utilities\") pod \"redhat-marketplace-p8nbw\" (UID: \"d5feb821-f608-49da-b439-eeb2787322cd\") " pod="openshift-marketplace/redhat-marketplace-p8nbw" Oct 09 15:00:01 crc kubenswrapper[4902]: I1009 15:00:01.780257 4902 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-429r7\" (UniqueName: \"kubernetes.io/projected/d5feb821-f608-49da-b439-eeb2787322cd-kube-api-access-429r7\") pod \"redhat-marketplace-p8nbw\" (UID: \"d5feb821-f608-49da-b439-eeb2787322cd\") " pod="openshift-marketplace/redhat-marketplace-p8nbw" Oct 09 15:00:01 crc kubenswrapper[4902]: I1009 15:00:01.946280 4902 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-p8nbw" Oct 09 15:00:02 crc kubenswrapper[4902]: I1009 15:00:02.226197 4902 generic.go:334] "Generic (PLEG): container finished" podID="b77b1d6e-6c80-4527-8025-9e3bdda32703" containerID="0a2cff9b9a60c0f7c4efa8508af0850d8d840e3e454906714961de32f07acce4" exitCode=0 Oct 09 15:00:02 crc kubenswrapper[4902]: I1009 15:00:02.226326 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29333700-8wl8s" event={"ID":"b77b1d6e-6c80-4527-8025-9e3bdda32703","Type":"ContainerDied","Data":"0a2cff9b9a60c0f7c4efa8508af0850d8d840e3e454906714961de32f07acce4"} Oct 09 15:00:02 crc kubenswrapper[4902]: I1009 15:00:02.226741 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29333700-8wl8s" event={"ID":"b77b1d6e-6c80-4527-8025-9e3bdda32703","Type":"ContainerStarted","Data":"d5bad4d1ca7f5e46b524317d84ce2c4179c7b81eb53ee56b1462817e9e37ab0e"} Oct 09 15:00:02 crc kubenswrapper[4902]: I1009 15:00:02.413700 4902 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-p8nbw"] Oct 09 15:00:02 crc kubenswrapper[4902]: W1009 15:00:02.918933 4902 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd5feb821_f608_49da_b439_eeb2787322cd.slice/crio-109a25357cddc1e66d81faa4a4bb5db2ee93a8dcef872ccd2e97e13157158198 WatchSource:0}: Error finding container 109a25357cddc1e66d81faa4a4bb5db2ee93a8dcef872ccd2e97e13157158198: Status 404 returned error can't find the container with id 109a25357cddc1e66d81faa4a4bb5db2ee93a8dcef872ccd2e97e13157158198 Oct 09 15:00:03 crc kubenswrapper[4902]: I1009 15:00:03.247295 4902 generic.go:334] "Generic (PLEG): container finished" podID="d5feb821-f608-49da-b439-eeb2787322cd" containerID="1eb5830a82a360157224f1571f71724e4d4966aa79bd18b710db2d348fc940dc" exitCode=0 Oct 09 15:00:03 crc kubenswrapper[4902]: I1009 15:00:03.247340 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p8nbw" event={"ID":"d5feb821-f608-49da-b439-eeb2787322cd","Type":"ContainerDied","Data":"1eb5830a82a360157224f1571f71724e4d4966aa79bd18b710db2d348fc940dc"} Oct 09 15:00:03 crc kubenswrapper[4902]: I1009 15:00:03.247650 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p8nbw" event={"ID":"d5feb821-f608-49da-b439-eeb2787322cd","Type":"ContainerStarted","Data":"109a25357cddc1e66d81faa4a4bb5db2ee93a8dcef872ccd2e97e13157158198"} Oct 09 15:00:03 crc kubenswrapper[4902]: E1009 15:00:03.412356 4902 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd5feb821_f608_49da_b439_eeb2787322cd.slice/crio-1eb5830a82a360157224f1571f71724e4d4966aa79bd18b710db2d348fc940dc.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd5feb821_f608_49da_b439_eeb2787322cd.slice/crio-conmon-1eb5830a82a360157224f1571f71724e4d4966aa79bd18b710db2d348fc940dc.scope\": RecentStats: unable to find data in memory cache]" Oct 09 15:00:03 crc kubenswrapper[4902]: I1009 15:00:03.597502 4902 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29333700-8wl8s" Oct 09 15:00:03 crc kubenswrapper[4902]: I1009 15:00:03.800764 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b77b1d6e-6c80-4527-8025-9e3bdda32703-secret-volume\") pod \"b77b1d6e-6c80-4527-8025-9e3bdda32703\" (UID: \"b77b1d6e-6c80-4527-8025-9e3bdda32703\") " Oct 09 15:00:03 crc kubenswrapper[4902]: I1009 15:00:03.800846 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f66dl\" (UniqueName: \"kubernetes.io/projected/b77b1d6e-6c80-4527-8025-9e3bdda32703-kube-api-access-f66dl\") pod \"b77b1d6e-6c80-4527-8025-9e3bdda32703\" (UID: \"b77b1d6e-6c80-4527-8025-9e3bdda32703\") " Oct 09 15:00:03 crc kubenswrapper[4902]: I1009 15:00:03.800936 4902 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b77b1d6e-6c80-4527-8025-9e3bdda32703-config-volume\") pod \"b77b1d6e-6c80-4527-8025-9e3bdda32703\" (UID: \"b77b1d6e-6c80-4527-8025-9e3bdda32703\") " Oct 09 15:00:03 crc kubenswrapper[4902]: I1009 15:00:03.801955 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b77b1d6e-6c80-4527-8025-9e3bdda32703-config-volume" (OuterVolumeSpecName: "config-volume") pod "b77b1d6e-6c80-4527-8025-9e3bdda32703" (UID: "b77b1d6e-6c80-4527-8025-9e3bdda32703"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 15:00:03 crc kubenswrapper[4902]: I1009 15:00:03.806573 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b77b1d6e-6c80-4527-8025-9e3bdda32703-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "b77b1d6e-6c80-4527-8025-9e3bdda32703" (UID: "b77b1d6e-6c80-4527-8025-9e3bdda32703"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 15:00:03 crc kubenswrapper[4902]: I1009 15:00:03.809516 4902 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b77b1d6e-6c80-4527-8025-9e3bdda32703-kube-api-access-f66dl" (OuterVolumeSpecName: "kube-api-access-f66dl") pod "b77b1d6e-6c80-4527-8025-9e3bdda32703" (UID: "b77b1d6e-6c80-4527-8025-9e3bdda32703"). InnerVolumeSpecName "kube-api-access-f66dl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 15:00:03 crc kubenswrapper[4902]: I1009 15:00:03.904050 4902 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b77b1d6e-6c80-4527-8025-9e3bdda32703-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 09 15:00:03 crc kubenswrapper[4902]: I1009 15:00:03.904691 4902 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f66dl\" (UniqueName: \"kubernetes.io/projected/b77b1d6e-6c80-4527-8025-9e3bdda32703-kube-api-access-f66dl\") on node \"crc\" DevicePath \"\"" Oct 09 15:00:03 crc kubenswrapper[4902]: I1009 15:00:03.904770 4902 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b77b1d6e-6c80-4527-8025-9e3bdda32703-config-volume\") on node \"crc\" DevicePath \"\"" Oct 09 15:00:04 crc kubenswrapper[4902]: I1009 15:00:04.257245 4902 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29333700-8wl8s" Oct 09 15:00:04 crc kubenswrapper[4902]: I1009 15:00:04.257239 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29333700-8wl8s" event={"ID":"b77b1d6e-6c80-4527-8025-9e3bdda32703","Type":"ContainerDied","Data":"d5bad4d1ca7f5e46b524317d84ce2c4179c7b81eb53ee56b1462817e9e37ab0e"} Oct 09 15:00:04 crc kubenswrapper[4902]: I1009 15:00:04.257596 4902 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d5bad4d1ca7f5e46b524317d84ce2c4179c7b81eb53ee56b1462817e9e37ab0e" Oct 09 15:00:04 crc kubenswrapper[4902]: I1009 15:00:04.681914 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29333655-vq8n5"] Oct 09 15:00:04 crc kubenswrapper[4902]: I1009 15:00:04.689166 4902 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29333655-vq8n5"] Oct 09 15:00:05 crc kubenswrapper[4902]: I1009 15:00:05.523587 4902 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="449abebe-b0c0-4f13-b153-05a82a45398b" path="/var/lib/kubelet/pods/449abebe-b0c0-4f13-b153-05a82a45398b/volumes" Oct 09 15:00:05 crc kubenswrapper[4902]: I1009 15:00:05.769178 4902 scope.go:117] "RemoveContainer" containerID="76a653fbdf5769aa7555e9c2c5f57c5e4bc8507fc9c4e73ceae012a1d7003388" Oct 09 15:00:05 crc kubenswrapper[4902]: I1009 15:00:05.795217 4902 scope.go:117] "RemoveContainer" containerID="bf591366148f679e6374f08499c3a4597c8904fb029ecbc6007a72cb4f8ae723" Oct 09 15:00:06 crc kubenswrapper[4902]: I1009 15:00:06.277393 4902 generic.go:334] "Generic (PLEG): container finished" podID="d5feb821-f608-49da-b439-eeb2787322cd" containerID="eb8468f152397c330425d1f4b30870beeed9692406c7a1d2ceefd6c3d93bd68a" exitCode=0 Oct 09 15:00:06 crc kubenswrapper[4902]: I1009 15:00:06.277709 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p8nbw" event={"ID":"d5feb821-f608-49da-b439-eeb2787322cd","Type":"ContainerDied","Data":"eb8468f152397c330425d1f4b30870beeed9692406c7a1d2ceefd6c3d93bd68a"} Oct 09 15:00:08 crc kubenswrapper[4902]: I1009 15:00:08.297315 4902 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p8nbw" event={"ID":"d5feb821-f608-49da-b439-eeb2787322cd","Type":"ContainerStarted","Data":"9b5ee6dfe1e8e42523947ea0ae7b399a68e51e2fed98310b72562f13d7d2c576"} Oct 09 15:00:08 crc kubenswrapper[4902]: I1009 15:00:08.317105 4902 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-p8nbw" podStartSLOduration=4.035905538 podStartE2EDuration="7.317086161s" podCreationTimestamp="2025-10-09 15:00:01 +0000 UTC" firstStartedPulling="2025-10-09 15:00:04.259442643 +0000 UTC m=+4151.457301717" lastFinishedPulling="2025-10-09 15:00:07.540623276 +0000 UTC m=+4154.738482340" observedRunningTime="2025-10-09 15:00:08.315032642 +0000 UTC m=+4155.512891706" watchObservedRunningTime="2025-10-09 15:00:08.317086161 +0000 UTC m=+4155.514945225" Oct 09 15:00:11 crc kubenswrapper[4902]: I1009 15:00:11.947685 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-p8nbw" Oct 09 15:00:11 crc kubenswrapper[4902]: I1009 15:00:11.948299 4902 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/redhat-marketplace-p8nbw" Oct 09 15:00:12 crc kubenswrapper[4902]: I1009 15:00:12.001481 4902 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-p8nbw" Oct 09 15:00:12 crc kubenswrapper[4902]: I1009 15:00:12.373402 4902 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-p8nbw" Oct 09 15:00:12 crc kubenswrapper[4902]: I1009 15:00:12.424774 4902 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-p8nbw"] Oct 09 15:00:14 crc kubenswrapper[4902]: I1009 15:00:14.355076 4902 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-p8nbw" podUID="d5feb821-f608-49da-b439-eeb2787322cd" containerName="registry-server" containerID="cri-o://9b5ee6dfe1e8e42523947ea0ae7b399a68e51e2fed98310b72562f13d7d2c576" gracePeriod=2 Oct 09 15:00:14 crc kubenswrapper[4902]: I1009 15:00:14.865480 4902 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-p8nbw"